You did it! You moved your company away from the old way of managing vendor tags and launched a shiny new Tag Management System (TMS). By all accounts, you are the IT department hero of the day -- rapidly creating data layer
elements, crafting load rules, and manipulating data before it reaches your analytics tools. You are the master of your web application vendor tag domain.
Don’t get too excited, though. There may be trouble in paradise.
Let’s say one of the data layer elements you defined is not receiving values as expected. Maybe the data source was defined incorrectly, maybe the data is being hijacked and overwritten by some other bit of code, or maybe the code monkeys on the third floor changed the user interface and forgot to tell you (again). How can we prevent these kinds of failures in the deployment of our TMS code? And how can we be certain that our masterfully created solution will work as expected?
Five Flavors But Still No Mint Chocolate Chip
An application can break in many different ways. A button may not open a new window as expected. A new feature may use a library that is incompatible with existing features. A poorly written data sort could slow the application to a crawl. A security hole could be accidentally exposed, allowing anyone access. The list goes on.
When facing these challenges, testing is typically broken up into five categories:
1. Functional tests verify that a particular feature works as expected
2. Compatibility tests verify that the application is compatible with various operating systems, libraries, and devices
3. Performance tests verify that the application is fast and executes in the expected amount of time
4. Security tests verify that the application is secure from intrusion
5. Usability tests verify that the application's user interface is easy to use and understand
Testing for all of these types of errors can be overwhelming. With a large and complex application, your suite of tests can rapidly balloon into a lumbering beast -- the kind of fire-breathing, damsel-stealing beast which makes the peasants (coders) run for the hills.
So, what about manual software testing, you ask? Don’t even think about it. No one wants to slog through a giant spreadsheet testing that this specific button acts this specific way in this specific situation. There is also no guarantee that the fifteen steps needed to test our theoretical button will be executed exactly the same way every time the test is run. Worst of all, if any tiny piece of code is changed in the future, all of our tests must be manually performed again. For these reasons, tests are usually the first item in software development to be dropped.
Automated Deployment Testing to the Rescue!
Imagine a world where you deploy your TMS code to a production environment and you are automagically given the results of your test suite. You sleep well at night knowing that your TMS is pushing bits and bytes around exactly as expected. Your portfolio of analytics tools is perfectly analyzing your web presence. You are given the Nobel Prize for Analytics. Well, maybe not that last one, but that world isn't as far away as you think.
So, let's create a simple example test to demonstrate how automated deployment tests can help you by using a few pieces of technology:
Ubuntu Linux: A Linux distribution for personal computers, smartphones, and network servers
JMeter: An application designed to load test functional behavior and measure performance
Event Notify Test Runner: A utility for running arbitrary commands when files change
Mutt: A command-line email client
Now, let's create the workflow for our automated TMS deployment testing:
1. Monitor for TMS changes
2. Launch the test suite
3. Record the test suite results
4. Send the test suite results
5. View the test suite results
Creating a suite of tests in JMeter (or whatever testing tool you choose) is outside the scope of this example. For this example, let's assume a single test which checks if a website responds with an HTTP status of “200 OK.” Ebiquity is, of course, here to help you create a comprehensive test suite which will give you all the info you need to make actionable marketing decisions.
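Outside of JMeter, the same "200 OK" idea can be sketched in plain shell. The helper name `check_ok` and the URL below are illustrative, not part of JMeter or any standard tool:

```shell
# check_ok: a hypothetical helper that reports PASS/FAIL for an HTTP status code.
check_ok() {
  if [ "$1" = "200" ]; then
    echo "PASS: HTTP $1"
  else
    echo "FAIL: HTTP $1"
  fi
}

# In a real check, the status would come from curl, e.g.:
#   status=$(curl -s -o /dev/null -w "%{http_code}" "https://www.example.com/")
check_ok 200   # a healthy response
check_ok 503   # a server error
```

Run as-is, this prints one PASS line and one FAIL line, which is all our example test cares about.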
Here’s how you should follow through with each step, including code examples:
1. Monitor for TMS changes - We want to know when a new version of TMS code is deployed to our production environment. The quick and dirty way to achieve this is by monitoring the file system on our Ubuntu web server. When the TMS universal tag file is changed, we launch our test suite.
$ ls sometag.js | entr some-test-runner
2. Launch the test suite - JMeter can be run in command-line mode by passing in a test suite to execute.
$ ls sometag.js | entr jmeter -n -t tests.jmx
3. Record the test suite results - We tell JMeter where we want our test results to be saved.
$ ls sometag.js | entr jmeter -n -t tests.jmx -l results.jtl
4. Send the test suite results - Create another monitor for the test suite results file. When the file changes, we send it as an email attachment. Piping in a message body keeps Mutt from waiting for interactive input.
$ ls results.jtl | entr sh -c 'echo "See attached results." | mutt -s "Test Suite Results" -a results.jtl -- email@example.com'
5. View the test suite results - JMeter comes with a handy-dandy graphical user interface (GUI) which lets us view the test results as graphs or trees or grids or whatever we desire. Firing up the GUI and analyzing the test results is again outside of this tutorial's scope. And again, our team can help you develop a test results analysis based upon your specific business and project needs.
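Put together, the steps above collapse into a single watcher script. This is a sketch only: the file names, test plan, and email address are the same placeholders used above, and entr, jmeter, and mutt must already be installed for it to do anything useful.

```shell
# Write the watcher as a standalone script (all names are illustrative).
cat > watch-tms.sh <<'EOF'
#!/bin/sh
# Re-run the suite, record results, and mail them when the tag file changes.
ls sometag.js | entr sh -c '
  jmeter -n -t tests.jmx -l results.jtl &&
  echo "See attached results." | mutt -s "Test Suite Results" -a results.jtl -- email@example.com
'
EOF

# Sanity-check the script without executing it.
sh -n watch-tms.sh && echo "watch-tms.sh: syntax OK"
```

Running `sh watch-tms.sh` on the server would then keep the whole monitor-test-record-send loop alive in one terminal.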
Make it Happen
Once you have a basic automated deployment testing routine configured and running, the sky is the limit. You could move your test suite to the cloud and run your tests from different locations around the world. You could set up a feedback mechanism which notifies developers when a test fails. You could even create another feedback mechanism which notifies your operations team when a performance test fails. The possibilities of automated TMS deployment testing are truly endless, so get on the road to that Nobel Prize for Analytics and take advantage of it!
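As a starting point for that developer notification, you could scan the JMeter results file for failed samples. The sketch below fabricates a two-line results file for illustration and assumes JMeter's default CSV output, where the eighth column is the `success` flag; check your own `results.jtl` header before relying on that position.

```shell
# Fabricated sample of JMeter's default CSV results (for illustration only).
cat > results.jtl <<'EOF'
timeStamp,elapsed,label,responseCode,responseMessage,threadName,dataType,success
1500000000000,120,Home Page,200,OK,Thread 1,text,true
1500000000500,95,Checkout,500,Internal Server Error,Thread 1,text,false
EOF

# Count data rows whose "success" column (field 8 in the default layout) is false.
failures=$(awk -F, 'NR > 1 && $8 == "false" { n++ } END { print n + 0 }' results.jtl)

if [ "$failures" -gt 0 ]; then
  echo "ALERT: $failures failed sample(s)"   # here you might mail the dev team instead
else
  echo "All samples passed"
fi
```

Against the fabricated file above, this prints `ALERT: 1 failed sample(s)`; swapping the `echo` for a `mutt` invocation turns it into the failure notification described earlier.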
Do you use automated TMS deployment testing? For more information, contact us at firstname.lastname@example.org