USAJOBS’ Analytics Success: Using Analytics to Create Accurate Testing Strategies
Accurate testing strategies are crucial to ensuring quality products. High-fidelity approaches ensure QA efforts test in a true-to-life manner, mirroring real-world users. Inaccurate, low-fidelity testing can miss situational bugs that become showstoppers in production. USAJOBS is leveraging the Digital Analytics Program (DAP) to build high-fidelity testing strategies that mimic production site usage as closely as possible.
Previously, USAJOBS testing was done on whatever devices, browsers, and operating systems were available at the time. Testers were thorough, but because they weren’t using true-to-life hardware and software, some gaps slipped through into production. The team was testing browsers, OS versions, and devices at rates disproportionate to real-world site usage.
The team leveraged DAP to create a real-time dashboard tracking multiple key performance indicators. The dashboard was a live feed of the testing effort and was displayed continuously: testers could see which pages they were hitting, pageview activity, and device, browser, and OS use. A retrospective meeting then compared test-environment data against production data, and through this discussion USAJOBS surfaced gaps between testing and production. From this, the team crafted a strategy to inform the next testing event, held about a month later for the next release.
In preparation for the next release, the testing lead briefed the team on the discrepancies between testing and production site use. Armed with that knowledge, the team tested the site in a more true-to-life manner, and the data reflects it. The charts below show OS data for two testing periods—a previous testing event on August 4th and the most recent testing event on August 25th. Prior to data-driven testing, the team was heavily over-representing Windows while under-testing iOS, Android, and Mac, all of which constitute a significant segment of the user population. The second, data-driven testing effort tracked production usage much more closely, a trend visible across OS, browser, and device testing alike.
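As a rough illustration of the kind of comparison the team made between test traffic and production traffic, the short sketch below contrasts OS shares from a testing event against production shares and flags over- and under-tested platforms. The percentages and category names here are hypothetical placeholders, not actual USAJOBS or DAP figures.

```python
# Hypothetical OS-share data (percent of sessions); illustrative only,
# not actual USAJOBS/DAP numbers.
production = {"Windows": 48.0, "iOS": 22.0, "Android": 16.0, "Mac": 14.0}
test_event = {"Windows": 85.0, "iOS": 5.0, "Android": 4.0, "Mac": 6.0}

def coverage_gaps(prod, test):
    """Return each OS's test share minus its production share, in pct points."""
    return {name: round(test[name] - prod[name], 1) for name in prod}

gaps = coverage_gaps(production, test_event)
for name, delta in sorted(gaps.items(), key=lambda kv: kv[1]):
    label = "under-tested" if delta < 0 else "over-tested"
    print(f"{name}: {delta:+.1f} pct pts ({label})")
```

A negative delta marks a platform the test effort under-represents relative to real users; after a retrospective, the next event’s tester assignments can be rebalanced to shrink those deltas.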
Through the implementation of DAP, the USAJOBS Program Office has improved its ability to test USAJOBS.gov. Testing is now more thorough and realistic, resulting in a better-tested site and fewer bugs identified post-release—a very positive impact on USAJOBS.gov.
Cory Benavente is an OPM PMO Team Member and Dywane Boyd is the Testing Coordinator in the OPM Office of the Chief Information Officer. This post is part of an ongoing series of case studies highlighting how the federal government is using the Digital Analytics Program (DAP) to improve websites for the end user. If you have an analytics success story to share, or want to get your federal government site participating in DAP, please contact us via email.