The Road to Better Websites Gets Easier with Usability Walkthroughs

May 21, 2014

The road to more user-friendly government websites does not have to be long and scary. In fact, there is a growing network of people and resources to guide you along the way. My office in the National Oceanic and Atmospheric Administration (NOAA) has been fortunate enough to benefit from some of this support, most recently in the form of a “usability walkthrough.”

Where the Road Begins

We had just completely redesigned and relaunched our website, response.restoration.noaa.gov, in a new content management system in January 2012. While the redesign brought huge improvements, as with many websites, we soon began to get feedback from various sources that it could be better. Our first major step was conducting a user satisfaction survey. This helped us figure out who our users were, why they visited our website, and how easily (or not) they could find what they were seeking. Combining this information with our website analytics, we could start piecing together which areas to target for improvement. But it didn’t give us enough detail about exactly which pages and parts of our website needed this extra attention. Thanks to some of the webinars and best practices on usability hosted here on DigitalGov, we began thinking about doing some user testing. At this point, however, “user testing” was still a scary, foreign concept best left to the professionals.

Teaching Us to Fish

In early 2014, we called on the professional usability experts at the DigitalGov User Experience Program at the U.S. General Services Administration (GSA). While their focus was shifting from primarily doing usability testing themselves to teaching others to do it, they offered to ease us into the process by performing a “usability walkthrough” of our website. Our usability walkthrough involved several experts from GSA, me (representing our Web team), and one hour of live feedback as the experts tried to use our website to complete specific tasks. Beforehand, we had identified a range of “top tasks” our users need to accomplish while on our website. These tasks ranged from downloading key software programs to finding out when one of our popular trainings would next be offered. For each task, we created a typical user scenario and mapped the multiple ways we expected someone to navigate our website to complete it.

A Digital Walk

During the hour of live feedback, one expert who had never before seen our website walked through our top tasks one by one, checking them off as she thought she had accomplished them. Watching remotely, I could see her mouse travel across our website, clicking from page to page, and I could hear her thinking aloud as she explained why she navigated through our website the way she did. I silently cheered as she found the right pages or programs, and bit my tongue, scribbling notes, as she missed navigation and features meant to direct her to her goals. Next, another expert shared feedback from her own earlier attempt at the same tasks. Together, they also gave general feedback on the overall site, both what we were doing right and what we could do better. One major, but easy-to-implement, recommendation was to cut back on text and move important calls to action (e.g., download links) higher up on the screen into the user’s initial field of view. After the usability walkthrough, the team gave us a report summarizing their feedback, recommending top usability improvements, and pointing us to other resources for learning more. They also recorded the session and sent us a link to the video so others on my team could learn from the experience.

Making Things Happen

Based on these suggestions, our Web team reviewed some of our most visited pages. In some cases, we realized we were missing obvious calls to action; for example, a software product’s main page lacked a link to download that software. In another case, one of the most downloaded files on our website is an educational game called the “Water Cycle Game.” Most of the traffic to this page comes from search, but the game’s files were pushed low on the page beneath big paragraphs of background text. To address this, we shortened the first paragraph, moved the call to download the game higher up, and added sub-headings and bullets to make the text easier to scan (see red boxes below).

Another key area the team targeted for us to improve was a page pointing to our major software, publications, and other resources for professionals who work in oil and chemical spill response. It was dense with links and information and lacked the white space needed to scan quickly. This is what it looked like before:

We came up with six potential layouts to address these issues. We put these options before our broader Web team and noted their feedback on each layout. We ultimately settled on a layout that breaks the single page into two (separating oil spill response tools from chemical spill response tools), organizes the resources by type (software vs. publications), and uses bullets to add spacing between the items listed. See the results below:

While we’ve tackled a lot already, our team still has more work ahead. Currently, we are adapting our website to a responsive design, which will make our site easier to view on mobile devices. During this process, we are incorporating additional feedback from the usability walkthrough, particularly the suggestions directed at our homepage. And thanks to a little help from our friends in user experience, our journey to a more user-friendly website has started looking more like a walk in the park.

Ashley Braun is a Web Editor at NOAA’s Office of Response and Restoration. For more information on usability testing, visit the DigitalGov User Experience Program page or join the DigitalGov User Experience Community of Practice.