At the U.S. Agency for International Development (USAID), our new open data policy will begin making more Agency-funded data broadly accessible to the public. It completely changes the way we do business, and it also means that in the coming years, the amount of data we host on our open data website (known as the Development Data Library) will dramatically increase. So the question is: when we’re done overhauling our website, how will users make sense of all that information to find exactly what they’re looking for?
Figure Out the Essential Features and Save the Flashy Bits for Later
It’s easy to want to keep adding features and make a website that’s flashy and shiny, but more features means more complexity – which may make the site harder to use. Moreover, we don’t want to waste our time building a product that few people will actually use.
To help figure out exactly what users need in an open data website, we conducted a usability study. We gathered a group of ten USAID employees and stakeholders, virtually and in person, to examine our current site and a mockup of a next-generation site. We asked them to complete a series of common tasks, like searching for datasets by topic, project name, or geographic region, as well as trying to make sense of what a given dataset actually contains. As they worked with the website, we observed their interactions and followed up with a discussion about their experience, trying to understand the pain points.
Take the Evidence and Prioritize Next Steps
Some of the insights—such as how users want to be able to both search for very specific data as well as explore based on topic—we had already highlighted as being important. But the usability study helped build the evidence base for these decisions and provided data about exactly how important these features are—rather than relying on our gut feeling. Moreover, we now have a greater understanding of exactly how users want to look for information on the site. For instance, it became abundantly clear that users want to preview data to see whether the information is relevant to them, before they commit to downloading the data or going to an external site to learn more about the dataset. That changes our priorities in adding features to the new site.
Iterate, Iterate, Iterate
As we were evaluating our results, we wanted to highlight a few lessons we learned in designing a usability study:
- Observe—don’t help! As someone with intimate knowledge of how the site is “supposed” to work, it’s hard to sit back and watch people struggle—particularly when they ask you direct questions. But deferring those questions until the wrap-up discussion is worth it. For instance, we saw all the workarounds people tried when their preferred method didn’t work. Also, be aware of your own biases: make sure that your next steps are based on users’ feedback, not your own preconceptions. For that reason, it’s useful to have more than one notetaker, including someone with limited experience with the site. They’ll catch comments the experts will miss.
- Preparation pays off. Before the event, we did a demo run with a couple of colleagues, which was extremely helpful. Since I’d never run a usability study, I was unsure whether the tasks we wrote were appropriate, clearly phrased, or allotted the right amount of time. In the dry run, we were able to iron out those kinks.
- In person is better, but virtual is better than nothing. Many of our participants completed the tasks at their desks, using a Google form to record their comments, and then joined a conference call discussion afterwards, since that was less disruptive to their day. While we got more information from observing people who navigated the site in person, the virtual discussion was still helpful and allowed more people to participate.
- The GSA team is extremely helpful—use them. A huge thanks to Jonathan Rubin at GSA for his support and for providing us with a space and reliable wireless Internet access. The team goes out of its way to promote best practices in usability, so take advantage of their resources and enthusiasm.
Our study was small, and it would be great to go deeper and get more insight from members of the public who aren’t as familiar with USAID’s programs. But it’s a start—and design should be iterative. We look forward to showing you a new and updated site in the future that incorporates the feedback from this usability study. And then the process starts all over again, as we test out how the new site could be further improved.
Laura Hughes is an Open Data Specialist at USAID. She’s happy to tell you more about exactly how she ran the study or discuss other digital topics via email.
Have feedback or questions? Send us an email at firstname.lastname@example.org