Getting to Know Your Users: Tips and Tricks from Veterans Affairs
Design research isn’t rocket science. But for many of us in the federal government, it can seem daunting and unfamiliar. We’re here to help demystify the process of design research for those of you ready to wade into the waters.
We’ve both done our fair share of design research at the Department of Veterans Affairs (VA) over the past year. It hasn’t been easy—we’ve worked under itty-bitty budgets and crazy timelines. Many doubted the value of the work or hadn’t been exposed to the design research approach before. Many equated research with focus groups, or demographic breakdowns of user data. But the need to bring rich insight into veterans’ experiences with VA was strong—so getting through the hurdles was worthwhile.
1. Getting ready: Figure out what you’re trying to figure out.
You’ll quickly find yourself overwhelmed if you try to answer everything you might want to know. What’s the big ‘what’ you’re trying to uncover? If you feel overwhelmed by this, start by brain dumping all the kinds of questions you might ask someone…then pull back and try to organize those questions into categories. Chances are you’ve got a couple of big themes right there staring back at you.
What we did at VA:
VA wanted to better understand their customers’ perceptions of how well VA services were fitting into their lives and meeting their needs and expectations. During the first phase of our research, we outlined two goals:
- To test the usefulness and application of a human-centered design (HCD) methodology within the context of VA.
- To better understand veterans’ experiences interacting with VA, identify pain points in the present-day service delivery model, and explore opportunities to transform these interactions into a more veteran-centered experience.
Six months later, when we wanted to build on the first research sprint, we continued to pursue the first goal and broadened the second:
- To validate and expand upon previous HCD efforts.
- To create a deep understanding of VA’s customers.
2. Not-so-sample size: You don’t need to talk to everyone.
Human-centered design research intentionally surfaces ‘thick’ data over ‘big data,’ which is to say: it takes small numbers of users (between 20 and 100) and delves deeply into their human needs, desires, motivations, and behaviors. Whereas quantitative studies often draw from large sample sizes to uncover statistically significant data, and focus groups and online feedback forms gather customer opinions, ethnography and design research are intended to complement this information with a deeper understanding of the nuanced and complex lives of customers.
A quick note: why 1:1 in-depth interviews vs. focus groups? Because you learn different things in one-on-one interviews. Think of talking to a high schooler about picking a college while sitting next to their parent, or a group of peers, versus having this conversation with them, by themselves. The conversation will be different.
People are influenced by many cognitive biases. In focus groups, we’re particularly influenced by social desirability bias—the tendency to over-report what we think people want to hear, and under-report what we think people don’t. The more direct route to get to an understanding of what people are really thinking and feeling is to speak with them individually, or on occasion, with a trusted family member or close friend if they are particularly reticent to speak 1:1.
What we did at VA:
We reached out to female and male veterans from all branches and eras of service, from various socio-economic levels, and in urban, suburban and rural environments across the United States. We were looking for as much diversity as we could find.
3. Finding users: Go where your people are.
Which is to say: look for the buckets of people you’re interested in connecting with. Go to schools. Go to churches. Go to libraries. Go to community centers. Reach out through networks of people that share an affinity with your research goals. Create a screener: a short series of questions to confirm that a potential interviewee’s experiences will lend themselves to your research goals (a Google form works well if you’re reaching out digitally). When setting up an interview schedule, allow 60 to 90 minutes for each interview, plus 30 minutes between sessions to debrief and capture high-level themes from your notes while they’re fresh in your mind.
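If you’re booking back-to-back sessions, the timing math above is easy to script. Here’s a minimal sketch; the function name, times, and interviewee names are hypothetical, not part of our VA process:

```python
from datetime import datetime, timedelta

def build_schedule(start, interviewees, interview_min=90, debrief_min=30):
    """Lay out back-to-back sessions: each interview is followed by a
    debrief block for capturing themes while they're fresh."""
    schedule = []
    t = start
    for name in interviewees:
        schedule.append((name, t, t + timedelta(minutes=interview_min)))
        t += timedelta(minutes=interview_min + debrief_min)
    return schedule

# Hypothetical day of three interviews starting at 9:00 a.m.
day = build_schedule(datetime(2015, 6, 1, 9, 0), ["Alex", "Blake", "Casey"])
for name, begin, end in day:
    print(f"{begin:%H:%M}-{end:%H:%M}  {name}")
```

Three 90-minute interviews plus debriefs fill the morning through 2:30 p.m., which is a useful sanity check before you promise people time slots.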
What we did at VA:
We know that veterans build and maintain strong networks with their fellow service members both during and after service. We went through trusted networks of service members and community organizations that support service members to do our recruiting. While it’s common to offer incentives to research participants in the private sector, as government employees we don’t have that option. That’s ok. We found plenty of people who were willing and happy to talk with us. They were glad VA wanted to understand them better. For both research sprints, we had a combination of scheduled interviews plus those we picked up on the fly. Spontaneity happens. And if you can accommodate it, do.
4. Interviewing: Ask open-ended questions, and do lots of listening.
Create an interview guide. Chances are you’ll reference it heavily for the first few interviews in any research sprint, and as you get into the swing of things, you’ll use it less and less. That’s good. But taking time up-front to think through the questions that address your goals gives you a helpful framework. Asking open-ended questions is important, because it leaves people room to answer in myriad ways, and to tell you why they feel what they feel. Consider doing a version of the 5 Whys to go deeper into people’s feelings and perceptions. Listen deeply. Don’t be afraid of a moment of silence. Leaving a pause in the conversation often leads to further sharing from your interviewee.
What we did at VA:
A pair of researchers sat down with each interviewee in their home, their workplace, or a public setting in which they felt comfortable meeting. One of the researchers set the context by explaining what we were seeking to learn, had each participant sign a release form, and then asked the interviewee a series of open-ended questions. While we had developed a script beforehand, we followed the conversation where it went. The researcher asking the questions maintained full focus on the interviewee, while the second researcher took notes by hand, capturing as much of the conversation, verbatim, as they could. They were listening for key quotes that hinted at themes. It’s common to audio-record interviews, transcribe them, and listen back for things that may have been missed in the moment. This is a great practice, but one we knew our timelines wouldn’t allow. We had to be especially vigilant in the moment to capture as much as possible.
5. Making sense of it all: Buy a lot of Post-It notes and dig for patterns.
The process of analyzing your research findings, often called the ‘synthesis’ phase, is more art than science. Synthesis often takes place in two parts: continuously in the field, analyzing and organizing data throughout the investigation, and then with a multi-week analysis period at the end of the research stage. This work consists of abstracting interesting insights away from individual users and looking for patterns and commonalities across different users’ experiences. Using the raw data from your interview notes, have each person on your team comb through them looking for big themes, sub-themes and quotes that exemplify each. Write one per sticky, get yourself to a large wall and start posting—moving things around as you find patterns that uncover previously hidden relationships. Use your analytical and your intuitive smarts as you create an aggregate picture of what you’ve learned.
What we did at VA:
We did nightly synthesis on the road, from restaurant lounges to hotel rooms. As we created our collages of sticky notes, we spent time sharing stories, asking questions and beginning to generate initial hypotheses. We followed this up with weeks of a fuller synthesis once we were off the road. Along the way, we experimented with different offline and online methods to capture what we were learning as we went. We carried our paper ‘sticky collages’ from city to city. We experimented with snapping photos of our nightly sticky collages with the Post-It Plus app. And we mapped each evening’s notes into an Excel file organized by date, location, theme, sub-theme, quote, interviewee (first name only), branch of service and era of service. Once we got home, we re-arranged all the stickies into new collages of themes and sub-themes. Going through the raw data so many times increased our confidence in the validity of our insights.
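The spreadsheet structure described above is simple to mimic in code: each sticky note becomes one record, and grouping records by theme and sub-theme reproduces the re-sorted collages on the wall. A hypothetical sketch follows; the field names mirror our Excel columns, but the data itself is invented for illustration:

```python
from collections import defaultdict

# Each sticky note becomes one record, mirroring the spreadsheet columns:
# date, location, theme, sub-theme, quote, interviewee, branch, era.
notes = [
    {"date": "2014-10-02", "location": "Denver", "theme": "navigation",
     "subtheme": "paperwork", "quote": "I never know which form I need.",
     "interviewee": "Sam", "branch": "Army", "era": "post-9/11"},
    {"date": "2014-10-03", "location": "Austin", "theme": "navigation",
     "subtheme": "phone trees", "quote": "I gave up after the third transfer.",
     "interviewee": "Lee", "branch": "Navy", "era": "Gulf War"},
]

# Re-sort every note into theme/sub-theme piles, the digital equivalent
# of re-arranging the sticky collages after the road trip.
collage = defaultdict(list)
for note in notes:
    collage[(note["theme"], note["subtheme"])].append(note["quote"])

for (theme, subtheme), quotes in sorted(collage.items()):
    print(f"{theme} / {subtheme}: {len(quotes)} quote(s)")
```

Keeping the raw quotes attached to their theme piles like this makes it easy to pull exemplar quotes back out when you build the final narrative.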
6. Bringing it home: Craft the story and find the right format.
At the end of the day, you are a designer. (Yes, you—all of you.) You have the power—and responsibility—to create something meaningful, engaging and useful out of this sea of information you’ve just amassed. Your stakeholders—fancy bosses and busy agency partners and stretched-thin product managers—don’t need all of the beautiful insights you’ve gleaned. They need a story about what you learned. You need to take the themes you pulled out of the research and build a narrative. At this point, you’ll start to think about the shape your story will take. Are you sharing specific insights about trends in your user groups? Then perhaps personas—or archetypes of your customers—will help you. Or are you trying to communicate pain points in a user’s path through your service? A customer journey map might do the trick. These are just a few of the formats for capturing the insights.
What we did at VA:
We created research reports driven by visual design. We didn’t want to contribute to the overpopulation of PowerPoint documents in government. We did want enough ink to tell our stories in a way that captured people’s emotions and imaginations and stood on its own. So, we created narrative arcs for the documents: we shared the human-centered design methodology, key insights (themes, sub-themes and relevant quotes), portraits and a journey map of the present-day service experiences, personas, opportunities and ‘how might we’ questions [PDF]. The work has been enthusiastically embraced by the VA Administration in more profound ways than we imagined and is playing a part in driving positive organizational change. More research, here we come!
Additional resources:
- Design research at the Department of Veterans Affairs
- Usability.gov’s user research methods
- IDEO’s HCD toolkit
- Jan Chipchase’s research methods
- Erika Hall’s Just Enough Research
- Steve Portigal’s Interviewing Users

Sarah Brooks is a service designer on the Veterans Experience team at VA and a Presidential Innovation Fellow. Mollie Ruskin is a service designer at USDS. She served as a Presidential Innovation Fellow at VA in 2013–2014.