This blog post is adapted from a talk at the 2018 Government UX Summit at the Bureau of Labor Statistics.
There are many ways to adapt agile to work with user experience (UX) – but it’s not always clear just how to tweak the process. That’s where experiments come in. Our team of designers and developers at the Federal Reserve Board (FRB) has been subtly adjusting the agile process to work better for us, and we think others can learn from our experimentation.
To understand our experience, it’s also important to share context about our team. It is made up of three designers and three developers, and we all work in-house. Typically, we work on one project at a time, but we also spend time on maintenance of projects we’ve already built. Because our team is half designers, we deeply care about the process of design, but we also want to work in a way that is fluid, flexible, and focused on building as soon as possible.
We don’t consider ourselves experts, but we want to share what we’ve tried, what’s failed spectacularly, and what’s helped us get more in sync.
Experiment #1: Incorporating Sprint Zero
Throughout our projects, we’ve felt a tension between the design process and a pressure to move fast. As a design-driven team, we didn’t want to shortchange our research, but we also knew the developers needed more tasks to tackle. These goals resulted in incredibly ambitious and overreaching early sprints.
We’ve tried a few variations on “sprint zero.” While agile experts describe sprint zero in various ways, our sprint zero is meant to help designers get a head start on the design process. We find that sprint zero can be a time for designers to scope the project, complete initial research, and help create a backlog of tasks and user stories before bringing developers onto a project full-time. Sprint zero also benefits developers because it gives them time to work on small, straightforward projects or infrastructure tasks in the backlog.
While we’re still evaluating the sprint zero concept for our team, we already have some lessons learned.
We know that tension can be healthy. The pull of the agile process on design means that we need to ensure that our research phase isn’t too sprawling. We need to continually ask ourselves what type of research the project truly requires—a full-fledged set of insights, buy-in from stakeholders, or something else entirely?
Like any agile team, we’re also striving to prototype as quickly as possible. As designers, it’s important to prioritize tasks that we can hand off to developers as soon as possible. If we know we have a feature for the developers to work on, we can hand that off while we’re still figuring out the rest of the system. In other words, we’re striving to do just enough research to start designing, and just enough design to start building.
Experiment #2: Bringing Devs into the UX Process
We were surprised to learn that our developers didn’t realize a traditional approach to agile has some conflicts with the design process. We knew that we needed to figure out how to better involve developers in the design process so that they could better understand our work and its relevance to their roles and the products we deliver. We tried a variety of approaches that might work for your team, too.
Designers can encourage developers to take part by incorporating design tasks that put them in a supporting, observational role during the research phase of a project. Depending on the interest and preferences of your teammates, this role can expand and change. By doing this, we realized that the more we understand each other’s roles, the better prepared we are to deal with unknowns. We know that life happens, and if a designer is unexpectedly out of the office, we can always count on a developer to help support design needs. Alternatively, sometimes a project is really design heavy, and a dev needs to be involved early to understand context, see the big picture, and communicate with stakeholders.
This experience was also a time to practice what we preach. As designers, we preach empathy, but how often do we practice it on our own team? Seeing a teammate experience something as a beginner has increased our empathy and built understanding—not just for the users we’re testing or designing with, but for our own team.
If a sprint zero won’t work for your team, involving developers in the design process is also a good alternative.
Experiment #3: Skipping Estimation
Most sprints start with a rote estimation of tasks. Our team was using a Fibonacci sequence to help estimate tasks, but we started to question its value. The numbers sometimes felt arbitrary, and we wrestled with the notion that the numbers are not hours. We also noticed that our design tasks took up a similar amount of time for each project, and it felt like a waste of time to estimate standard tasks, like writing a research guide and conducting individual interviews. So, we decided to eliminate estimation for a few sprints—and we got mixed results.
Instead of estimating, our team created goals for each sprint, and we relied more heavily on achieving an overall goal, whether for the team or an individual owning a goal or a particular subset of a goal. Our sprint goals helped keep things in check. Our goals helped us ensure our expectations for the sprint were reasonable and achievable and that our tasks supported the goals we wanted to achieve. We found that creating goals for the sprint kept our team motivated, while also reducing the tendency to slip back into a waterfall approach.
While it felt good as a team to do away with estimation and try something different, we realized that estimation serves an important purpose. By not estimating, we lost some of the transparency that comes with fully talking through the tasks and ensuring they’re being worked on every day. We found that estimation can serve as a signal of when it’s time to ask for help and partner up. When we weren’t estimating, it was more difficult to identify if a task was stuck and might benefit from another approach or perspective. Estimating as a team creates clarity because it establishes a shared understanding.
This experiment was useful, and we’re continuing to rethink how we estimate. We’re currently trying to estimate our tasks as small, medium, large, or extra-large. We’re hoping this version might simplify the estimation step while still achieving the transparency we missed. No matter what we try, our goal isn’t perfect estimates. We’re aiming for common understanding about what’s on people’s plates, and guarding against tasks that are too big and inhibit progress.
Experiment #4: Actually Incorporating Retrospective Learnings
We spent time as a team doing thoughtful and honest retrospectives, but often our insights sat in a document and never got incorporated into our next sprint. We asked ourselves how we could prevent our learnings from getting lost.
We decided to try a few things, starting with incorporating one new experiment each sprint. We looked through our past retrospectives to pinpoint where we had already identified opportunities. We brainstormed to figure out how to address our weaknesses with a new experiment. It was a positive and proactive approach the entire team could buy into.
Whether or not we decided to experiment based on a past retrospective, we started reviewing our old retrospectives during each sprint close. Reviewing our retrospectives more regularly helped us distinguish between a slight annoyance during a particular sprint and an ongoing pattern we needed to address.
We also started thanking each other. In addition to reflecting on the previous sprint, we wrapped up our final retrospective of the process by thanking one person for something meaningful they did on the project. By making learnings personal, they also become more memorable.
We are interested in hearing the challenges you have had in incorporating agile and UX into your process. What experiments have you tried? What has worked well and what has failed? Use the email link below to let us know. If you have a .gov or .mil email address, you can also join the User Experience Community of Practice.
Have feedback or questions? Send us an email »