This is a guest post by Jon Innes of UX Innovation LLC.
Those of us working in agile teams have come to appreciate many of the advantages of the agile way:
- Capturing simple requirements in the form of user stories
- Working in short iterations to create small but useful improvements in functionality
- Collaborating closely in cross-disciplinary teams
However, one waterfall practice that seems to persist even in the most enlightened agile teams is the idea that testing with users is something that only happens at the end.
Finally, this seems to be changing. UX practitioners have been promoting the idea that evolving the design of your product and learning about your target market should also be approached in an agile fashion. The idea, recently popularised by Eric Ries in his bestselling book The Lean Startup, has spread, and many teams now realise that applying agile methods to development alone is insufficient. You need to treat the greater effort of creating value for customers as an experiment and apply the same iterative process to refine your ideas into a successful product.
When applying agile to product design, teams often hit an obstacle: how to collect user feedback on an agile timeline. Think about how an agile team approaches traditional QA. The most successful agile developers leverage a technique known as test-driven development (TDD). Rather than starting by writing a lot of complex code, they start by writing simple tests, using tools specifically designed for TDD like Cucumber or RSpec.
The idea is that by thinking about new functionality in a testable way first, and writing tests that capture the small details as they work, developers end up writing much better code. They also enjoy the benefit of having captured the tests early in an automated fashion, freeing themselves to focus on coding new functionality instead of spending a lot of time manually testing and retesting everything after every minor change. This allows them to stay in a state of creative flow and become much more efficient, or, as we say in agile circles, lean.
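To make the test-first idea concrete, here is a minimal sketch of TDD in Ruby. It uses Minitest from Ruby's standard library rather than RSpec so it runs with no setup; the Cart class and its methods are hypothetical, invented purely for illustration. In practice you would write the failing tests first, then add just enough implementation to make them pass.

```ruby
require "minitest/autorun"

# Hypothetical implementation, written only after the tests below
# were sketched out and seen to fail (the TDD red-green cycle).
class Cart
  def initialize
    @items = []
  end

  def add(price)
    @items << price
  end

  def total
    @items.sum
  end
end

# Tests capture the small behavioural details up front, so later
# changes can be re-verified automatically instead of by hand.
class CartTest < Minitest::Test
  def test_empty_cart_totals_zero
    assert_equal 0, Cart.new.total
  end

  def test_total_sums_item_prices
    cart = Cart.new
    cart.add(5)
    cart.add(7)
    assert_equal 12, cart.total
  end
end
```

The payoff the article describes comes from the second half: once these tests exist, every subsequent change to Cart is checked automatically, which is exactly the property the author wants for UX validation.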
Now you might ask, what has this got to do with UI design work? The same problem exists when thinking about UX. When designing the core functionality, it's important to think about what defines the minimum viable product and how you're going to test it. What are the key things users want from it? What are the things you absolutely can't take away before it's no longer valuable to them? These are the things you want to learn about your designs, not just at the very end, but as you iterate, as often as possible, and in an automated way that lets you focus on the new work while making sure what you're adding isn't breaking anything you did previously.
Most of you might be thinking that the answer lies in A/B testing, but you'd only be partially right. The problem is that most web analytics tools weren't designed with this purpose in mind. They make it easy to capture gross metrics, often referred to as vanity metrics, like the number of users hitting a page. Some teams try using these tools for user research, but doing so is much like using traditional QA tools for TDD. Most web analytics tools were designed for advertising, not for product design. If you started your agile process by defining user stories and designed your site to support those stories, wouldn't it make sense to use a tool specifically designed to test those stories and what users experience when completing them?
The good news is that such tools actually exist. Even better, this isn't a tool like RSpec that requires you to learn a programming language, figure out how to install it, and configure it before you can get started. You can use remote UX research tools, like UserZoom. UserZoom is a software solution for remote and cost-effective UX research, designed for UX work by UX experts. It's the right tool for the job, or more correctly the right "service" for the job. As a service, it takes care of the details so you can focus on designing a great experience for your users. You enter your user stories as tasks, along with other related information, into easy-to-use web forms, and UserZoom runs the studies in an automated, repeatable, scalable fashion. UserZoom will automatically run studies on your existing or new functionality on a regular basis, so you and your team can focus on improving the results. That means you can focus on refining your minimum viable product or finding other new ways to add value.
By automating your user testing with UserZoom, you not only save time by avoiding all the logistics associated with traditional usability testing, you retain most of the advantages of a traditional usability test. The service makes it easy to run studies on quick-and-dirty prototypes or mockups, and it allows you to capture user feedback in both freeform and structured formats via task-specific surveys if you so desire. That means you can now fit many types of traditional user research into the short timelines of agile sprints with very little effort. This makes every sprint more productive, as you can continuously validate both existing and new designs, even when all you have is a wireframe or a set of tab names.
Because remote UX testing allows you to collect targeted UX data, it's not just more precise, it's more actionable. Most web analytics tools don't let you analyse task flows easily. Nor do they let you easily manipulate one of the most important variables in your studies: the type of user (market segment). That last one is important if your goal is to study the needs of users who currently don't visit your site. UserZoom, for example, also makes it easy to do competitive UX benchmarking, so you can objectively determine how your site compares to your competitors'.
Smart companies have always done UX benchmark testing of their products. They regularly compare usability metrics against baselines to make sure the basics still work for their existing customers, immunising themselves against feature creep. They also know that tracking the impact of design changes helps them execute on the strategy of creating a minimum viable product for existing users, and lets them prioritise functionality for new users more effectively. That includes having objective data on where competitors are gaining an edge on them. The next time your team grooms its backlog, ask how you are going to test whether your stories are done before you start, because doing otherwise won't help you learn faster; it will just slow you down.