Tuesday, January 18, 2011

Case Study: UX, Design, and Food on the Table

(One of the common questions I hear is how to reconcile design and user experience (UX) methods with the Lean Startup. To answer, I asked one of my favorite designers to write a case study illustrating the way she works, taking us step-by-step through a real-life redesign.

This is something of an IMVU reunion. The attendees at sllconf 2010 were wowed by Food on the Table's presentation. If you weren't there, be sure to watch the video. Manuel Rosso was IMVU's first VP of Marketing, and is now CEO of Food on the Table, one of the leading lean startups in Austin. I first met Laura Klein when we had the good fortune of hiring her at IMVU to join our interaction design team. Since then, she's gone on to become one of the leading experts implementing UX and design in lean startups. 

In this case study, Laura takes us inside the design process in a real live startup. I hope you'll find it illuminating. -Eric)

A lot of people ask me whether design fits into the lean startup process. They're concerned that if they do any research or design up front, they will end up in a waterfall environment.

This is simply not true. Even the leanest of startups can benefit from design and user research. The following is a great example of how they can work together.

A couple of months ago, Manuel Rosso, the CEO of Food on the Table, came to me with a problem. He had a product with a great value proposition and thousands of passionate customers. That wasn't the problem. The problem was activation.

As a bit of background, Food on the Table helps people plan meals for their families around what's on sale at their local grocery stores. The team defined an activated user as someone who made it through all the steps of the first-time user experience: selecting a grocery store, indicating food preferences, picking recipes, and printing a grocery list.

Users who made it through activation loved the product, but too many first-time users were getting lost and never getting all the way to the end.

Identifying The Problem

More than any startup I've worked with, Food on the Table embraces the lean startup methodology. They release early and often. They get tons of feedback from their users. And, most importantly, they measure and a/b test absolutely everything.

Because of their dedication to metrics, they knew all the details of their registration funnel and subsequent user journey. This meant that they knew exactly how many people weren't finishing activation, and they knew that number was higher than they wanted.
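
To make that kind of funnel instrumentation concrete, here's roughly what the arithmetic looks like. This is a minimal sketch in JavaScript: the step names mirror the activation flow described above, but the counts are invented for illustration and are not Food on the Table's actual numbers.

```javascript
// Illustrative only: step names follow the activation flow described above;
// the counts are made-up example numbers, not real Food on the Table data.
const funnel = [
  { step: 'signed up',            users: 1000 },
  { step: 'selected store',       users: 820 },
  { step: 'set food preferences', users: 700 },
  { step: 'picked recipes',       users: 430 },
  { step: 'printed grocery list', users: 310 }, // "activated"
];

// Report conversion from each step to the next, so the biggest drop-off is obvious.
for (let i = 1; i < funnel.length; i++) {
  const prev = funnel[i - 1];
  const curr = funnel[i];
  const rate = (100 * curr.users / prev.users).toFixed(1);
  console.log(`${prev.step} -> ${curr.step}: ${rate}% continued`);
}

const overall = (100 * funnel[funnel.length - 1].users / funnel[0].users).toFixed(1);
console.log(`Overall activation: ${overall}%`);
```

Knowing exactly where the biggest drop-off happens is what the metrics buy you; as the rest of this story shows, knowing why it happens takes something more.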

Unfortunately, they fell into a trap that far too many startups fall into at some point: they tried to measure their way out of the problem. They would look at a metric, spot a problem, come up with an idea for how to fix it, release a change, and test it. But the needle wasn't moving.

After a couple of months, Manuel had a realization. The team had always been dedicated to listening to users. But as they added new features, their conversations with users had changed: they became more narrowly focused on new features and on whether each individual change was usable and useful. Somewhere along the way, they'd stopped observing the entire user experience from end to end. This lapse didn't last very long, maybe a month or two, but it was long enough to cause problems.

As soon as he realized what had happened, Manuel went back to talking directly to users about their overall experiences rather than just doing targeted usability tests, and within a few hours he knew what had gone wrong. Even though the new features were great in isolation, they were making the overall interface too complicated. New users were simply getting lost on their way to activation.

Now that they had a general idea of why users were struggling, Manuel decided he needed a designer to identify the exact pain points and come up with a way to simplify the interface without losing any of the features.

Key Takeaways:
  • Don't try to measure your way out of a problem. Metrics do a great job of telling you what your problem is, but only listening to and observing your users can tell you why they're having trouble.
  • When you're moving fast enough, a product can become confusing in a surprisingly short amount of time. Make sure you're regularly observing the user experience.
  • Adding a new feature can be useful, but it can also clutter up an interface. Good design helps you offer more functionality with less complexity.

Getting an Overview of the Product

When I first came on board, the team had several different experiments going, including a couple of different competing flows. I needed to get a quick overview of the entire user experience in order to understand what was working and what wasn't.

Of course, the best way to do that is to watch new and current customers use the product. In the old days, I would have recruited test participants, brought them into an office, and run usability sessions. It would have taken a couple of weeks.

Not anymore! I scheduled UserTesting.com sessions, making sure that I got participants in all the main branches of the experiments. Within a few hours, I had a dozen 15-minute videos of people using the product. The entire process, including analysis, took about one full day.

Meanwhile, we set up several remote sessions with current users and used GoToMeeting to run fast observational sessions in order to understand the experience of active users. That took another day.

Key Takeaway: Get feedback fast. Online tools like GoToMeeting and UserTesting.com (and about a hundred others) can help you understand the real user experience quickly and cheaply.

Low Hanging Fruit

Once we had a good idea of the major pain points, we decided to split the design changes into two parts: fixing the low-hanging fruit and making larger, structural changes to the flow. Obviously, we weren't going to let engineering sit on their hands while we made major design changes.

The most important reason to split the work was that some of the biggest problems for users were technically easy to fix and required almost no design input whatsoever.

For example, in one unsuccessful branch of a test, users saw a button that would allow them to add a recipe to a meal plan. When user-test participants in the office pressed the button, it added the recipe to the meal plan almost instantly, and they had no problem understanding it. When we observed users pressing the button on their own computers, over normal home broadband connections, the button took a few seconds to register the click.

Of course, this meant that users would click the button over and over, since they were getting no feedback. When the script returned, the user would often have added the recipe to their meal plan several times, which wasn't what they meant to do.

This was, by all accounts, a bad user experience. Why wasn't it caught earlier?

Well, as is the case at most software companies, the computers and bandwidth in the office were much better than the typical user's setup, so nobody saw the problem until we watched actual users in their natural environments.

What was the fix? We put in a "wait" spinner and disabled the button while the script was processing. It took literally minutes to implement and delivered a statistically significant improvement in the performance of that branch of the experiment.
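
If you want to picture the fix, it boils down to the pattern sketched below. The element IDs and endpoint are made up for illustration; this isn't the actual Food on the Table code, just the shape of the change.

```javascript
// Minimal sketch of the "disable + spinner" pattern. Element IDs and the
// endpoint URL are hypothetical, not Food on the Table's actual markup.
const addButton = document.getElementById('add-recipe');
const spinner = document.getElementById('add-recipe-spinner');

addButton.addEventListener('click', async () => {
  addButton.disabled = true;          // block repeat clicks immediately
  spinner.style.display = 'inline';   // give the user instant feedback

  try {
    await fetch('/meal-plan/recipes', { method: 'POST' }); // slow on real home connections
  } finally {
    spinner.style.display = 'none';
    addButton.disabled = false;
  }
});
```

Disabling the button is what prevents the duplicate submissions; the spinner is just there so users know their click was heard.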

[Image: Giving immediate feedback drastically reduced user error]

Manuel told me that, immediately after that experience, the team added a very old, slow computer to the office and recently caught a nasty problem that could add 40 seconds to page load times. Needless to say, all usability testing within the office is now done on the slowest machine.

Key Takeaways:
  • Sometimes big user problems don't require big solutions.
  • To truly understand what your user is experiencing, you have to understand the user's environment.
  • Sometimes an entire branch of an experiment can be killed by one tiny bug. If your metrics are surprising, do some qualitative research to figure out why!

A Redesign

While the engineering team worked on the low-hanging fruit, we started the redesign. But we didn't just chuck everything out. We started from the current design and iterated. We identified a few critical areas that were making the experience confusing and fixed those.

For example, we started with the observation that people were doing ok for the first couple of screens, but then they were getting confused about what they were supposed to do next. A simple "Step" counter at the top of each page and very clear, obvious "Next" and "Back" buttons told users where they were and what they should do next.

Users also claimed to want more freedom to select their recipes, but they were quickly overwhelmed by the enormous number of options, so we put in a simple and engaging way to select from recommended recipes while still allowing users to access the full collection with the click of one button.

[Image: Users were confused by how to change their meal plan]

[Image: Recommended recipe carousels made choosing a meal plan fun and easy to understand]

One common problem was that users asked for a couple of features that were actually already in the product. The features themselves were very useful and well-designed; they just weren't discoverable enough. By changing the location of these features, we made them more obvious to people.

Most importantly, we didn't just jump to Photoshop mockups of the design. Instead, we created several early sketches before moving to interactive wireframes, which we tested and iterated on with current users. In this case, I created the interactive wireframes in HTML and JavaScript. While they were all grayscale with no visual design, they worked. Users could perform the most important actions in them, like stepping through the application, adding meals to their meal plan, and editing recipes. This made participants feel like they were using an actual product so that they could comment not just on the look and feel but on the actual interactions.
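
To give a rough sense of how lightweight an interactive wireframe like that can be, here's a toy version of the step flow in HTML and JavaScript: a grayscale step counter with Next and Back buttons. The step names and markup are invented for illustration; this is not the actual prototype.

```html
<!-- Toy grayscale wireframe: three steps, a step counter, and Next/Back
     buttons. Step names and markup are invented for illustration. -->
<div id="counter"></div>
<div class="step">Step content: pick your grocery store</div>
<div class="step">Step content: choose recommended recipes</div>
<div class="step">Step content: review and print your grocery list</div>
<button id="back">Back</button>
<button id="next">Next</button>

<script>
  var steps = document.querySelectorAll('.step');
  var current = 0;

  function render() {
    // Show only the current step, update the counter, and disable
    // whichever navigation button would move past the ends of the flow.
    for (var i = 0; i < steps.length; i++) {
      steps[i].style.display = (i === current) ? 'block' : 'none';
    }
    document.getElementById('counter').textContent =
      'Step ' + (current + 1) + ' of ' + steps.length;
    document.getElementById('back').disabled = (current === 0);
    document.getElementById('next').disabled = (current === steps.length - 1);
  }

  document.getElementById('back').onclick = function () { current--; render(); };
  document.getElementById('next').onclick = function () { current++; render(); };
  render();
</script>
```

Something this crude is enough for participants to click through the flow and react to the interactions themselves, which is exactly what you want before any visual design or production code exists.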

By the end of the iterations and tests, every single one of the users liked the new version better than the old, and we had a very good idea why.

Did we make it perfect? No. Perfection takes an awful lot of time and too often fails to be perfect for the intended users.

Instead, we identified several areas we'd like to optimize and iterate on going forward. But we also decided that it was better to release a very good version and continue improving it, rather than aim for absolute perfection and never get it out the door.

The redesign removed all of the major pain points that we'd identified in the testing and created a much simpler, more engaging interface that would allow the team to add features going forward. It improved the user experience and set the stage for lots more iteration and experimentation in the future. In fact, the team currently has several more exciting experiments running!

Key Takeaways:
  • Interactive prototypes and iterative testing let you improve the design quickly before you ever get to the coding stage.
  • Targeting only the confusing parts of the interface for redesign reduces the number of things you need to rebuild and helps make both design and development faster.
  • Lean design is about improving the user experience iteratively! Fixing the biggest user problems first means getting an improved experience to users quickly and optimizing later based on feedback and metrics.

The Metrics

Like any good lean startup, we released the new design in an a/b test with new users. We had a feeling it would be better, but we needed to know whether we were right. We also wanted to make sure there weren't any small problems we'd overlooked that might have big consequences.

After the test had run for about six weeks and reached a few thousand people, we had our statistically significant answer: a 77% increase in the number of new users who made it all the way through activation.
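
If you're curious what "statistically significant" means in practice for a test like this, the check boils down to a two-proportion z-test. Here's a back-of-the-envelope sketch; the visitor and activation counts are invented to illustrate the shape of the calculation and are not our actual test numbers.

```javascript
// Two-proportion z-test sketch. The counts below are invented for the
// example; they are not Food on the Table's actual test numbers.
function zTest(convA, totalA, convB, totalB) {
  const pA = convA / totalA;
  const pB = convB / totalB;
  const pPool = (convA + convB) / (totalA + totalB);
  const se = Math.sqrt(pPool * (1 - pPool) * (1 / totalA + 1 / totalB));
  return (pB - pA) / se; // |z| > 1.96 is roughly p < 0.05 (two-tailed)
}

// Hypothetical: 2,000 users per branch, 20% vs. 35.4% activation (~77% lift).
const z = zTest(400, 2000, 708, 2000);
console.log(z.toFixed(1)); // ≈ 10.9, comfortably significant
```

Anything with |z| above roughly 2 clears the usual 5% bar; with a few thousand users and a lift this large, the result is clearly not a statistical fluke.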

My entire involvement with the project, covering the research, design, and usability testing, came to just under 90 hours spread over about six weeks.

Key Takeaway: Design, even a major redesign, can be part of an agile, lean startup environment if it's done efficiently, with plenty of iteration and customer involvement.



Laura Klein has been working in Silicon Valley as both an engineer and a UX professional for the last 15 years. She currently consults with lean startups to help them make their products easier to use. She frequently blogs about design, usability, metrics, and product management at Users Know. You can follow her on Twitter at @lauraklein.