Sunday, January 30, 2011

Lean Startup junkies

Offered without comment. Enjoy.

Warning: NSFW



(I did not create this video and I have no idea who did. Whoever you are, get in touch and I will heap copious praise upon you.)

Tuesday, January 18, 2011

Case Study: UX, Design, and Food on the Table

(One of the common questions I hear is how to reconcile design and user experience (UX) methods with the Lean Startup. To answer, I asked one of my favorite designers to write a case study illustrating the way they work, taking us step-by-step through a real-life redesign.

This is something of an IMVU reunion. The attendees at sllconf 2010 were wowed by Food on the Table's presentation. If you weren't there, be sure to watch the video. Manuel Rosso was IMVU's first VP of Marketing, and is now CEO of Food on the Table, one of the leading lean startups in Austin. I first met Laura Klein when we had the good fortune of hiring her at IMVU to join our interaction design team. Since then, she's gone on to become one of the leading experts implementing UX and design in lean startups. 

In this case study, Laura takes us inside the design process in a real live startup. I hope you'll find it illuminating. -Eric)

A lot of people ask me whether design fits into the lean startup process. They're concerned that if they do any research or design up front, they will end up in a waterfall environment.

This is simply not true. Even the leanest of startups can benefit from design and user research. The following is a great example of how they can work together.

A couple of months ago, Manuel Rosso, the CEO of Food on the Table, came to me with a problem. He had a product with a great value proposition and thousands of passionate customers. That wasn't the problem. The problem was activation.

As a bit of background, Food on the Table helps people plan meals for their families around what is on sale in their local grocery stores. The team defined an activated user as someone who made it through all the steps of the first-time user experience: selecting a grocery store, indicating food preferences, picking recipes, and printing a grocery list.

Users who made it through activation loved the product, but too many first-time users were getting lost and never making it all the way to the end.

Identifying The Problem

More than any startup I've worked with, Food on the Table embraces the lean startup methodology. They release early and often. They get tons of feedback from their users. And, most importantly, they measure and A/B test absolutely everything.

Because of their dedication to metrics, they knew all the details of their registration funnel and subsequent user journey. This meant that they knew exactly how many people weren't finishing activation, and they knew that number was higher than they wanted.
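
To make that concrete, here is a minimal sketch - in JavaScript, and emphatically not Food on the Table's actual code - of the kind of funnel report a team like this lives by: count how many users reach each activation step and compute the drop-off between steps. The step names follow the activation flow described above; the numbers are made up.

    // Illustrative only: the per-step counts below are hypothetical.
    const funnelSteps = [
      { name: 'Signed up',              users: 1000 },
      { name: 'Selected grocery store', users: 820 },
      { name: 'Set food preferences',   users: 610 },
      { name: 'Picked recipes',         users: 430 },
      { name: 'Printed grocery list',   users: 310 }  // "activated"
    ];

    funnelSteps.forEach((step, i) => {
      const prev = i === 0 ? step.users : funnelSteps[i - 1].users;
      const stepRate = (100 * step.users / prev).toFixed(1);
      const overall = (100 * step.users / funnelSteps[0].users).toFixed(1);
      console.log(step.name + ': ' + step.users + ' users (' +
                  stepRate + '% of previous step, ' + overall + '% overall)');
    });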

Unfortunately, they fell into a trap that far too many startups fall into at some point: they tried to measure their way out of the problem. They would look at a metric, spot a problem, come up with an idea for how to fix it, release a change, and test it. But the needle wasn't moving.

After a couple of months, Manuel had a realization. The team had always been dedicated to listening to users. But as they added new features, their conversations with users had changed - they became more narrowly focused on new features and whether each individual change was usable and useful. Somewhere along the way, they'd stopped observing the entire user experience, from end to end. This didn't last very long - maybe a month or two, but it was long enough to cause problems.

As soon as he realized what had happened, Manuel went back to talking directly to users about their overall experiences rather than just doing targeted usability tests, and within a few hours he knew what had gone wrong. Even though the new features were great in isolation, they were making the overall interface too complicated. New users were simply getting lost on their way to activation.

Now that they had a general sense of why the problem was happening, Manuel decided he needed a designer to identify the exact pain points and come up with a way to simplify the interface without losing any of the features.

Key Takeaways:
  • Don't try to measure your way out of a problem. Metrics do a great job of telling you what your problem is, but only listening to and observing your users can tell you why they're having trouble.
  • When you're moving fast enough, a product can become confusing in a surprisingly short amount of time. Make sure you're regularly observing the user experience.
  • Adding a new feature can be useful, but it can also clutter up an interface. Good design helps you offer more functionality with less complexity.
Getting an Overview of the Product

When I first came on board, the team had several different experiments going, including a couple of different competing flows. I needed to get a quick overview of the entire user experience in order to understand what was working and what wasn't.

Of course, the best way to do that is to watch new and current customers use the product. In the old days, I would have recruited test participants, brought them into an office, and run usability sessions. It would have taken a couple of weeks.

Not anymore! I scheduled UserTesting.com sessions, making sure that I got participants in all the main branches of the experiments. Within a few hours, I had a dozen 15-minute videos of people using the product. The entire process, including analysis, took about one full day.

Meanwhile, we set up several remote sessions with current users and used GoToMeeting to run fast observational sessions in order to understand the experience of active users. That took another day.

Key Takeaway: Get feedback fast. Online tools like GoToMeeting and UserTesting.com (and about a hundred others) can help you understand the real user experience quickly and cheaply.

Low Hanging Fruit

Once we had a good idea of the major pain points, we decided to split the design changes into two parts: fixing low-hanging fruit and making larger, structural changes to the flow. Obviously, we weren't going to let engineering sit on their hands while we made major design changes.

The most important reason to do this was that some of the biggest problems for users were easy to fix technically and could be solved with almost no design input whatsoever.

For example, in one unsuccessful branch of a test, users saw a button that would allow them to add a recipe to a meal plan. When user test participants within the office pressed the button, it very quickly added the recipe to the meal plan, and they had no problem understanding it. But when we observed users pressing the button on their own computers, with normal home broadband connections, it took a few seconds to register the click.

Of course, this meant that users would click the button over and over, since they were getting no feedback. When the script returned, the user would often have added the recipe to their meal plan several times, which wasn't what they meant to do.

This was, by all accounts, a bad user experience. Why wasn't it caught earlier?

Well, as is the case at most software companies, the computers and bandwidth in the office were much better than the typical user's setup, so nobody saw the problem until we watched actual users in their natural environments.

What was the fix? We put in a "wait" spinner and disabled the button while the script was processing. It took literally minutes to implement and delivered a statistically significant improvement in the performance of that branch of the experiment.
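
For anyone curious what that fix looks like in practice, here is a minimal sketch in plain JavaScript. The endpoint, element IDs, and markup are hypothetical - this is the general pattern, not Food on the Table's actual code: disable the button and show a spinner the moment it is clicked, and only restore it once the server responds.

    // Assumes markup along these lines (hypothetical):
    //   <button id="add-recipe" data-recipe-id="42">Add to meal plan</button>
    //   <span id="spinner" hidden>Adding...</span>
    const button = document.getElementById('add-recipe');
    const spinner = document.getElementById('spinner');

    button.addEventListener('click', async () => {
      button.disabled = true;   // block repeat clicks immediately
      spinner.hidden = false;   // instant feedback, even on a slow connection
      try {
        await fetch('/meal-plan/add', {          // hypothetical endpoint
          method: 'POST',
          headers: { 'Content-Type': 'application/json' },
          body: JSON.stringify({ recipeId: button.dataset.recipeId })
        });
      } finally {
        spinner.hidden = true;  // restore the button once the request settles
        button.disabled = false;
      }
    });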

(Image: Giving immediate feedback drastically reduced user error.)

Manuel told me that, immediately after that experience, the team added a very old, slow computer to the office and recently caught a nasty problem that could add 40 seconds to page load times. Needless to say, all usability testing within the office is now done on the slowest machine.

Key Takeaways:
  • Sometimes big user problems don't require big solutions.
  • To truly understand what your user is experiencing, you have to understand the user's environment.
  • Sometimes an entire branch of an experiment can be killed by one tiny bug. If your metrics are surprising, do some qualitative research to figure out why!
A Redesign

While the engineering team worked on the low-hanging fruit, we started the redesign. But we didn't just chuck everything out. We started from the current design and iterated. We identified a few critical areas that were making the experience confusing and fixed those.

For example, we started with the observation that people were doing ok for the first couple of screens, but then they were getting confused about what they were supposed to do next. A simple "Step" counter at the top of each page and very clear, obvious "Next" and "Back" buttons told users where they were and what they should do next.

Users also claimed to want more freedom to select their recipes, but they were quickly overwhelmed by the enormous number of options. So we put in a simple, engaging way to choose from recommended recipes while still letting users access the full collection with a single click.

(Image: Users were confused by how to change their meal plan.)

(Image: Recommended recipe carousels made choosing a meal plan fun and easy to understand.)

One common problem was that users asked for a couple of features that were actually already in the product. The features themselves were very useful and well-designed; they just weren't discoverable enough. By changing the location of these features, we made them more obvious to people.

Most importantly, we didn't just jump to Photoshop mockups of the design. Instead, we created several early sketches before moving to interactive wireframes, which we tested and iterated on with current users. In this case, I created the interactive wireframes in HTML and JavaScript. While they were all grayscale with no visual design, they worked. Users could perform the most important actions in them, like stepping through the application, adding meals to their meal plan, and editing recipes. This made participants feel like they were using an actual product so that they could comment not just on the look and feel but on the actual interactions.
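
To give a feel for what "interactive wireframe" means here, the toy example below - a rough illustration, not the actual prototype - shows how little HTML and JavaScript it takes to let participants click through a grayscale flow with a step counter and Next/Back buttons. The step names echo the activation flow described earlier; everything else is invented for illustration.

    <!-- Grayscale, no visual design: each step is a plain div. -->
    <div id="step-counter"></div>
    <div class="step">Choose your grocery store</div>
    <div class="step">Tell us what your family likes</div>
    <div class="step">Pick this week's recipes</div>
    <div class="step">Print your grocery list</div>
    <button id="back">Back</button>
    <button id="next">Next</button>

    <script>
      const steps = document.querySelectorAll('.step');
      let current = 0;

      function render() {
        steps.forEach((el, i) => {
          el.style.display = (i === current) ? 'block' : 'none';
        });
        document.getElementById('step-counter').textContent =
          'Step ' + (current + 1) + ' of ' + steps.length;
        document.getElementById('back').disabled = (current === 0);
        document.getElementById('next').disabled = (current === steps.length - 1);
      }

      document.getElementById('next').onclick = () => { current++; render(); };
      document.getElementById('back').onclick = () => { current--; render(); };
      render();
    </script>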

By the end of the iterations and tests, every single one of the users liked the new version better than the old, and we had a very good idea why.

Did we make it perfect? No. Perfection takes an awful lot of time and too often fails to be perfect for the intended users.

Instead, we identified several areas we'd like to optimize and iterate on going forward. But we also decided that it was better to release a very good version and continue improving it, rather than aim for absolute perfection and never get it out the door.

The redesign removed all of the major pain points that we'd identified in the testing and created a much simpler, more engaging interface that would allow the team to add features going forward. It improved the user experience and set the stage for lots more iteration and experimentation in the future. In fact, the team currently has several more exciting experiments running!

Key Takeaways:
  • Interactive prototypes and iterative testing let you improve the design quickly before you ever get to the coding stage.
  • Targeting only the confusing parts of the interface for redesign reduces the number of things you need to rebuild and helps make both design and development faster.
  • Lean design is about improving the user experience iteratively! Fixing the biggest user problems first means getting an improved experience to users quickly and optimizing later based on feedback and metrics.
The Metrics

Like any good lean startup, we released the new design in an A/B test with new users. We had a feeling it would be better, but we needed to know whether we were right. We also wanted to make sure there weren't any small problems we'd overlooked that might have big consequences.

After the test had run for about six weeks and reached a few thousand new users, we had our statistically significant answer: a 77% increase in the number of new users who made it all the way through activation.
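
For readers who want to see how a result like this gets checked, here is a sketch of a standard two-proportion z-test in JavaScript. The counts below are hypothetical - chosen only so the lift lands in the same ballpark - and are not the actual Food on the Table data.

    // Did the redesign's activation rate beat the control's by more than chance?
    function twoProportionZTest(controlConv, controlN, variantConv, variantN) {
      const p1 = controlConv / controlN;
      const p2 = variantConv / variantN;
      const pooled = (controlConv + variantConv) / (controlN + variantN);
      const se = Math.sqrt(pooled * (1 - pooled) * (1 / controlN + 1 / variantN));
      return { p1: p1, p2: p2, lift: (p2 - p1) / p1, z: (p2 - p1) / se };
    }

    // Hypothetical counts: 2,000 new users in each branch of the test.
    const r = twoProportionZTest(260, 2000, 460, 2000);
    console.log('Control activated:  ' + (100 * r.p1).toFixed(1) + '%');
    console.log('Redesign activated: ' + (100 * r.p2).toFixed(1) + '%');
    console.log('Lift: ' + (100 * r.lift).toFixed(0) + '%, z = ' + r.z.toFixed(1));
    // |z| > 1.96 corresponds to p < 0.05 (two-tailed), i.e. a statistically significant difference.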

My entire involvement with the project - the research, design, and usability testing - came to just under 90 hours spread over about six weeks.

Key Takeaway: Design - even a major redesign - can be part of an agile, lean startup environment if it's done efficiently, with a lot of iteration and customer involvement.



Laura Klein has been working in Silicon Valley as both an engineer and a UX professional for the last 15 years. She currently consults with lean startups to help them make their products easier to use. She frequently blogs about design, usability, metrics, and product management at Users Know. You can follow her on Twitter at @lauraklein.

Monday, January 10, 2011

Why we need to teach MBA’s about modern entrepreneurship (and what Harvard Business School is doing about it)

This week, the startup tribe from Harvard Business School is making their annual trek to Silicon Valley. They’ll hear from a variety of experts and get to see many startups firsthand. While they’re here, I’m sure they’ll be received warmly. But I don’t think they’d be too happy to hear what gets said about them when they leave the room.

It’s a common refrain around Silicon Valley to disparage the role of MBA’s in entrepreneurship. We still have some collective scar tissue: the idea conjures up the hordes of dot-com hopefuls that descended on VC’s and angel investors with little more than a business plan. Even today, I routinely hear MBA’s advised to remove the degree from their resume when applying to startup jobs. We also cavalierly lampoon the “suits” that get brought in to run startups after the founders are fired by callous VC’s. I used to call them “executioners” because their attempts to “execute” the standard general management playbook in the startup context of extreme uncertainty usually led to disaster.

But this anti-MBA bias harms the entrepreneurship ecosystem and limits opportunities even for startups that don’t employ a single MBA.

I spend a lot of time thinking and writing about what a theory of entrepreneurship needs to do. One of my beliefs is that such a theory should be addressed to entrepreneurs and the people who hold them accountable. It’s this latter criterion that I think tends to get overlooked in most writing about entrepreneurs.

Who holds entrepreneurs accountable? One obvious answer is startup investors: angels and venture capitalists - many of whom have MBA’s. They have the ultimate responsibility in a venture-backed company of deciding whether to fire the founders and bring in “professional management.” I believe one of the reasons that this often goes badly is that the investors have nothing more than standard general management tools for evaluating the startup’s success. For example, when a startup is making more money month after month, I often hear VC’s say something to the effect of, “well, you can’t argue with success.” I hear similar things for pre-revenue startups that are on schedule, on time, and on budget - even though they are busy building something that nobody wants. (In fact, this crisis was at the heart of Steve Blank’s original impetus to develop customer development as an alternative set of milestones to use for startups.)

I also frequently see the reverse. General management is supposed to be orderly, “strategic” and mostly calm. I have seen founders replaced because their style seemed too chaotic, even though what’s really happening is that they are operating at startup speed. Pivots are disorienting, but necessary. Except when a startup is busy “pivoting” all the time, running around in circles. That's a waste of time. How do you tell the difference? General management doesn't have a good answer. As a result, founders get removed prematurely or even entirely exiled.

And there’s a worse fate, too. Far too many venture-backed companies correctly conclude that the founders are being counter-productive during their high-growth phase. But neither they nor the founders can think of anything for them to do: the MBA’s think they should be purged and the founders think they should be in charge. It rarely occurs to either party that the founders could be busy inventing the next disruptive product, but doing so in a non-disruptive way. As a result, many of these companies get caught by surprise when their optimization activities hit a plateau and competitors who have a true portfolio approach race past them. Facebook vs. Myspace, anyone?

These dynamics harm startups at all stages, because they pressure founders to engage in “success theatre” – trying to make themselves look successful by general management standards through a focus on vanity metrics, product milestones, and whiz-bang demos. In entrepreneur circles, this goes by the innocuous-sounding term “board management” – but it is actually a terrible waste of energy and focus.

And it’s not just investors who are having trouble. The tech press loves to celebrate when startups get acquired and founders make lots of money. But who do you think makes those acquisition decisions? In many cases, it’s MBA’s inside large companies. Far too many of these acquisitions go nowhere. Companies are acquired for hundreds of millions only to be worth tens of millions a few years later. I know plenty of people who have nothing but disdain for the “suits” who make these decisions. These entrepreneurs are only too happy to engage in success theatre, cash out, and move on to the next venture. What happens after they leave is of little concern. If it goes badly, we all know it’s the behemoth that will get the blame.

But that’s damaging, too. The core moral tenet of capitalism is that voluntary exchange is the ultimate test of whether an economic transaction is value creating or value destroying. Fraud, deception, and dishonesty undermine this moral calculus. Vanity metrics and success theatre are in a moral grey area; they mask the fact that some of our industry’s most “successful” ventures are actually value destroying. Even worse, this breeds tremendous mistrust.

Back in my IMVU days, I met with an investment banker who wanted to help us understand the M&A landscape. When we walked him through our business model and results, he explained that our options would be limited, because the main M&A community was feeling burned by companies like ours. He explained, “the big companies have bought a bunch of companies with a client download or engagement business model, like Xfire and Neopets, and haven’t seen the returns they had hoped for. Therefore, if you want to sell IMVU one day, you’ll need to abandon your business model, even though it’s generating a lot of revenue per customer. You’d be better off just selling ads.” There’s a lot wrong with that statement, not the least of which is that “client download” is not a business model and that he was advocating that we abandon a real business model for an unprofitable one. But I believe he was representing his clients’ honest beliefs; they were confused about how to classify startups. When they get burned by a wave of acquisitions that turn out to be value destroying, the rest of us suffer from the resulting lack of the liquidity that is so important to the startup ecosystem.

It’s tempting to blame the MBA’s for all these mistakes. But I think the root cause of these mistakes is the fact that most MBA’s are not adequately educated about entrepreneurship. The problem afflicts general managers who try to innovate within big companies, too. In fact, I hope longtime readers will recognize these as the exact same mistakes that afflict us as entrepreneurs when we try to hold ourselves and our teams accountable. Are we making progress? Is what I’m working on creating value? What should I work on next? These are the enduring startup questions.

That’s why I am so excited about a new course that is debuting this year at Harvard Business School.

I’ll be honest. I get a lot of raised eyebrows here in Silicon Valley when people hear that I am an Entrepreneur in Residence at Harvard Business School – and not just because I continue to be physically resident here in San Francisco. Some people here are skeptical of spending time with MBA’s. One tweet read, “well, if HBS is investing in the lean startup we know it has jumped the shark.”

I’m glad to be able to do something about the divide between entrepreneurs and MBA's. As I’ve written before, academia has an important role to play in the new startup movement that’s afoot: documenting case studies, identifying emerging new practices, and doing original research to validate or refute theories. Harvard is, in some ways, a late entrant to this project; important work has already taken place at Stanford in both the business and engineering schools, and Berkeley's Haas School of Business was the place where Steve Blank first taught customer development.

Professor Tom Eisenmann is pioneering a novel approach at HBS, with a new class called Launching Technology Ventures:
Launching Technology Ventures uses case studies to examine lean startup practices. LTV focuses on the integration of marketing and engineering functions and emphasizes implementation rather than strategy formulation issues. The course does not examine financing options or the composition of founding teams. LTV draws heavily on the ideas of Eric Ries, Steve Blank, Marty Cagan and other practitioners.
The class debuts in a few weeks. I’m excited to be joining him for one of the first sessions on January 26 (for those who don’t attend HBS, I’ll also be doing a public event at the Boston Lean Startup Circle). Prof. Eisenmann has just published a series of blog posts describing the class, and they include three new resources that will benefit entrepreneurs and MBA’s everywhere:

  1. A comprehensive reading list of the best sources on the new entrepreneurship: books, blogs, ebooks - everything. It’s the best such compilation I’ve ever seen. It should be considered the definitive (and mandatory) reading list for anyone who wants to understand modern entrepreneurship.
  2. A series of new business cases documenting Lean Startup principles in a variety of companies. Many will be familiar to those who attended sllconf: IMVU, Dropbox, Aardvark. But he’s also documented cases of these principles applied in a variety of other situations. For example, for years Steve Blank and I have been saying things like, “if you’re working on a cure for cancer, these rules don’t apply.” But it turns out that this is not exactly true, as a company called Predictive Biosciences discovered. It’s a phenomenal case.
  3. A compilation of tools and skills. The class also requires that the students explore – and master – some specific tools and techniques that entrepreneurs use every day. You cannot learn about entrepreneurship just at the blackboard. You have to get your hands dirty. This post outlines the skills that MBA’s who want to understand entrepreneurship have to master. For example, in my work with MBA’s who want to go into software entrepreneurship, I often encourage them to learn how to program. Managing a software company without knowing how to program is just as ridiculous as being a factory manager who refuses to walk the factory floor. If you don’t understand how work is done, you cannot manage it. The same logic applies across a whole host of skills, which is why this compilation is so important - and why it'll be a long-term project. (I'm hoping we'll get this hosted on a wiki soon.)
This class is just the beginning. If you’re an MBA student, I strongly encourage you to take a class like this one – even if you’re not planning to become an entrepreneur yourself. First of all, if you pursue a career in general management, it’s extremely likely you’re going to find yourself managing entrepreneurs. Second of all, you never know – you might find yourself facing extreme uncertainty in your job, no matter what industry or company you work for. If you’re at Harvard or one of the schools where Steve Blank teaches, consider yourself lucky. If you’re not, advocate for your program to try something new.

And for my friends in Silicon Valley, I hope you get excited about this, too. Imagine a world where MBA’s are actually helpful and not just “suits” that get in our way. Wouldn’t that be a cool hack?