Saturday, November 22, 2008

Net Promoter Score: an operational tool to measure customer satisfaction

I've mentioned Net Promoter Score (NPS) in a few previous posts, but haven't had a chance to describe it in detail yet. It is an essential lean startup tool that combines seemingly irreconcilable attributes: it provides operational, actionable, real-time feedback that is truly representative of your customers' experience as a whole. It does it all by asking your customers just one magic question.

In this post I'll talk about why NPS is needed, how it works, and show you how to get started with it. I'll also reveal the Net Promoter Score for this blog, based on the data you've given me so far.

How can you measure customer satisfaction?
The usual methods for collecting data about customers all have drawbacks. Doing in-depth customer research, with long questionnaires and detailed demographic and psychographic breakdowns, is very helpful for long-range planning, interaction design and, most importantly, creating customer archetypes. But it's not immediately actionable, and it's far too slow to be a regular part of your decision loop.

At the other extreme, there's the classic A/B split-test, which provides nearly instantaneous feedback on customer adoption of any given feature. If your process for creating split-tests is extremely light (for example, it requires only one line of code), you can build a culture of lightweight experimentation that allows you to audition many different ideas, and see what works. But split-tests also have their drawbacks. They can't give you a holistic view, because they only tell you how your customers reacted to that specific test.

You could conduct an in-person usability test, which is very useful for getting a view of how actual people perceive the totality of your product. But that, too, is limited, because you are relying on a very small sample, from which you can only extrapolate broad trends. A major usability problem is probably experienced similarly by all people, but the absence of such a defect doesn't tell you much about how well you are doing.

Net Promoter Score
NPS is a methodology that comes out of the service industry. It involves using a simple tracking survey to constantly get feedback from active customers. It is described in detail by Fred Reichheld in his book The Ultimate Question: Driving Good Profits and True Growth. The tracking survey asks one simple question: how likely are you to recommend Product X to a friend or colleague? The answer is then put through a formula to give you a single overall score that tells you how well you are doing at satisfying your customers. Both the question and the formula are the result of a lot of research claiming that this methodology can predict the success of companies over the long term.

There's a lot of controversy surrounding NPS in the customer research community, and I don't want to recapitulate it here. I think it's important to acknowledge, though, that lots of smart people don't agree with the specific question that NPS asks, or the specific formula used to calculate the score. For most startups, though, I think these objections can safely be ignored, because there is absolutely no controversy about the core idea that a regular and simple tracking survey can give you customer insight.

Don't let the perfect be the enemy of the good. If you don't like the NPS question or scoring system, feel free to use your own. I think any reasonably neutral approach will give you valuable data. Still, if you're open to it, I recommend you give NPS a try. It's certainly worked for me.

How to get started with NPS
For those who want to follow the NPS methodology, I will walk you through how to integrate it into your company: how to design the survey, how to collect the answers, and how to calculate your score. Because the book is chock-full of examples from older industries, I will focus on my experience integrating NPS into an online service, although it works equally well if your primary contact with customers is through a different channel, such as the telephone.

Designing the survey
The NPS question itself (again, "How likely are you to recommend X to a friend or colleague?") is usually asked on a 0-10 point scale. It's important to let people know that 10 represents "most likely" and 0 represents "least likely," but it's also important not to use words like promoter or detractor anywhere in the survey itself.

The hardest part about creating an NPS survey is resisting the urge to load it up with lots of questions. The more questions you ask, the lower your response rate, and the more you bias your results toward more-engaged customers. The whole goal of NPS is to get your promoters and your detractors alike to answer the question, and this requires that you not ask for too much of their time. Limit yourself to two questions: the official NPS question, and exactly one follow-up. The follow-up could be a different question on a 10-point scale, or just an open-ended question asking why they chose the rating that they did. Another possibility is to ask "If you are open to answering some follow-up questions, would you leave your phone number?" or other contact info. That would let you talk to some actual detractors, for example, and get a qualitative sense of what they are thinking.

For an online service, just host the survey on a webpage with as little branding or decoration as possible. Because you want to be able to produce real-time graphs and results, this is one circumstance where I recommend you build the survey yourself, versus using an off-the-shelf hosted survey tool. Just dump the results in a database as you get them, and let your reports calculate scores in real-time.
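
For illustration, here's a minimal sketch of what such a self-hosted survey endpoint could look like in Python. Flask and SQLite are my own illustrative choices, and the route name, form fields, and schema are assumptions, not a prescribed design:

```python
# Minimal sketch of a self-hosted NPS survey endpoint.
# Flask and SQLite are illustrative choices; the route, form
# fields, and schema are assumptions, not a prescribed design.
import sqlite3
import time

from flask import Flask, request

app = Flask(__name__)
DB = "nps.db"

def init_db():
    with sqlite3.connect(DB) as conn:
        conn.execute("""CREATE TABLE IF NOT EXISTS responses (
            invite_code TEXT PRIMARY KEY,  -- one response per invitation
            score       INTEGER NOT NULL,  -- the 0-10 answer
            comment     TEXT,              -- the single follow-up question
            created_at  REAL NOT NULL)""")

@app.route("/survey", methods=["POST"])
def record_response():
    score = int(request.form["score"])
    if not 0 <= score <= 10:
        return "invalid score", 400
    # A real version would also verify the invite code against the
    # outstanding invitations before accepting the response.
    with sqlite3.connect(DB) as conn:
        conn.execute("INSERT OR IGNORE INTO responses VALUES (?, ?, ?, ?)",
                     (request.form["invite_code"], score,
                      request.form.get("comment"), time.time()))
    return "thank you!"

if __name__ == "__main__":
    init_db()
    app.run()
```

Because the raw responses land in your own database, the real-time score graphs described below become a simple query away.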

Collecting the answers
Once you have the survey up and running, you need to design a program to have customers take it on a regular basis. Here's how I've set it up in the past. Pick a target number of customers to take the survey every day. Even if you have a very large community, I don't think this number needs to be higher than 100; even just 10 might be enough. Build a batch process (using Gearman, cron, or whatever you use for offline processing) whose job is to send out invites to the survey.

Use whatever communication channel you normally rely on for notifying your customers. Email is great; of course, at IMVU, we had our own internal notification system. Either way, have the process gradually ramp up the number of outstanding invitations throughout the day, stopping when it's achieved your target number of responses. This way, no matter what the response rate, you'll get a consistent amount of data. I also recommend that you give each invitation a unique code, so that you don't get random people taking the survey and biasing the results. I'd also recommend letting each invite expire, for the same reason.
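
Here's a sketch of what that invitation job might look like, reusing the schema from the sketch above. The invites table, the send_notification() hook, and the ramp-up rule are all assumptions for illustration:

```python
# Illustrative invitation job; run it periodically (e.g. hourly via cron).
# The invites table, send_notification(), and ramp-up rule are assumptions.
import sqlite3
import time
import uuid

DB = "nps.db"
DAILY_TARGET = 100        # responses we want each day
INVITE_TTL = 24 * 3600    # invitations expire after one day

def send_notification(user_id, code):
    # Placeholder: deliver via email or your in-product notification system.
    print(f"invite {code} sent to user {user_id}")

def send_invites(candidate_user_ids):
    now = time.time()
    with sqlite3.connect(DB) as conn:
        conn.execute("""CREATE TABLE IF NOT EXISTS invites (
            code TEXT PRIMARY KEY, user_id INTEGER NOT NULL,
            sent_at REAL NOT NULL, expires_at REAL NOT NULL)""")
        responses_today = conn.execute(
            "SELECT COUNT(*) FROM responses WHERE created_at > ?",
            (now - 86400,)).fetchone()[0]
        shortfall = DAILY_TARGET - responses_today
        # Ramp up gradually: send only a fraction of the shortfall per run,
        # so a high response rate doesn't overshoot the daily target.
        batch = max(0, shortfall // 4)
        for user_id in candidate_user_ids[:batch]:
            code = uuid.uuid4().hex  # unique code keeps out uninvited takers
            conn.execute("INSERT INTO invites VALUES (?, ?, ?, ?)",
                         (code, user_id, now, now + INVITE_TTL))
            send_notification(user_id, code)
```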

Choose the people to invite to the survey according to a consistent formula every day. I recommend a simple lottery among people who have used your product that same day. You want to catch people when their impression of your product is fresh - even a few days can be enough to invalidate their reactions. Don't worry about surveying churned customers; you need to use a different methodology to reach them. I also normally exclude anyone from being invited to take the survey more than once in any given time period (you can use a month, six months, anything you think is appropriate).
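
The lottery itself can be a simple random sample of today's active users, minus anyone invited within the exclusion window. A sketch, again under the assumptions above (the activity table is also assumed):

```python
# Illustrative daily lottery: random sample of today's active users,
# excluding anyone already invited within the exclusion window.
# The activity and invites tables are assumptions for this sketch.
import random
import sqlite3
import time

DB = "nps.db"
EXCLUSION_WINDOW = 180 * 86400   # e.g. at most one survey per six months

def pick_candidates(n):
    now = time.time()
    with sqlite3.connect(DB) as conn:
        active = {row[0] for row in conn.execute(
            "SELECT user_id FROM activity WHERE last_seen > ?",
            (now - 86400,))}
        recently_invited = {row[0] for row in conn.execute(
            "SELECT user_id FROM invites WHERE sent_at > ?",
            (now - EXCLUSION_WINDOW,))}
    eligible = sorted(active - recently_invited)
    return random.sample(eligible, min(n, len(eligible)))
```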

Calculate your score
Your NPS score is derived in three steps:
  1. Divide all responses into three buckets: promoters, detractors, and passives. Promoters are anyone who chose 9 or 10 on the "likely to recommend" scale, and detractors are those who chose any number from 0-6.
  2. Figure out the percentage of respondents that fall into the promoter and detractor buckets.
  3. Subtract your detractor percentage from your promoter percentage. The result is your score. Thus, NPS = P% - D%.
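In code, the whole calculation is only a few lines. Here's a sketch in Python (the function name is mine):

```python
# Sketch of the NPS formula: % promoters (9-10) minus % detractors (0-6).
def net_promoter_score(scores):
    """scores: iterable of 0-10 answers; returns NPS rounded to an integer."""
    scores = list(scores)
    if not scores:
        raise ValueError("no responses yet")
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

# Example: 47 promoters, 22 detractors, and 31 passives out of 100
# responses gives 47% - 22% = 25, the same score this blog reports below.
assert net_promoter_score([10] * 47 + [3] * 22 + [8] * 31) == 25
```
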
You can then compare your score to companies in other industries. Any positive score is good news, and a score higher than +50 is considered exceptional. Here are a few example scores taken from the official Net Promoter website:

  • Apple: 79
  • Adobe: 46
  • Google: 73
  • Barnes & Noble online: 74
  • American Express: 47
  • Verizon: 10
  • DIRECTV: 20

Of course, the most important thing to do with your NPS score is to track it on a regular basis. I used to look at two NPS-related graphs on a regular basis: the NPS score itself, and the response rate to the survey request. These numbers were remarkably stable over time, which, naturally, we didn't want to believe. In fact, there were some definite skeptics about whether they measured anything of value at all, since it is always dismaying to get data that says the changes you're making to your product are not affecting customer satisfaction one way or the other.

However, at IMVU one summer, we had a major catastrophe. We made some changes to our service that wound up alienating a large number of customers. Even worse, the way we chose to respond to this event was terrible, too. We clumsily gave our community the idea that we didn't take them seriously, and weren't interested in listening to their complaints. In other words, we committed the one cardinal sin of community management. Yikes.

It took us months to realize what we had done, and to eventually apologize and win back the trust of those customers we'd alienated. The whole episode cost us hundreds of thousands of dollars in lost revenue. In fact, it was the revenue trends that eventually alerted us to the magnitude of the problem. Unfortunately, revenue is a trailing indicator. Our response time to the crisis was much too slow, and as part of the post-mortem analysis of why, I took a look at the various metrics that all took a precipitous turn for the worse during that summer. Of everything we measured, it was Net Promoter Score that plunged first. It dropped to an all-time low, and stayed there for the entire duration of the crisis, while other metrics declined only gradually over time.

After that, we stopped being skeptical and started to pay very serious attention to changes in our NPS. In fact, I didn't consider the crisis resolved until our NPS rose back above its previous highs.

Calculating the NPS of Lessons Learned
I promised that I would reveal the NPS of this blog, which I recently took a snapshot of by offering a survey in a previous post. Here's how the responses break down, based on the first 100 people who answered the question:
  • Number of promoters: 47
  • Number of detractors: 22
  • NPS: 25
Now, I don't have any other blogs to compare this score to. Plus, the way I offered the survey was deeply flawed: I just put a link in a single post, I didn't target people specifically to take the survey, and the invite was impersonal. Still, all things considered, I'm pretty happy with the result. Of course, now that I've described the methodology in detail, I've probably poisoned the well for taking future unbiased samples. But that's a small price to pay for having the opportunity to share the magic of NPS.

I hope you'll find it useful. If you do, come on back and post a comment letting us all know how it turned out.



8 comments:

  1. Great, great post.

    A couple of additional thoughts:

    a. Build the response flow such that it involves minimal user interaction. Ideally a user should be able to respond with a single click (think links for 1-10 rather than radio buttons and a submit button). For these sorts of surveys, it can really improve response rates. For this reason, I think email is actually not such a great method unless your service relies on it heavily for its core functionality.

    b. NPS is a great tool to assess the value your service is providing to a certain customer type, but not necessarily its value to (what you consider) your overall customer base (i.e. it's good for climbing a mountain, not so much for selecting it). Depending on your service, it may make sense to design the invitations such that they hit all the user constituencies.

    c. Finally, avoid sending an invitation to the same user twice in close succession (space them out to at most once every six months). Especially early on, when the user base is small, build your sample size to reflect that.

  2. Eric,

    I love your blog and I've learned a lot from it. Anyway, given your interest and work in blogging, you might find this interesting. It's a site that does a Myers-Briggs analysis on bloggers based on what they write:

    http://www.typealyzer.com

    Quite fun and it makes sense that you come up as a scientist. If you're interested in how that compares to my little blog...
    http://tinyurl.com/AlignAssessment

    Keep up the great posts,

    Andy

  3. Nice post.

    I too believe that measuring a 'recommender' score is a terrific way to quickly manage customer loyalty. Unfortunately, too many companies today still don't do any measuring at all. One of the reasons that simple measures like NPS are successful is because they are, well, simple. Execs and managers are so busy that 'simple' really does = 'doable.'

    I'm in the loyalty business (full disclosure: Allegiance is my employer), and NPS is something that we help clients incorporate into their reporting and dashboards, along with other measurements. I regularly hear from clients that NPS is important, but often not enough. Personally, I think NPS is a good start, but adding just 2-3 other key measures really rounds out the view. Engagement level is perhaps the next most important measure to add.

    Overall, keep it simple to ensure it gets done, and remember that measuring something is better than doing nothing at all. I'm a big believer in increasing customer loyalty, especially during tough economic times. It's a great time to increase revenues without spending money on acquiring new customers.

  4. The Net Promoter score does identify the level of Loyalty that might exist in your customer base. But it does not give you the tools to improve your Customer Loyalty over the long run. Brookeside can do that.

  5. Hi Eric,

    I started off by reading a recent post of yours in regards to a cardinal sin of community management and ended up taking some time to poke through the rest of your blog, too, which is how I came across this post.

    I am intrigued by the discussion of tools used to measure customer satisfaction (and perhaps consumer trends and directions and overall feelings)...

    but I am honestly confused as to how it took a major loss of revenue for the realization to be made that -something- was not going well in the IMVU consumer base.

    Generally when there is unease in the IMVU userbase, the first place you will see it is in the forums and in particular, in the Content Creator forum, the Suggestions forum, and the Miscellany forum, or - if the issue is related to a new release of something or a change to the client or previewer or something - the Announcements and Bugs forum.

    The IMVU userbase is NOT quiet when it comes to expressing unease, as users frequently seek each other out for companionship, for someone to talk these issues over with, and for either confirmation or denial that the problem they are experiencing exists.

    But perhaps I am making an assumption and am talking about one thing while you are talking about another.

    I know you probably can't say exactly which issue you were referring to, but was this issue 'behind the doors' so to speak or was this a public issue that up-to-date IMVU-ers would have been familiar with (and thus actively uneasy about)?

    Thanks for your thoughts,

  6. Hello Eric,
    I've been poring through your blog since attending your event at MIT providing an overview of the core competencies of a "lean startup".
    Coming across your post about the "net promoter" score was a surprise, but I had to reach out and tell you that this was the inspiration for my new startup called SOCIALtality. The SOCIALtality scoring methodology is the Net Promoter score for the new digital age, measuring a company's use of social media to become more customer centric.
    I thoroughly appreciate your "giving back" to increase the likelihood for success for startups.
    Best,
    Wendy
    Founder & CEO, SOCIALtality

  7. Hi, one thing I'm not clear about: if the survey is not 100% about NPS, does it make sense to just include a simple question with three answers?

    "If you were asked to provide a report on company x, which one of the following statements is most likely to be included in your report?

    I would recommend x without hesitation
    I would recommend x with a few qualifications
    I would not recommend x"
