NPS Is Cilantro

I’m often asked, “If you could choose one metric to measure customer experience, which one would you choose?” It’s always an interesting question, given that life rarely offers such a narrow choice. I feel like they’re channeling Tolkien and seeking the one metric to rule them all.

Not wanting to disappoint, here is my recommendation for the one best metric for customer experience:

None.

That’s right. If you can only choose one, choose none.

If your goal is to understand customer experience and you can only look at one thing, focus on what your customers are telling you with their words and their actions. Read customer feedback. Have regular conversations with your front line. Build customer focus groups. Absorb the sentiment of your social media.

Don’t ever assume you can discern what your customers are thinking from one metric alone. The customers’ actual voices matter more than any number.

All cheekiness aside, metrics are a wonderful asset to help your company understand the overall customer experience. But more important than choosing specific customer experience metrics is how you build a recipe of metrics that combines with what your customers tell you through other means.

Below, you’ll find seven customer experience metrics you should consider tracking, split into three categories: basic, next-level, and intriguing.

Basic customer experience metrics

1. Net Promoter Score (NPS)

I am now going to make the most controversial statement regarding CX metrics: You should use NPS as a CX metric. (I see a group of you picking up your torches and pitchforks.)

I’d now like to make another controversial statement regarding CX metrics: You should not use NPS as a CX metric. (One group sets down their torches which are then immediately picked up by the other group.)

I find it intriguing how passionate CX experts get about NPS. But first, let’s talk about cilantro.

There are people who love cilantro and won’t touch the guac if cilantro isn’t present. There are others who think cilantro tastes like soap and question any individual who would even walk near it in a grocery store. Finally, there are people who enjoy it but don’t seem to care when it’s skipped in their salsa.

NPS is cilantro.

The reality is that NPS, like many other indicators of customer experience, is a fine ingredient when used in the right amount in the right recipe. If your company isn’t a fan of the ingredient, that’s fine too.

I recommend NPS because people are familiar with it. There’s something special about offering a personal recommendation, so asking a customer if they would recommend your company to friends or family often causes them to think about their experience in a deeper way.

The most common way to collect NPS is through Voice of the Customer (VOC) tools, usually email, text, or phone surveys. These surveys ask the NPS question and, if done right, also give the customer an opportunity to provide an open-ended response. (That’s where the real value lies.)

The market for NPS tools is filled with options. This blog post purposely stays away from recommending particular vendors, but you’ll find everything from inexpensive, basic-needs tools to the industry leaders in product capability (and also complexity).

NPS differs from the classic Customer Satisfaction Score in how it’s calculated. Responses on the 0-10 scale are separated into Promoters (9-10), Passives (7-8), and Detractors (0-6). You subtract the percentage of Detractors from the percentage of Promoters, so the score ranges from -100 to 100.
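
To make the arithmetic concrete, here is a minimal sketch in Python; the function name and the sample responses are illustrative only, not taken from any particular VOC tool:

```python
def nps(scores):
    """Compute Net Promoter Score from a list of 0-10 survey responses."""
    total = len(scores)
    promoters = sum(1 for s in scores if s >= 9)    # 9-10
    detractors = sum(1 for s in scores if s <= 6)   # 0-6; everything else is a Passive
    # NPS = % Promoters minus % Detractors, so it ranges from -100 to 100
    return round(100 * (promoters - detractors) / total, 1)

# Example: 4 Promoters, 4 Passives, and 2 Detractors out of 10 responses
print(nps([10, 9, 9, 10, 8, 7, 8, 7, 3, 6]))  # 20.0
```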

With Promoters limited to the top two responses, you can see how the calculation makes it “harder” to get a high numeric score. However, the absolute number matters less than understanding how you compare to other companies and industries and, more importantly, how you trend over time.

A company should be more interested in its NPS trend from one quarter to the next than in how its score compares to the industry average for a particular month. It’s helpful to track your Promoter, Passive, and Detractor trend lines separately to see where you can apply your company’s focus to improve the overall experience.

Use this metric to point you to areas where you’re doing well in the customers’ eyes and where you need improvement. But don’t just slap the score on a monthly executive dashboard and clap when it goes up and cry when it goes down. That’s just “Survey and Score.”

Instead, use NPS to “Listen and Act.” NPS can be a foundational piece of a true VOC program where you restore client relationships, improve your company’s delivery of experience (i.e., ops and processes), and take feedback to help coach, celebrate, and manage performance.

2. Customer Satisfaction Score (CSAT)

In the U.S., college football finishes its season with a series of “bowl” games, which lead up to the final national championship. While these (much like CX metrics) have now proliferated to over 40 different bowl games, the Rose Bowl in Pasadena, California holds the moniker of “Granddaddy of Them All” as it is the oldest currently operating bowl game.

Customer satisfaction score (CSAT) is the Rose Bowl of CX metrics. Less controversial than NPS, CSAT simply asks a customer how satisfied they are, typically on a scale from 1-5. Many CX experts view CSAT as dated, but I disagree. It’s one more ingredient you can ask your customers about.

However, I do not recommend using CSAT and NPS together. When I’ve asked customers both questions, I’ve rarely found that the answers reveal a meaningful difference in the overall customer experience.

The argument against CSAT often comes down to semantics. Those advising against using it suggest that merely satisfying a customer sets the bar too low, positing that satisfied customers will not generate the same lifetime value and loyalty as a Promoter from the NPS scale.

The same VOC tools mentioned above can gather CSAT. Most of us have seen satisfaction ratings collected in other settings, some odder than others.

[Image: airport bathroom satisfaction rating kiosk]

While I understand the idea behind it, I can’t bring myself to provide a satisfaction rating on my way out of an airport bathroom.

One suggestion: While I’m not voting in a bathroom, I do like the image above as I’m a big fan of excluding the “neutral” option in the CSAT question. Instead of 1-5, offering 1-4 forces the customer to decide if they were satisfied or not.
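
If you want a single CSAT number to trend over time, one common convention (a convention I’m assuming here, not something the scale itself requires) is to report the percentage of respondents who picked a “satisfied” rating. Here is a minimal sketch using the 1-4 scale just described:

```python
def csat_percent(responses, satisfied_threshold=3):
    """Percentage of respondents choosing a 'satisfied' rating (3 or 4 on a 1-4 scale)."""
    satisfied = sum(1 for r in responses if r >= satisfied_threshold)
    return round(100 * satisfied / len(responses), 1)

# Example: 7 of 10 respondents chose a 3 or a 4
print(csat_percent([4, 4, 3, 2, 3, 4, 1, 3, 2, 3]))  # 70.0
```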

3. Sales/Revenue

I won’t spend much time on this one. Let me simply offer that a truly basic CX metric is “Did your customers spend money with you?” If they did (even if begrudgingly), you delivered something right.

But don’t ever rely just on this metric. Today’s “frustrated but willing to spend” customer is tomorrow’s ex-customer as they move to an upstart competitor. But fundamentally, using sales and revenue as a CX metric helps ground your CX initiatives to business fundamentals.

Next-level customer experience metrics

If you stop with the metrics above, you’ll have a nice foundation for understanding your customer experience. But going to that next level using the CX metrics below helps unlock additional insights.

4. Customer Effort Score (CES)

So much of what makes up a customer’s perception of your company stems from how easy (or how hard) it is to work with you. I can be delighted by a wonderful experience, but if I had to move heaven and earth to make it come together, I’m unlikely to repeat it.

While there are ways to derive customer effort (I’ll mention one below), note that I said “perception” above. So ask the customer, “How easy is it to do business with us?” on a scale from 1-10. That lets them determine their own rating, rather than you assuming what’s easy or hard.

As with every metric mentioned here, the real value lies in the open-ended responses. I also suggest asking, “Why did you choose that answer?” after asking for the score. A much smaller percentage of customers will respond, but the insight you gain helps drive your ability to create great experiences going forward.
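
As a sketch of what that pairing might look like in practice (the data shapes and example comments below are hypothetical), keep the ease rating and the “why” verbatim together so the score never travels without its context:

```python
from statistics import mean

# Hypothetical CES responses: the 1-10 ease rating plus the optional "why" verbatim
responses = [
    {"score": 9, "why": "The portal let me reorder in two clicks."},
    {"score": 4, "why": "I had to call twice to get an invoice corrected."},
    {"score": 8, "why": None},  # many customers skip the open-ended question
]

avg_effort = mean(r["score"] for r in responses)
verbatims = [r["why"] for r in responses if r["why"]]

print(f"Average CES: {avg_effort:.1f}")           # Average CES: 7.0
print(f"Verbatims to review: {len(verbatims)}")   # Verbatims to review: 2
```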