The touchpoint NPS oxymoron - The likelihood to recommend a brand isn’t built on a single interaction
Traditionally, customer research and insights teams were tasked with capturing the voice of the customer, and generating meaningful insights to inform decision-making and ultimately improve CX.
Market research was the easiest way to gather customer feedback strategically, rather than listening to anecdotal stories from sales or contact centre agents. Amongst other tasks, this is still part of the responsibility of an insights team.
However, with technology and data analytics capabilities rapidly evolving over the past decade, we are now able to capture customer feedback straight from the horse’s mouth, through our customer operations teams: the place where customers get in touch with us to ask questions or get help resolving issues.
CRM and contact centre solutions were previously designed to fulfil a very operational purpose: enable customer interactions, respond to queries and (hopefully) resolve them. Getting reports or any meaningful insights from those systems was hard. Hard to impossible.
Luckily, new technologies are now available, and we have recognised the importance of customer-centricity and the value that lies hidden in customer feedback.
A large variety of CX software vendors have emerged, enabling organisations to listen to customer feedback without the help of traditional market research.
Most large organisations now have customer feedback programs, triggering a (hopefully) short survey to their customers after an interaction. Unfortunately, these can often be operationally driven, rather than strategically designed. It’s a bit like an IT team trying to design customer experiences; we have the technological capabilities to do the job, but not quite the skills needed to delight the customer.
It’s a similar situation when we look at how customer ops teams capture customer feedback. They’re in a great space: they’re so close to the customer that it’s easy to trigger a survey based on an interaction (thanks to new technologies). But we’re often missing the link between operations and strategy when it comes to customer feedback measures and the questions we’re asking.
NPS. We all know about NPS… The most widely used customer experience measure, but unfortunately also the most severely misunderstood and misused measure.
My favourite quote by Fred Reichheld from a Wall Street Journal article sums it up nicely:
“I had no idea how many people would mess with the score to bend it, to make it serve their selfish objectives”.
NPS is a great measure, for various reasons. It’s a great benchmark measure for overall business health, and extremely valuable and meaningful in a traditional market research environment. It is not, however, a great measure for contact centre interactions.
Imagine you’re having an issue you need help with. After trying multiple things, you’ve become upset and frustrated, and you have to get in touch with the organisation’s call centre. You navigate a long and confusing IVR system, sit on hold listening to elevator music until you finally reach a human being, just to be put on hold again and transferred to another operator…
We’ve all been there. And to top it off, you get asked to please fill in a short feedback survey. Instead of focusing on the experience you just had, how you felt, what the organisation could improve on to make things easier for you, etc. you get asked how likely you are to recommend that organisation…
Firstly, instead of being a customer-centric question, it’s business focused. And secondly, what information does the organisation actually capture here? Touchpoint NPS. An oxymoron.
NPS is a long-term brand health and loyalty measure. Not a touchpoint measure. The likelihood to recommend a brand isn’t built on a single interaction. It’s built over time, over multiple interactions and touchpoints, and even includes word of mouth.
You don’t decide in a single interaction with a call centre agent whether you would recommend a brand and its products and services. So what are we measuring here? We’re measuring how satisfied the customer was with the interaction they just had with a call centre agent.
So why do so many organisations use NPS for this? Because it’s the metric they know. They’ve heard of it, everyone does it, so they’re doing it too. It gives them a number to track. Not necessarily the right number, but a number.
If we look into customer experience measures in more detail, we quickly see that NPS isn’t quite the right question to ask in a contact centre environment. We bend it to be something it isn’t, particularly in the case of touchpoint NPS.
NPS, CSAT (Customer Satisfaction) and CES (Customer Effort Score) are the 3 most frequently used customer feedback measures. They are complementary in nature, as they measure slightly different aspects of the customer experience.
CSAT measures customer satisfaction with a product or service; CES measures the effort required when interacting with an organisation (a staff member or a digital channel); while NPS measures customer loyalty to an organisation.
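For readers who want the arithmetic behind these measures: NPS is calculated from 0–10 “likelihood to recommend” responses using the standard bands (promoters score 9–10, detractors 0–6, passives 7–8), and CSAT is commonly reported as the “top-two-box” share on a 1–5 satisfaction scale. A minimal sketch, with the caveat that the top-two-box CSAT convention is common but not universal, and the function names are illustrative:

```python
def nps(scores):
    """Net Promoter Score from 0-10 'likelihood to recommend' responses.

    Promoters score 9-10, detractors 0-6; passives (7-8) count only
    toward the total. NPS = % promoters - % detractors, so the result
    ranges from -100 to +100.
    """
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)


def csat(scores):
    """CSAT as the share of satisfied responses (4-5 on a 1-5 scale).

    This 'top-two-box' approach is a common convention, not a standard.
    """
    return 100 * sum(1 for s in scores if s >= 4) / len(scores)


# 10 responses: 4 promoters, 3 passives, 3 detractors -> NPS of 10.0
print(nps([10, 9, 9, 10, 8, 7, 7, 6, 5, 3]))  # 10.0
print(csat([5, 4, 3, 2, 5]))                  # 60.0
```

Note how a handful of passives can mask very different distributions behind the same NPS number, which is one reason the more specific CSAT and CES readings are useful alongside it.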
CSAT and CES should be used to explain the overall NPS number in more detail. While NPS is a rigid metric that measures overall brand health, CSAT and CES are more specific measures that can be adapted to a given situation, e.g. product reviews, service interactions with call centre agents, website interactions, store visits, etc.
NPS is a long-term measure, as customers don’t base a recommendation on a single interaction. A history of interactions, touchpoints, word of mouth, etc. form a brand perception and the likelihood for a person to recommend a brand. All those interactions and touchpoints that form a customer’s recommendation can be tracked and measured through CSAT and CES.
By gathering more specific CSAT and CES feedback, we are able to explain the overall NPS number in more detail. And customers feel more valued as we ask them a question relating to their experience, not a question that suits business needs.
So let’s stop blindly following the blind and start thinking about what it is that we’re actually trying to find out with our little feedback surveys!
Please join the discussion and share your thoughts in the comments, or you can reach me on my LinkedIn below.