Embracing red metrics: putting the focus on actionable analytics
No one wants to present a negative report. It can feel intimidating. It’s much easier to focus on positive metrics, and present data that, at least on the surface, makes it look like ‘we’ are performing well.
This is what I call watermelon syndrome - presenting a green scorecard to your stakeholders that falls apart under questioning, revealing the red metrics inside.
This watermelon style of reporting serves no one. Relying on vanity metrics, setting the wrong benchmarks, or selectively reporting on green metrics makes it very difficult to understand what’s really going on.
In this post, I’ll break down the value of negative CX metrics, and look at how you can set the right measures to ensure your voice of customer program delivers for your company.
Watermelon syndrome - looking beyond green scorecards
Negative metrics are an incredibly valuable tool when looking to gain insight from your CX and operational data. But we’re conditioned to think that success equals a green scorecard.
Over the years, I’ve found myself saying more often to my teams that it's okay to present reports, scorecards, decks or dashboards that are red if it's the reality. All data in customer experience is a gift, as this is what customers are really thinking and experiencing. If the story it tells is negative, be receptive to it!
Don’t present your NPS as green because you’ve moved it a little bit, when actually it’s really red. The red is valuable because it’s what you can take action on as a company.
Presenting red metrics can feel intimidating, so here are a few tips:
- Build a narrative around your red numbers - be ready to communicate the context.
- Present from the customer's perspective, not the organization's. For example: "the customer was on hold for 6 minutes, with no contact from our teams after they had completed their account verification."
- Anticipate questions from your stakeholders. What are the benefits of each red number? What actions do you recommend based on this data?
- Be prepared to get granular. I prefer to spend 80% of my presentation talking about why our numbers are red, and more importantly, what we're going to do about it.
- Focus on looking forwards, and learning from the past.
If you only have numbers on a dashboard, you're not going to be able to provide the context to build a compelling narrative. You need tools in place to integrate operational data, demographic data and more - additional context that completes the picture.
For example, say your core metric is NPS. You currently have 40% detractors. This is, or should be seen as, a red metric, but reporting on it by itself is useless. You can't take action on this number alone.
Now add context. Perhaps all of your detractors are from large enterprises. Maybe they all have a specific product add on. Or perhaps the majority spoke with customer support team members that have less than three months tenure. Your narrative is dependent on how you link operational data to your customer experience data. You are looking for causation, not just correlation! Doing this also allows you to build actionable next steps.
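As a rough sketch of what linking operational data to CX data might look like, here's a small pandas example. The data and column names (`company_size`, `agent_tenure_months`) are entirely hypothetical - your own fields will depend on the systems you integrate:

```python
import pandas as pd

# Hypothetical survey responses already joined to operational data.
# All values and column names are illustrative, not from any real tool.
responses = pd.DataFrame({
    "nps_score":           [2, 3, 9, 10, 1, 8, 4, 9],
    "company_size":        ["enterprise", "enterprise", "smb", "smb",
                            "enterprise", "smb", "enterprise", "smb"],
    "agent_tenure_months": [2, 1, 14, 20, 2, 18, 3, 24],
})

# Standard NPS convention: 0-6 detractor, 7-8 passive, 9-10 promoter.
responses["is_detractor"] = responses["nps_score"] <= 6

# Detractor rate by customer segment...
by_segment = responses.groupby("company_size")["is_detractor"].mean()

# ...and by whether the support agent had under three months' tenure.
responses["new_agent"] = responses["agent_tenure_months"] < 3
by_tenure = responses.groupby("new_agent")["is_detractor"].mean()

print(by_segment)
print(by_tenure)
```

In this made-up sample, every enterprise respondent is a detractor and every call handled by an agent with under three months' tenure produced one - exactly the kind of pattern that turns a bare "40% detractors" into an actionable narrative.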
Remember, the metrics you're reporting on are already in the rearview mirror. You know what recent actions you've taken as a company or team, and your metrics directly reflect those actions. Use your learnings to look forward through the windscreen, to inform how you're going to avoid another quarter of the same metrics. You also need to keep doing what worked, and make that the norm.
How are you moving from amber to green, or from red to green? If your metrics are green all the way through, how are you going to keep them that way?
Whether your numbers are green, amber or red, the narrative needs to be there. It’s up to leadership to foster psychological safety around whatever your data says, so you feel free to report the reality - and actually do something about it.
Setting the right measures
So how can you make sure you don’t fall into the watermelon trap? It starts with setting the right measures. This should be led by respective team leaders and stakeholders, with input from the CX and insights team.
First, know what you want to do with the data. You can’t just be looking to prove yourself right, or track arbitrary measures. There needs to be a data strategy in place that can be validated.
Think of a conventional contact center, where a service level agreement is in effect.
The contact center leader can report to senior leadership that they are meeting all of the specified metrics in the SLA. Their metrics are green!
But how was the SLA devised? Was the service level designed for the business or for the customer? Is the approach inside out, or outside in?
An inside out metric could be to answer 50% of calls within five minutes. The measure has been set because the contact center is resourced to achieve it.
Now, half of your customers waiting up to five minutes to be served is bad. But the other half are waiting more than five minutes, which is even worse! It still looks green on the contact center scorecard, because they're achieving the measure that was set.
An outside in view is the customer’s perspective. They expect their call to be answered within 30 seconds. Not surprisingly, this is driving NPS down - so you end up with green operational metrics, and an NPS that is not industry leading or acceptable. It doesn’t hold up under scrutiny.
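The gap between the two views can be made concrete with a few lines of Python. The wait times below are made-up numbers purely for illustration:

```python
# Illustrative wait times (in seconds) for ten answered calls.
wait_seconds = [20, 45, 90, 180, 250, 290, 320, 400, 540, 700]

# Inside-out SLA: share of calls answered within five minutes (300s).
inside_out = sum(w <= 300 for w in wait_seconds) / len(wait_seconds)

# Outside-in view: share answered within the 30 seconds customers expect.
outside_in = sum(w <= 30 for w in wait_seconds) / len(wait_seconds)

print(f"Met 5-minute SLA: {inside_out:.0%}")        # prints "Met 5-minute SLA: 60%"
print(f"Met 30s expectation: {outside_in:.0%}")     # prints "Met 30s expectation: 10%"
```

The same calls score 60% against the internally set threshold (green, if the target is 50%) and 10% against what the customer actually expects. Same data, very different story.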
How to validate your measures
To validate the measures set, try the ‘five whys’ test. Can you and your stakeholders get five whys down into the data?
Using the call center example again, the call center leader reports that the service level is green.
The first why would be “why is it green?” This one’s easy to answer - it’s because 50% of calls were answered within five minutes.
The second why is “why is 50% of customers the right percentage?”
The third why, “why is five minutes the right number?”
The fourth, “so why do we have this measure in place?”
And the fifth, “so why do you believe we have a green service level?”
It can be really hard to get frontline leaders and teams to go past the first why of what a measured number means to them. Very often they've created a number that sounds reasonable, or chosen one they’re confident can be met.
Leaders need to work with their teams to define realistic metrics that will drive change and better outcomes (across customers, employees and financials), rather than set arbitrary numbers to report on that are not actionable.
If your NPS is not moving upwards, then these measures come under fire. Here’s where data analysts can add context, to validate or invalidate the measure set.
Analysts can answer:
- What is the profile of passive customers compared to detractors?
- Are customers commenting on call wait times?
- What percentage of our customer base is affected? Are any other issues commonly mentioned in conjunction with call wait times?
- Will improving call wait times impact NPS? How much?
Even if a measure stands up to the ‘five whys’, it doesn’t mean it’s set in stone. Ongoing data analysis can show when a measure needs review, to further improve customer satisfaction and increase your NPS.
Presenting negative data is intimidating. It can feel like you’re failing, which is why we’re geared towards highlighting the green metrics amidst a sea of red.
But it’s important to remember that CX success comes from using all of your data, especially the red metrics that highlight customer issues and pain points. After all, how can you fix what's broken if you don't know it exists? Or more importantly, aren't willing to act on it?
The next time you're looking at your metrics dashboard, take a step back and view the whole picture. Do you know the questions your data needs to answer? Do you know what you want to do with the data? Start here.
About the author:
Scott Downing is a Global Employee and Customer Experience strategist, and an advocate for Continuous Improvement. He has experience across the SaaS, Telco and Financial Service sectors in multiple regions and countries. He uses Human Centric Design methods in his work.
Recently, Scott helped a tech unicorn kick off its CX strategy and transformation program to ensure continued success. He has also led multiple international teams for large organisations like Telstra and Intuit. In these roles Scott was renowned for focusing on customer support and digital omnichannel launches and optimization, as well as driving a focus on quality and people development to move the needle.
Scott is originally from the UK and has been in Australia for 15 years. He is the proud Winner of the Customer Service Executive of the Year in Australia and Internationally.
You can get in touch with Scott at firstname.lastname@example.org