A Christmas Message to Our Customers

We’re coming to the end of another year and all in all, it’s been a good one for us here at Deep-Insight.

We are an Irish company – and proud of it – but our client base is international. Over the past 12 months, we have carried out customer and employee assessments in the UK, Netherlands, Poland and Australia as well as in our home market. That said, one thing that I have noticed in 2015 is a marked increase in activity from local companies. The Irish recovery is definitely under way.

NEW FACES

 


We’ve had a particularly busy year at Deep-Insight and as a consequence, there are a few new faces in the Cork office these days.

We’re delighted to welcome Jamie Jaggernauth, who is our latest addition to the Deep-Insight team.

Jamie hails from Trinidad and has worked in a variety of research roles in the Caribbean, UK and Ireland before joining Deep-Insight.

NEW CLIENTS

We also added a few new names to our client list during 2015, ranging from large well-established firms like the health insurer VitalityHealth in the UK to newer digital organisations like DoneDeal, which is Ireland’s biggest classifieds site (and which is part of the Norwegian-headquartered Schibsted Media Group).

In fact, DoneDeal celebrated its 10th birthday this year so a big happy birthday to John, Cathal, Kristian, Simon and the rest of the DoneDeal crew!

DEEP-INSIGHT’S OWN CUSTOMER ASSESSMENT

Earlier this month, we asked you what you thought of your relationship with us.

Now, I must admit I awaited these results with some trepidation. We think we deliver an excellent service to our clients, but it’s always slightly scary waiting to hear what people ACTUALLY say about us and the benefits of working with Deep-Insight. We do take your feedback personally, and it’s always a little nerve-wracking waiting for the results to come through.

Last year, we had a Customer Relationship Quality (CRQ) score of 5.2 and a Net Promoter Score (NPS) of 17%. I was a little disappointed with those scores last year as they were down from the scores we received on our previous assessment and I felt that we could – and should – have done better.

An NPS of 17% is above average (more on average and good Net Promoter Scores here) but frankly it’s not that much above average. We had significantly higher scores in the past, and we don’t see ourselves as a “slightly better than average” company. We used to be regarded by our clients as ‘Unique’ but in 2014 we dropped out of that zone. It’s a bit like a restaurant losing its Michelin Star – I was extremely keen to see if we could get back into the top bracket again this year.

So how did we do? How did you rate us?

2015 CLIENT FEEDBACK

 

This year, you gave us a Customer Relationship Quality (CRQ) score of 5.7 and a Net Promoter Score (NPS) of +37%.

I was a little stunned when the results came through as our NPS and CRQ scores had shot up dramatically. As many of you will be used to hearing me say at this stage, it’s quite a challenge to get your CRQ score to jump by more than 0.2 or your NPS to increase more than 10%. We had worked hard on a number of fronts over the past 12 months but the size of the improvement in scores still came as a surprise. So thank you for that vote of confidence in Deep-Insight – it really does mean a lot to us.

The other thing I’m particularly pleased with is the fact that we are back in the ‘Unique’ zone – that’s the light green box in the top right-hand corner of the graphic. To be seen as unique, a company has to be able to provide a solution that truly solves its customers’ problems, as well as providing an excellent experience for that client. That’s something only 10% of B2B companies achieve, so it’s nice to be able to claim that accolade again.

There’s still plenty for us to work on. We’re currently analysing each and every verbatim to figure out exactly how to improve our service even further. We will be sharing these results with you as early as we can in the New Year.

LOOKING FORWARD TO 2016

So there it is. 2015 is nearly over but we have some exciting things planned for next year.

Over the past few months, Rose Murphy has been talking to most of you about what you like and dislike about our current product offering. The feedback you have given to Rose, as well as the various suggestions you have made in this recent client assessment, will help us improve what we do and how we do it.

But for the moment, allow me to say a big thank you to each and every one of you for supporting us throughout 2015.

On a personal note, I’d also like to say a big thanks to the following (in no particular order other than alphabetical): Brian, Frank, Grainne, Jamie, Mark, Mary, Peter, Pim, Rose, Yvonne as well as to the rest of the wider Deep-Insight team who have helped to deliver a fantastic service to you – our clients – over the past 12 months.

Have a very peaceful Christmas and I look forward to seeing you all in the New Year,

John

Are you going to NPS me? Yes, I am!

This is the topic of a talk I’m giving this week at a conference in Melbourne. It is in response to another talk entitled “Are you going to NPS me? No I’m not” in which Dr Dave Stewart of Marketing Decision Analysis will be presenting the case that Net Promoter is a deeply flawed concept, and should be discarded by organisations that espouse customer advocacy. To be honest, Dave’s position is probably close to what I thought of the Net Promoter Score concept when it was first introduced by a pretty smart academic and business consultant called Fred Reichheld back in 2003. Reichheld’s basic premise was that you only need to ask one question in order to understand if a customer is going to stay loyal to you or not: “How likely are you to recommend us to a friend or colleague?”

Fred, being the excellent marketeer that he is, proclaimed the benefits of this Net Promoter Score (NPS) concept in respected publications like the Harvard Business Review and then in his own book The Ultimate Question, which came out in 2006, shortly after I took on the CEO role here at Deep-Insight. Since then, NPS has become very popular as a customer loyalty metric. However, it has also attracted some heavy criticism – in particular from one researcher called Tim Keiningham, who gave NPS a particularly scathing review, saying that he and his research team could find no evidence for the claims made by Reichheld. (It should be said that Keiningham worked for the market research company Ipsos, so his views may not be completely unbiased.)

At that time, my own view was that NPS was probably too simplistic a metric for business-to-business (B2B) companies. I also felt that Deep-Insight’s own customer methodology – which also included a ‘would you recommend’ question – was a much better fit for complex business relationships. And if I’m honest, there was an element of ‘Not Invented Here’ going on in our own organisation as well.

So we decided to ignore NPS.

But here’s the thing: our customers didn’t. When we ran customer feedback programmes for customers like Reed Elsevier and Atos in the UK, ABN AMRO in the Netherlands, Santander in Poland, and the Toll Group in Australia, they would all ask: “Can you add in the NPS question for us – we have to report the numbers back to headquarters?” Of course, being the good marketeers that we were, we duly obliged. However, we always gave the results back in a separate spreadsheet, so that it wouldn’t contaminate our own reports and our own wonderful methodology!

Roll the clock forward to 2013. NPS still hadn’t gone away. In fact it had become even more popular, particularly with large international companies where a simple understandable metric was needed to compare results across different divisions and geographical areas. And when I finally looked into it, I discovered that Deep-Insight had actually been gathering NPS data from customers across 86 different countries since 2006.

Around the same time we also did some research into our own database to find out what really drove loyalty and profitability in our clients. Now this is not an easy thing to do, as many of you who have tried will know. But where we had several years of customer feedback data, it was relatively straightforward to analyse how many of our clients’ B2B customers were still with them and, for those who had deliberately defected, to investigate whether that defection could have been predicted by a poor Net Promoter Score or by any of the metrics in our own CRQ methodology.

I have to say that the results were quite interesting. It transpired that while a low ‘Likelihood To Recommend’ was not the BEST predictor of customer defection, it was actually a pretty good one. Deep-Insight’s overall Customer Relationship Quality (CRQ) metric was a slightly better predictor. A poor Commitment score – one of the key components of CRQ – was the best predictor of whether a B2B client was going to defect to the competition or not.

So there we had it: NPS did actually work.

It worked not because it’s the BEST predictor of whether a client was going to defect, but because it’s a GOOD predictor, coupled with the fact that NPS has been embraced by some of the world’s leading organisations as an easy-to-use and internationally-accepted customer benchmark. At Deep-Insight, we may have come a little late to the party – we only incorporated the Net Promoter Score into our customer methodology in early-2014 – but we have found the combination of NPS and our own CRQ metrics works really well for our clients.

Now let’s go back to the cartoon at the top of the blog (and thank you Tom Fishburne for allowing us to use it). If there is a statistically purer methodology than NPS, surely you should use that instead?

The answer is simple: most senior executives aren’t interested in re-inventing the wheel. They are much more interested in taking the feedback from their clients and acting on it, so that they can protect and enhance the revenues they get from those clients.

So for those B2B executives who are wondering if NPS is the right customer metric for them or not, I would suggest that you’re asking the wrong question. What good CEOs and Sales Directors are asking these days is:

“If my Net Promoter Score is low or if I have a lot of Opponents and Stalkers as clients, what do I do?”

In fact, the really successful CEOs and Sales Directors are spending the time thinking about the challenges of putting a really effective customer experience (CX) programme in place, rather than worrying about the purity of the metrics. That’s what you should be doing too.

 

5 Things To Remember To Get Your Completion Rates Up

One of the questions we get asked a lot is: “What sort of completion rates do you guys normally get on an assessment?”

Well, the answer is that it depends on what sort of assessment you’re talking about – we provide feedback on relationships with customers, channel partners and suppliers, and the completion rates differ from one type of assessment to the next:

- For employee assessments, our typical completion rate is in excess of 90%.

- For corporate customer and channel partner assessments, it’s typically 35-40%.

- For supplier assessments, the average completion rate is somewhere in the middle: 60-70%.

The next question we get asked is “Is it really that high?”

Well, we mainly get asked that question in connection with customer assessments, as some of our clients think 35-40% sounds impressive. This is particularly the case when people compare our figures to the ones you might get on typical consumer surveys, where sometimes as few as 2% of consumers will bother to complete a questionnaire (Petchenik & Watermolen, 2011).

Remember that we are talking about existing, often long-standing, business-to-business (B2B) relationships – that’s what we do at Deep-Insight. We’re not a consumer research company. In fact, we’re not even a market research company, although we are often compared to firms like TNS or Gallup. We’re different. We look at – and assess – the quality of the relationships that large companies have with their biggest B2B clients. And if you think about it, why would good customers NOT want to provide feedback on their relationship with you, particularly if their account manager has convinced them that it’s an important part of their ongoing customer feedback process, and that their input is genuinely used to help improve the service given not just to them but to all clients?

The 5 pieces of advice I give to our clients are:

1. Spend Time Getting A Good Contact List Ready.

Most of our clients tell us they can pull together a list of key client contacts in a week. Two at the most. Our experience tells us that it takes at least 4-6 weeks to come up with a really good clean list of customer contacts who have a strong view of their relationship with our client. If the list isn’t compiled properly, we end up polling the views of people who really don’t have a strong view on the company, and who won’t be interested in responding.

2. Pre-Sell The Assessment To Customers.

One of our clients has been achieving customer completion rates in excess of 70% on a consistent basis for the past number of years. It does this because the CEO – together with the account managers – has managed to convince his key accounts that the 10-15 minutes they invest in providing feedback WILL result in a better service. “Tell me what’s wrong, and I promise we’ll do our best to fix it.”

3. Make Sure to Contact Customers While The Assessment Is Live.

We normally hold our assessments open for two weeks and we know from experience that if account managers have been properly briefed to mention the assessment in every conversation they have with a client during those two weeks, the completion rates will improve dramatically.

4. Manage The Campaign Smartly.

This is not rocket science, but you would be amazed at the number of companies that want to run assessments over school holiday periods, or during times of the year that coincide with the busiest period for their customers. Plan your launch dates in advance, and think about the timing for issuing reminders. We usually recommend launching a customer assessment on a Tuesday morning, with the final reminder going out on the Tuesday two weeks later. That means that even if somebody is out of the office for two weeks, they’ll still have an opportunity to provide feedback.
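The two-week schedule described above can be sketched in a few lines of Python. This is only an illustration of the timing logic, not a Deep-Insight tool; the mid-campaign reminder one week in is our own assumption, since the text only specifies the launch day and the final reminder.

```python
from datetime import date, timedelta

def campaign_dates(launch: date) -> dict:
    """Sketch of the recommended schedule: launch on a Tuesday morning,
    with the final reminder going out on the Tuesday two weeks later.
    The mid-campaign reminder is an assumed addition for illustration."""
    assert launch.weekday() == 1, "launch should be a Tuesday (Mon=0, Tue=1)"
    return {
        "launch": launch,
        "mid_reminder": launch + timedelta(days=7),    # assumed: one week in
        "final_reminder": launch + timedelta(days=14), # closing day, also a Tuesday
    }

schedule = campaign_dates(date(2016, 1, 12))  # 12 Jan 2016 was a Tuesday
print(schedule["final_reminder"])  # 2016-01-26
```

Anchoring both the launch and the final reminder on a Tuesday guarantees the two-week window the text describes, so even someone away from the office for a fortnight sees at least one message.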

5. Don’t Panic At The End of Week 1.

We normally see a flurry of activity during the first six or eight hours of a B2B campaign and typically the completion rate after Day 1 is about 8%. At the end of the first week (before we send out a first reminder) it’s often the case that the response rate hasn’t broken through the 10% barrier. This is not unusual. Completion rates will increase and a message in the final reminder that “This assessment is closing today” usually elicits a final flurry of responses!

As I said, a lot of this isn’t rocket science but it does require a bit of advance planning. If you do put the effort in up-front, you’ll see it rewarded in significantly higher completion rates.

What is a ‘Good’ B2B Net Promoter Score?

SO WHAT’S A GOOD NET PROMOTER SCORE* FOR A B2B COMPANY?

It’s a question we get asked a lot. Sometimes the question comes in a slightly different form: “What NPS target should we set for the company? 25% seems low, so maybe 50%? Or should we push the boat out and aim for 70%?”

Well, it all depends. On a number of different factors. As we mentioned in an earlier blog, it can even depend on factors such as whether your customers are American or European.

We can’t state often enough how crucial it is to understand how these various factors (we’ll discuss them in detail below) impact the overall Net Promoter Score you receive, as the way the score is calculated makes it incredibly sensitive to small changes in individual customer scores. Be aware of these factors when deciding on a realistic NPS figure to aim for.

HOW IS THE NET PROMOTER SCORE CALCULATED?

For the uninitiated, a company’s Net Promoter Score is based on the answers its customers give to a single question: “On a scale of 0 to 10, how likely are you to recommend Company X to a friend or colleague?” Customers who score 9 or 10 are called ‘Promoters’. Those who score 7 or 8 are ‘Passives’ while any customer who gives you a score of 6 or below is a ‘Detractor’. The actual NPS calculation is:

Net Promoter Score = The % of Promoters minus the % of Detractors

Theoretically, companies can have a Net Promoter Score ranging from -100% to +100%.
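The calculation above can be sketched in a few lines of Python. The customer scores here are invented purely for illustration:

```python
def nps(scores):
    """Net Promoter Score: % of Promoters (9-10) minus % of Detractors (0-6).
    Passives (7-8) count towards the total but towards neither group."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

# Ten customers: four Promoters, four Passives, two Detractors.
sample = [10, 9, 9, 9, 8, 8, 7, 7, 6, 5]
print(nps(sample))  # (4 - 2) / 10 -> +20.0

# Nudging a single Passive 8 up to a 9 lifts the score by a full 10 points,
# which is why the metric is so sensitive to small changes in individual scores.
shifted = [10, 9, 9, 9, 9, 8, 7, 7, 6, 5]
print(nps(shifted))  # (5 - 2) / 10 -> +30.0
```

Note how the Passives simply disappear from the numerator: a customer base full of 8s scores exactly zero, which matters for the discussion of European scoring habits below.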

 
Does NPS Work for B2B Companies
 

Most Europeans consider a score of 8 out of 10 as a pretty positive endorsement of any B2B product or service provider, but in the NPS world, a person who scores you 8 is a ‘Passive’ and therefore gets ignored when calculating the Net Promoter Score (see box above).

Here’s the thing. If you can persuade a few of your better customers to give you 9 instead of 8, then suddenly you’ve boosted your Promoter numbers significantly. We know more than a handful of account managers who carefully explain to their clients that 8/10 is of no value to them whatsoever and that if they appreciate the service they are getting they really do need to score 9 or 10. Sure, there’s always a little ‘gaming’ that goes on in client feedback forms, particularly when performance-related bonuses are dependent on the scores. However, we find it intriguing to see the level of ‘client education’ that account managers engage in when the annual NPS survey gets sent out!
 

What Factors Impact Your Net Promoter Score?

We said at the outset that the Net Promoter Score you achieve is dependent on a number of factors. So what are they?

1. Which geographical region do your customers come from?
We’ve covered this point in an earlier discussion with Professor Anne-Wil Harzing – Americans will score higher than Europeans – probably 10% higher and possibly even more.

2. Do you conduct NPS surveys by telephone or face-to-face or by email?
In the UK and Ireland, we don’t like giving bad news – certainly not in a face-to-face (F2F) discussion. Even if we’re talking over the phone, we tend to modify our answers to soften the blow if the feedback is negative. Result: scores are often inflated. In our experience, online assessments give more honest feedback but can result in scores that are at least 10% lower than in telephone or F2F surveys. This gap can be smaller in countries like the Netherlands and Australia where conversations and customer feedback can be more robust. It’s a cultural thing.

3. Is the survey confidential?
Back to the point about culture – it’s easier to give honest feedback if you have the choice of doing so confidentially, particularly if the customer experience has been negative and you have a harsh message to deliver to your service or product provider. Surveys that are not confidential tend to give a rosier picture of the relationship than those that are confidential.

4. Is there a governance structure in place to determine which clients (and which individuals in those client companies) are included in the survey?
At Deep-Insight, we advocate a census approach when it comes to customer feedback: every B2B customer above a certain size MUST be included in the assessment. No ifs or buts. Yet we are often amazed by the number of companies that allow exceptions such as “We’re at a very sensitive stage of discussions with Client X so we’re not going to include them on the list this year” or “We’ve just had a major delivery problem at Client Y – they certainly won’t appreciate us asking them now what they think of us”. In many cases, it’s more blatant – customers are excluded simply because everybody knows they are going to give poor feedback and pull down the overall scores. In some cases, it’s a little more subtle, particularly where it’s left to the account manager to decide which individuals to survey in a particular account. A proper governance structure is required to ensure ‘gaming’ is kept to a minimum and that the assessment process has credibility. If a company surveys its Top 100 accounts annually, senior management must be given the final say over which clients are added to or taken off the list. It’s not feasible to have the MD approve every single client, but at least make sure the MD understands which of the major accounts – and which individuals in those accounts – are to be included on the list.

5. Is the survey carried out by an independent third party, or is it an in-house survey?
In-house surveys can be cost-effective but suffer from a number of drawbacks that generally tend to inflate the scores. For starters, in-house surveys are rarely seen as confidential, and are more prone to ‘gaming’ than surveys that are run by an independent third party. We have seen cases where in-house surveys have been replaced by external providers and the NPS scores have dropped by a whopping 30% or more. Seriously, the differences are that significant.
 

So What Is a Good Score?

Now, coming back to the question of what constitutes a good Net Promoter Score in a B2B environment, here’s our take on it.

Despite the claims one hears at conferences and around the water cooler that “we achieved 52% in our last NPS survey” or “we should be setting the bar higher – the NPS target for next year is going to be 60%”, these kinds of scores are rarely, if ever, achieved. We’ve been collecting NPS data for B2B clients since 2006 and we have customer feedback from clients across 86 different countries. Our experience is that in a well-run, properly-governed, independent, confidential assessment, a Net Promoter Score of 50% or more is almost impossible to achieve. Think about it. To get 50%, you need a profile like the one below, where a significant majority of responses are 9 or 10 and most of the others are pretty close to that level. In Europe, that simply doesn’t happen.

Our experience of B2B assessments is that a Net Promoter Score of +30% is truly excellent, and that means you are seen as ‘Unique’ by your customers.

A Net Promoter Score of around +10% is par for the course – consider that an average score.

A negative NPS is not unusual – approximately one third of our B2B customers are in negative territory and one in ten of our clients score -30% or even lower.

In fairness, Deep-Insight’s customer base is predominantly European or Australian, so we also need to be careful about how we benchmark different divisions within the same company that are in different regions or markets.

In our opinion, the best benchmark – for a company, business unit or division – is last year’s score. If your NPS is higher this year than it was last year, and nothing else has changed, then you’re moving in the right direction. And if your NPS was positive last year, and is even more positive this year, happy days!
 

* Net Promoter® and NPS® are registered trademarks and Net Promoter SystemSM and Net Promoter ScoreSM are trademarks of Bain & Company, Satmetrix Systems and Fred Reichheld