Continual improvement and striving for excellence should be at the heart of every organisation. As a provider of business improvement training and a proponent of Quality systems and tools, we wanted to ensure that we weren’t taking for granted our own improvement system and looked for ways to better understand our customers and what makes them happy.
Over the years we’ve become accustomed to collecting paper feedback forms at the end of a training course. These asked delegates questions on all areas of their experience, from the pre-course booking process to course delivery and learning outcomes.
We asked lots of questions and gathered lots of data. The challenges we faced were knowing where to focus our efforts and addressing the fact that delegates had to hand the forms back to a tutor whom they may have wished to give feedback about.
The Ultimate Question
A colleague here suggested the Net Promoter Score (NPS) system as a widely adopted and effective feedback process.
The landmark book The Ultimate Question 2.0 written by Fred Reichheld and Rob Markey sees NPS as having “developed into a full-fledged management system with an ethos that rivals Six Sigma in its power”. A bold statement, so we decided to explore further.
It describes NPS as a customer-centric ‘closed-loop’ feedback system, meaning you act upon what you learn and, where possible, seek further feedback and report back to customers with the actions that you take. By asking customers how likely they are to recommend your products or services on a numerical scale (usually 0-10), you can quickly identify your promoters (those scoring 9 or 10, happy to recommend you), your detractors (those scoring 0 to 6, who may recommend against you), and passives (those scoring 7 or 8, neither promoters nor detractors).
In contrast with other generic online feedback surveys, NPS can identify trends with minimal effort from customers and help focus attention on the areas that have the biggest impact on their experience.
Subtracting the percentage of detractors from the percentage of promoters gives you a Net Promoter Score. This in theory becomes your high-level measure of the satisfaction of your customers and indicates the likelihood that customers will recommend you to others.
Any score above 0 is considered good, above 50 is excellent and over 70 is thought of as world class.
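As a sketch, the calculation described above can be expressed in a few lines of Python. The 9-10 promoter and 0-6 detractor bands are the standard NPS definitions, and the example scores are purely illustrative:

```python
def nps(scores):
    """Compute a Net Promoter Score from 0-10 survey responses.

    Promoters score 9-10 and detractors 0-6 (standard NPS bands);
    the result is the percentage of promoters minus the percentage
    of detractors, so it ranges from -100 to +100.
    """
    if not scores:
        raise ValueError("no responses to score")
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

# Ten illustrative responses: 5 promoters, 3 passives, 2 detractors
example = [10, 9, 9, 10, 9, 8, 7, 8, 5, 6]
print(nps(example))  # 50% promoters - 20% detractors = 30.0, a "good" score
```

Note that the passives drop out of the arithmetic entirely; they lower the score only by diluting the percentage of promoters.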
After further research and deciding that an NPS feedback system was worth a go, we had to choose which questions to ask our customers. We agreed on the following:
- On a scale of 0 to 10, how likely is it that you would recommend Bywater to a friend or colleague?
- What most exceeded your expectations?
- What fell short of your expectations?
The survey was tested following a small number of training courses to begin with. Having been used to a 100% response rate from the volume and variety of questions on the paper forms, we were understandably cautious.
We wanted to know how many delegates would reply to a 3-question email, what level of detail the answers would provide, and how we’d collate, investigate and report the findings back to the business.
To help with our analysis, we grouped the answers into the following categories:
- The ability to apply what you have learnt
- Course content
- Booking & course administration
- Something else
Initially we gave delegates the option to select ‘nothing’, but too many delegates that had positive experiences chose this when asked what fell short, giving us no meaningful insight into the weakest areas of their overall experience. We quickly removed that option!
Finally, we went to great lengths to explain to our tutors the importance and purpose of this new NPS feedback system. Not only did we want to ensure that they didn’t feel threatened by the changes; we also knew that they were critical to the success of the project. Unless the tutors positioned the survey correctly to delegates, the likelihood of it being completed when it arrived in their inboxes days after the training course would diminish greatly.
The feedback so far has certainly been more detailed than the paper forms provided and it has helped us focus on the issues that matter.
Every ‘detractor’ identified by the survey is given a call within one working day, to gain more information about what went wrong and to see if we can address their concerns. This is important, not only for us but for the individual involved, as we want to demonstrate that their feedback is valued and that it will be acted upon.
A piece of feedback received early in our NPS implementation helped us to refine our post-course processes. A delegate struggled to say how likely they were to recommend us because they wanted to know their exam result. A failed exam might mean they wouldn’t recommend us, even though the rest of their experience was good. Because of this, we now send the survey to delegates after results have been issued and have greatly reduced the time it takes to send delegates their results and certificates.
We’ve also identified a trend of slightly more negative comments relating to the ‘venue’ when the training course takes place at the client’s premises, rather than at our public training venues.
As a result, we challenged the idea that ‘in-house’ training had to mean being delivered at a client’s premises or a venue they’ve chosen and we now offer customers the option to use our public training venues to host their event.
Each piece of feedback still needs investigating, by performing our own root cause analysis and thinking for ourselves. This isn’t to suggest that customer feedback is incorrect; far from it. But the solution to an underlying cause may lie elsewhere.
A Henry Ford quote, repeated near the back of Fred Reichheld’s book, puts it this way:
“If I had asked my customers what they really wanted in the future, they would have told me, ‘A faster horse.’”
We’re still on a journey with NPS, and in the short term our objective is to increase response rates so that we have more opportunities for conversations with our customers and can act on what we learn.
Everyone involved should remember that the goal of NPS is continual improvement. I encourage anyone reading this who receives a survey to take the opportunity to have their say.
This article was written by David Cole, a Director at Bywater.