Keeping surveys short is really important, but all too often they are long, tedious and poorly designed. This blog explains why survey length matters so much, and how to combat the urge to add more questions.
We’ve all been there. You’re invited to respond to a survey and decide to take part, filled with optimism that it’ll be a quick and straightforward experience. But then, oh no, you realise it’s a never-ending nightmare: question after question, page after page. You wonder if the thing will ever end. Then you lose the will and abandon it (or make up answers and skip through as fast as possible).
We can all give examples of really long and poorly designed customer satisfaction surveys (a large, well known budget hotel chain is particularly bad for this). And we would collectively agree they are a bad thing, and a really bad experience for the customer. But we still see them, sent out by companies and professionals who really ought to know better.
So what’s going on here? I can’t believe anyone thinks “I know, let’s create the longest, most tedious and annoying survey ever and send it out to our beloved customers, that’ll really be a good thing to do”. They too will have experienced long, painful surveys, so why are they inflicting this on their own customers?
It starts out with all the right intentions. We want to ask our customers what they think of us, and what we could do to improve. This can be achieved in a small handful of questions. Even particular aspects of the relationship can be covered with a small number of additional questions.
But then some sort of crazy curiosity seems to take hold, and common sense vanishes. “If we’re engaging with our customers to ask them something, we can’t let the opportunity pass us by — we must ask them about X, Y and Z too”. Even worse, other colleagues and departments start to get involved, and see the survey as a quest for data to serve their own needs. They hit the survey owner with questions they want answers to, even if they have no intention of doing anything with the results, and even when there’s no benefit to the customer.
Quite quickly, survey length and the customer experience are forgotten. People obsess over the survey and the demand for data, when they should be obsessing over the customer: what it’ll be like for them when they respond, and what can be done to make them happier.
The end result is a bloated mess. Too many topics are shoehorned in, and the customer is left bewildered, wondering where the survey is going next. Worse still is the use of conditional logic, where customers are penalised for answering a question a particular way and then bombarded with additional follow-up questions. I’m sure I’m not the only person who has clicked back and changed an answer just to avoid these extras.
As it becomes apparent to customers that the survey is longer than they expected, it’s not uncommon for them to select random answers without reading the questions, just in a bid to reach the end. Either that or they close the survey altogether.
The outcomes are not good. The customer is left feeling disappointed or annoyed that the survey was painful and took longer than expected, which makes them less likely to participate in future and so damages response rates. And any issues or cries for help go ignored, because (ironically) the extra questions generate so much data that reading and responding appropriately to every piece of feedback feels like an impossible job.
And not only is it a bad experience for customers, but the company is left with a poor response rate, inaccurate data and/or data they are unlikely to do anything with.
Just to be clear, I’m not saying all market research is bad. Engaging with customers to find out what they think on different topics is generally a good thing to do (although there are pitfalls to avoid).
However, it’s important that companies consider the method they use and its impact on the customer. If they want to measure and improve satisfaction, a very short customer satisfaction survey is the best approach. If they want to find out what people think of their website, for example, that’s fine, but they should consider qualitative research or a separate standalone survey.
So when you design your next customer satisfaction survey, resist the devil on your shoulder telling you to add more questions and think instead about how you can make the survey a great experience for customers (which not surprisingly is a much better way to increase your response rates and the quality of your data). And should other departments come knocking because they want to ‘add a couple of things to your survey’, be brave and push back. Not to be awkward, but because it’s the right thing for the customer, and they should always remain front and centre.
Read our helpful guide on what VoC questions to ask and how many here.
At CustomerSure, we’re experts on building engaging customer satisfaction surveys and obtaining useful, high quality feedback. Check out our customer satisfaction survey software or get in touch to find out more.
Ready to elevate your VoC programme and ensure success using our expert guide? Learn the three foundations required for success.
Connect with a CX expert who’ll help determine your current VoC programme maturity level and provide a 3-step action plan to improve.