Note: This article was originally posted in 2012. Best Buy no longer uses this survey, but this poor design is still used by companies that insist on extra-long surveys.
How do you know when a company’s on the ropes? Some observers watch cash flow. Others look at turnover. Me? I look at how a company treats its customers. When a company’s customer experience starts to drop, it’s time to sell the stock. I’m afraid that may have happened at Best Buy, especially when I look at their new customer satisfaction survey.
Customer satisfaction surveys are critical to creating your customer experience. A great survey puts your customer at the center of your customer experience design, allowing you to learn and improve as you go. But this only works when you design the survey from a perspective of customer respect. If your survey design assumes your customers aren’t paying attention, why bother? In the past, Best Buy’s culture was centered on the customer experience. But their recent update to their customer satisfaction survey shows that at least one group thinks their customers are unworthy of respect.
What do you think when you read this survey question?
I visited Best Buy for Black Friday and wanted to share my subpar experience (2 hours in line AFTER I entered the store!) on their customer satisfaction survey. Although I didn’t “credit” them by name, Best Buy’s previous survey made it into my customer satisfaction survey hall of shame (https://heartofthecustomer.com/3-principles/). In that post, I argued for three customer experience survey principles.
Best Buy’s previous survey had over 50 required questions, including more than a dozen on areas that had nothing to do with my customer experience (question 30: “I cannot live without the Internet”). I was pleased to discover that their new survey improved on principles 2 and 3, focusing more tightly on my customer experience.
But then I hit the “Please select the number 4 below” question.
This is a survey design trick to ensure respondents are paying attention. If the answer is not 4, you ignore their results. And it works. But why resort to tricking your customers in the first place?
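For readers who analyze their own survey data, here is a minimal sketch of how that kind of attention-check filter typically works. The data, column names (respondent_id, attention_check), and scoring are hypothetical, not Best Buy’s.

```python
import pandas as pd

# Hypothetical raw survey responses; "attention_check" holds each respondent's
# answer to the "Please select the number 4 below" question.
responses = pd.DataFrame({
    "respondent_id": [101, 102, 103, 104],
    "attention_check": [4, 5, 4, 3],
    "overall_satisfaction": [5, 5, 2, 4],
})

# Discard any respondent who failed the attention check before analysis.
valid = responses[responses["attention_check"] == 4]

print(f"Kept {len(valid)} of {len(responses)} responses")
print("Average satisfaction (valid respondents only):",
      valid["overall_satisfaction"].mean())
```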
This lack of respect for the Best Buy customer is apparent throughout the survey. It still fails on principle #1: it is 24 pages long! I didn’t count the questions this time, but there are easily 50. Even worse, there was a link asking if I would answer even more questions!
Where did this go so wrong? It seems the approach was to ask everybody for survey questions. When the list kept growing, somebody said, “Wow, this is really long. We need to do something to make sure people are paying attention, and not just selecting 5 for every question.” So they added the trick question.
Rather than resorting to tricks, doesn’t it make more sense to question your approach in the first place? A truly customer-centric approach questions the survey, not the customer.
When you respect your customer, your surveys stay targeted on the main topic: in this case, the customer experience in their stores. We can debate the proper survey length, but few would argue for 24 screens of questions.
It’s sad, because Best Buy introduced the customer-centricity concept to retail. But that’s what happens in times of turmoil: teams focus more on accomplishing the task at hand (“We need a survey. Let’s get some questions.”) than on the real reason for conducting a customer satisfaction survey in the first place, which is to serve the customer.
Customer satisfaction surveys are not about collecting data. They’re about creating a great customer experience. Unfortunately, it appears that Best Buy has forgotten this.