Does your satisfaction survey show disrespect for customers?

Note: This article was originally posted in 2012. Best Buy no longer uses this survey, but this poor design is still used by companies that insist on extra-long surveys.

How do you know when a company’s on the ropes?  Some observers watch cash flow.  Others look at turnover.  Me?  I look at how a company treats its customers.  When a company’s customer experience starts to drop, it’s time to sell the stock. I’m afraid that may have happened at Best Buy, especially when I look at their new customer satisfaction survey.

Customer satisfaction surveys are critical to creating your customer experience.  A great survey puts your customer at the center of your experience design, allowing you to learn and improve as you go.  But this only works when you design the survey from a perspective of respect for the customer.  If your survey design assumes your customers aren't paying attention, why bother asking them at all?  In the past, Best Buy's culture was centered on the customer experience, but their recent update to their customer satisfaction survey shows that at least one group thinks their customers are unworthy of respect.

What do you think when you read this survey question?

[Screenshot from the survey: "Please select the number 4 below."]

Some possible answers:

  • Huh?
  • Is this a mistake?
  • Wow, Best Buy is using advanced market research techniques to ensure the quality of their data
  • This survey designer clearly assumes I’m a doofus (technical term), and wants to trick me into admitting it.

How to Survey

I visited Best Buy on Black Friday and wanted to share my subpar experience (two hours in line AFTER I entered the store!) through their customer satisfaction survey. Although I didn't "credit" them by name, Best Buy's previous survey made it into my customer satisfaction survey hall of shame (https://heartofthecustomer.com/3-principles/).  In that post, I argued for three customer experience survey principles:

  1. Make your survey short;
  2. If you ask a question, use it;
  3. Never ask a question when a query will do.

Best Buy’s previous survey had over 50 required questions, including more than a dozen on areas that had nothing to do with my customer experience (question 30: “I cannot live without the Internet”).  I was pleased to discover that their new survey improved on principles 2 and 3, focusing more tightly on my customer experience.

But then I hit the “Please select the number 4 below” question.

This is a survey design trick to ensure respondents are paying attention: if a respondent doesn’t select 4, you throw out their answers.  And it works.  But why resort to tricking your customers in the first place?
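For the analyst, the mechanics of this filter are trivial. Here is a minimal sketch of how the results of such a check might be screened, assuming the responses are exported to a table with a hypothetical attention_check column; the column names and the pandas approach are my own illustration, not anything Best Buy has published.

```python
import pandas as pd

# Hypothetical export of survey results; column names are illustrative only.
responses = pd.DataFrame({
    "respondent_id":        [101, 102, 103, 104],
    "overall_satisfaction": [5, 2, 4, 5],
    "attention_check":      [4, 5, 4, 4],  # answer to "Please select the number 4 below"
})

# The trick question in action: anyone who didn't select 4 is discarded.
valid = responses[responses["attention_check"] == 4]

print(f"Kept {len(valid)} of {len(responses)} responses")
```

The filter takes one line of code. The hard part, and the point of this post, is earning answers you don’t have to screen this way.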

This lack of respect for the Best Buy customer is apparent throughout the survey.  It still fails on principle #1: it is 24 pages long!  I didn’t count the questions this time, but there are easily 50.  Even worse, there was a link asking whether I would answer even more questions!

Where did this go so wrong?  It seems the approach was to ask everybody for survey questions. When the list kept growing, somebody said, “Wow, this is really long.  We need to do something to make sure people are paying attention and not just selecting 5 for every question.”  So they added the trick question.

Rather than resorting to tricks, doesn’t it make more sense to question your approach in the first place?  A truly customer-centric approach questions the survey, not the customer.

When you respect your customer, your surveys stay focused on the main topic: in this case, the customer experience in their stores.  We can debate the proper survey length, but few would argue for 24 screens of questions.

It’s sad, because Best Buy introduced the customer-centricity concept to retail.  But that’s what happens in times of turmoil: teams focus more on accomplishing the task at hand (“We need a survey. Let’s get some questions”) than on the reason for conducting a customer satisfaction survey in the first place, which is to serve the customer.

Customer satisfaction surveys are not about collecting data.  They’re about creating a great customer experience.  Unfortunately, it appears that Best Buy has forgotten this.

12 replies
  1. Donna says:

    I completely agree with using a survey to guide your customer experience rather than just to gather data.
    I too use three guiding principles when designing surveys for my customers.
    1. Every question has to have an owner of the responses
    2. Only ask questions you can take action on
    3. If you can find the information another way, don’t ask the customer

    • Jim Tincher says:

      Donna,

      I love your guidelines! #2 and #3 are very similar to mine, and while I don’t have an equivalent to your first rule, I definitely agree with it. You’ve probably experienced an issue similar to one I’ve encountered: you ask a question, you identify an issue, but nobody is willing or able to take action. Having an owner makes certain that action is taken. Good job!

  2. Richard R G Grate says:

    I recently was at a Best Buy store on PGA Blvd. Bad mistake. Every time I go there, I keep telling myself "never again," but this time I mean it.
    I was going to buy the Sony Blu-ray player BDP-S3700, and I was the only customer in that department. I walked over to two employees at the desk; one was out of uniform, eating his lunch, and the other was counting product in a box. I wanted to ask them a simple question, but they said they were too busy. I put the Blu-ray player back on the shelf, went to the Target store, and found the very same Blu-ray player for the same price.
    If you're going to buy a product from Best Buy for $500.00 or more, you might get some help, and I repeat, might. Every time I've been there, it's always the same service: none. I'm done with you, Best Buy.

    • Bill Peart says:

      Judging from my experiences at the Chico, Cal. BB, I would say that you received decent service. You received a reply, even if it was "we are too busy." (At least they were honest.)
      I was in BB Monday at 5:15 pm, Jan. 9 '17, looking to purchase an SSD for my computer. The website listed several available in store for immediate purchase.
      To my (expected) amazement, there were
      zero,
      zip,
      nada,
      just empty spaces.
      I asked the "Associate" where they might be.
      After a cursory glance, I received the much-to-be-expected one-word, grumbling, unintelligible grunt:
      "OUT"
      I tried to garner more information...
      ...and was met with a very indignant,
      "Do you see any on the shelf? We're out!!!"
      Talked to the "manager"; it went downhill from there.

      Interesting side note: when the manager checked store inventory on the computer, it said 3 were available for purchase.

  3. Peter says:

    You seriously don’t know how that type of question is used? Please research psychometrics and online data mining, and the use of computers and humans/organizations to skew company survey results.

    It is a basic reliability question to ensure that neither a computer nor a person has been instructed to fill the survey out on autopilot. How would it help Best Buy customers if, for instance, a child stumbled upon the survey and filled it out randomly? Yes, respondents should also know that 4 is 4, but there are a hundred other scenarios that make a question like that important, and generally there is a third-party company handling the survey questions and answers. If I were putting a survey of any importance online, I’d require it, but possibly make it less obvious.

    In the past I worked at Best Buy. I agree they should shorten the surveys, but I will say that at our store, survey results were read every day right before opening, with positives and negatives shared and call-outs for employees who were mentioned positively in surveys. Employees mentioned in a negative survey would be met with personally and asked about the situation, and if the complaint was valid, they would need to come up with a plan of action to assure their behavior would not cause more negative feedback. Our GM stressed that customer satisfaction was number one. They’d say that, yes, they like excellent revenue, but if customer satisfaction was high they would expect return customers, in contrast to a one-time purchase. They constantly reminded us we had to do something better than online sites and big-box stores, because customers have choices, and if we were not going to be outstanding, why should someone take the time to come into our store? This was my experience anyway.

    • Jim Tincher says:

      Peter, thanks for your comments. I’m definitely familiar with why this type of question is used. But it speaks to a lack of discipline: if your survey requires validation that respondents are still paying attention, that points to a failure in the process. This type of question is a cop-out that is only required when you can’t say no to a survey question.

      However, that in no way invalidates your last paragraph. Reviewing customer feedback with the employees involved is truly a best practice, and I like your description of how this was done.

      • George Greene says:

        As the survey is likely online, a reCAPTCHA-like test up front could handle the “I’m not a robot” fear while respecting the customer. I’m not a CX professional, but if your survey is so long that you need a question to be sure we’re still awake, then maybe your survey is too long. When calling a company, if I get too many phone tree levels, I just lay on the “0” or say “Representative.” When I get too many survey questions, I just bail on the survey.

  4. Tom Jackson (Jax) says:

    The phrase “too clever by half” comes to mind. I ran dozens and dozens of focus groups at BBY when I was a corporate manager there. They involved crafted questions, analysis, and then real follow-up conversations. It appears the new model relies on a one-size-fits-all platform that is about as refined as Survey Monkey. 50 questions?! By the fourth page of this, the average customer is so ticked off that your results themselves get skewed downward, even if they finish. It’s a bad plan. We used Gallup back then, for the Segment study. Go back to them.

    • Karl says:

      This article was written back in 2012. I’ve taken numerous surveys from BBY since then and have never seen these questions. I’d be willing to bet a year’s salary that this survey is decommissioned at this point.

      If you’re going to post slander on a company (one that seems to be doing very well), please make sure you’re posting current news – not news of the past.

      That said – I also appreciate your guidelines of survey design, even if this example is not one that is current.

      • Jim Tincher says:

        Karl,

        More than fair. We do put old posts on social media – especially those with good past engagement. While it’s true that the specific example is old, the guidelines still apply. I continually run across market researchers who defend questions such as “Please answer 2 to this question,” which is why we post it. But you’re right – the headline is derogatory. As such, I’ve changed the headline and added a note on the date of the original post. Thanks for calling me out on this!

        • Karl says:

          Thanks for the sincere and speedy response. Moreover, your courteous response has me intrigued enough to follow your work. Thanks for posting thought-provoking topics.

          -new follower karl

