Customer Experience Surveys & Feedback

Customer experience surveys are powerful tools that allow you to gather data about customers’ opinions and feedback about their personal experience with your brand. The insights gained from these surveys can be invaluable when used appropriately. They provide data that can help identify what matters most to your customers, reveal problem areas, and highlight opportunities for growth and development.

Creating an effective customer experience survey is about more than asking “how likely are you to recommend this business?” It’s about gathering key insights and, most importantly, showing customers you value their feedback by acting on the results.

Celebrating #CXDay

I hope you had a terrific CX Day! I really enjoyed the online content – if you didn’t get a chance to view it, I highly recommend going to www.CXDay.org to review it. Here in Minneapolis we had an amazing event, with over 80 participants learning what to do when your customers are tired of talking to you (survey fatigue).

Who Was There?

I posted earlier about local events at Wolters Kluwer and ShopHQ.  Today I have photos from two other celebrations – at UnitedHealthcare Medicare and Retirement and Allianz Life.

One of the UnitedHealthcare activities was a booth in their commons area. They engaged employees in quick conversations about grounding decisions in the voice of the customer (VOC) and stressed the importance of plain and simple language in communications. They concluded by asking for a commitment on what each person will do to make the experience better for consumers. As Lisa Wilson, Senior Director of the Member Experience, explained, “We aim to keep it fun, simple and impactful!”

They also hosted several of the CXPA webinars throughout the day. Lisa wanted to make sure I said, “Thank you to the CXPA for providing such a suite of opportunities to help us raise our game here at UHC!” Below are two photos of their day – love the selfie!

[Photos: UnitedHealthcare CX Day celebration]


Allianz also had a great celebration. I’ve posted about their strong communications program in the past, and it was evident on CX Day as always. They have televisions throughout their offices, and for this day they focused exclusively on customer experience topics. I’ve included one example below, which links to more of their posters. Because this is also Customer Service Week, the rest of the week they are focusing on service, with numerous activities designed to engage and recognize employees who work in Operations, including in the Call Center.

Allianz CX Day Poster

Recap

Lastly, Director of Customer Experience Barbara Norrgard explained, “We also recently ran a contest where we asked people what they are doing to achieve our aspiration and we ran the article today.” Winners were announced to the entire company and celebrated for the impact they have on the customer experience.

So there you have it – two more excellent ideas you can use for next year.  We’re down to only 363 planning days before our next CX Day!

Are you actively interfering with your mission?

I’ve been active in HOBY Minnesota for seven years now. HOBY is an international program that offers annual leadership seminars to high school sophomores, challenging them to log 100 hours of community service in the following year. We have a clear vision of what we need to measure. Whereas businesses often use revenue as a primary measurement, we focus on logged community service hours.

But as with revenue, logged hours are a trailing indicator. So how do we get a sense of how we’re doing while at the seminar?

Examples Across Fields

This isn’t just a non-profit question.  My clients struggle with this, as well.  When we build our customer experience program, how do we measure how we’re doing today, so we can predict tomorrow’s results?  And most businesses get it wrong, because they focus on what feels right.

Two quick examples: Read more

Add Measurements to Your Customer Experience Metrics

I led the “Developing Customer-Focused Metrics to Drive Your Customer Experience (B2B)” Unwound Sharing Session at last week’s CXPA Insights Exchange. This was a session where participants shared what’s working for them.

As we shared our best practices, one member pointed out how we were all focusing on metrics – questionnaire-based responses from customers. And sure enough, most of the debate revolved around whether to use Net Promoter Score, the Loyalty Index, satisfaction, or another survey-based metric.  This makes sense – we often have a budget for this type of work, and this is one of the few areas where the customer experience team may actually have some control.  So it’s what we typically use to gauge how our customer experience is doing.

And what’s wrong with that? Nothing, by itself. Except that these measurements can feel disconnected for the teams trying to deliver a great customer experience. Telling teams to improve their Net Promoter Score is the equivalent of telling managers to make their employees happier. Both are good goals, but neither gives any direction on how to do it. Read more

Off-Topic: Nominate a Minnesota community leader

As a high school sophomore, I attended the HOBY conference. HOBY is an international organization that develops leadership in high school students. In Minnesota, youth from the last six years have provided over 26,000 hours of service back to their communities!

We host an annual Cheers Dinner to recognize not just our youth, but also those in the greater community who truly make a difference. We are looking for nominations in the following categories:

  • HOBY Outstanding Youth Award:  Awarded to an outstanding Minnesota youth, 13 to 18 years old, who has a consistent track record of volunteerism and clearly demonstrates a sense of social responsibility. The recipient does not need to be a HOBY alumnus.
  • HOBY Outstanding Community Leadership Award: Awarded to an adult who makes a difference in the lives of youth, with a focus on developing leadership and integrity in children and young people. The nominee must be 18 years of age or older, but does not need to be a HOBY alumnus.
  • Outstanding HOBY Alumnus Award: An award given to a HOBY alumnus who continues to give back to the community. This person must be a HOBY alumnus, but does not need to be an alumnus of the Minnesota program.

You can view more in the HOBY Cheers Dinner Nomination Flyer, or go to http://jotform.us/dherbel/2014_HOBY_Cheers_Award_Nomination to nominate somebody today!

Logitech: Sometimes Automation isn’t Your Friend

I received this email today.  While this is a B2C example, I think we can all see the risks inherent to any of our businesses.  I did not edit this email at all, outside of deleting the reference number.

Hi Jim,
This is <agent first name>, from The Logitech Customer Care Team.
How’s everything going, Jim?  We have sent a response and we haven’t heard back anything from you. We just want to make sure that we were able to address your concerns before the system automatically tag your case as closed.
Is there anything else I can help you with? If your issue has not been resolved, please do not hesitate to update me.
If you have any additional questions, please feel free to visit our website at http://logitech.com  or reply to this e-mail.
This is your support reference number: [reference number].
Thank you for choosing Logitech and have a wonderful day.
Sincerely Yours,
<agent first name>
Logitech Customer Care Inc.

Now, this wasn’t the most personal email I’ve ever received.  Especially since this is the third time I have received this exact email from a tech!

Scripts are useful, and they help ensure consistent service. But over-reliance on them doesn’t help. Sending the exact same message three times shows you are inauthentic – let alone signing off as <agent first name>!

Have a great holiday weekend!

<customer experience blogger>

Customer Effort Score: How Hard is it to be Your Customer?

Are you familiar with the Customer Effort Score (CES)? It is rapidly gaining converts as a way to measure the transactions that make up your customer experience.

(Editor’s note: More details on the CES 2.0 can be found here.)

The Net Promoter Score, or NPS, measures your overall customer experience.  But it doesn’t show where to focus to improve your results.  Imagine telling your store manager, B2B sales team, or director of your call center only that “Your NPS scores are low. Fix them!”  Where do they begin?

Transactional measurements show which segments of your experience impact customer loyalty. Some companies have tried to use NPS to measure transactions, but it was never designed for this. Asking “Would you recommend your call center rep?” doesn’t work, as most customers have no desire to call your call center in the first place. Similarly, “Would you recommend [Company]’s website?” causes confusion – are your customers recommending the company behind the website, the design, the functionality, or all three? This is where the Customer Effort Score shines.

When customers have to expend more effort than they expect, they leave.  High effort equals low customer loyalty.  The CES helps you monitor this.
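To make the contrast concrete, here is a minimal sketch of how the two metrics are typically computed – my own illustration, not from any particular vendor; the function names and sample scores are hypothetical. NPS comes from a 0–10 “how likely are you to recommend?” question, while the CES 2.0 variant averages 1–7 agreement with “the company made it easy to handle my issue.”

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6),
    from responses to a 0-10 'how likely are you to recommend?' question."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

def ces(scores):
    """Customer Effort Score (CES 2.0 style): average agreement, on a
    1-7 scale, with 'the company made it easy to handle my issue'."""
    return sum(scores) / len(scores)

# Hypothetical responses
print(nps([10, 9, 8, 3]))   # 2 promoters, 1 detractor of 4 -> 25.0
print(ces([6, 7, 5, 4]))    # -> 5.5
```

Note the difference in scope: NPS is asked about the whole relationship, while CES is asked right after a specific transaction, which is what makes it actionable for a single team.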

Read more

Surveys – a Force for Good or Evil?

The Internet is a wonderful thing.  With little effort, we can connect to hundreds (or millions!) of people.  That access makes it really easy to conduct surveys.  So easy, in fact, that we no longer have to spend much time thinking about it.  And it’s obvious that many companies don’t.

While a proper survey can teach you about your customers, poor surveys lead you down the wrong path, sacrificing development dollars on delivering something your customers just don’t want.

Survey problems show up in three ways:

  • Asking for opinions instead of using readily available data
  • Outsourcing your thinking to your customers, asking them what you should develop
  • Piling on “just one more question”

Surveys vs. Data

It has now become easier to run a survey than to do actual research. Just because you can run a survey, though, doesn’t mean that you should.

About 18 months ago the financial management website Mint conducted a survey that has since been used by a host of speakers purporting to show the huge impact the economy has had on spending habits. One slide in particular gets heavy use.

You can see one presentation using this data at http://slideshare.net/MirrenBizDev/2010-new-business-conference-mintel. This slide is used to show how consumers are abandoning credit cards – 12% discontinued their use in 2009!

Not likely. Can you imagine the ripple effect if one out of every eight consumers completely discontinued the use of credit cards? The fallout would be massive!

The biggest problem is the question Mint asked.  Just because respondents said they discontinued credit cards does not mean they actually did it. Worse, the real data is only a short search away. What was the real change in credit card usage in 2009? According to the Fed, credit cards did decline – but by 0.2%! Yes, this is a dramatic change from the growth of previous years – but nothing like the impact that the Mint survey suggests.

Predicting the Future with Surveys

Survey data are frequently used as input to business decisions. Asking customers what we should develop feels right – but doesn’t work.  Consumers are notoriously bad at predicting what they want. Take this survey by the Consumer Electronics Association. While it’s dated, I saw the waste it generated at a consumer electronics retailer firsthand.

In this survey the CEA asked consumers what content they wanted to watch on their HDTVs. 47% said they wanted to watch home videos, while 44% wanted to view digital photos. This survey was cited in numerous business cases, and the retailer developed dozens of endcaps showing customers how they could do this through adding a computer to their home theater or connecting their Xbox 360 to the computers on their home network. We invested hundreds of thousands of dollars in these displays – likely millions when inventory is considered – yet sold very few.  What went wrong?

You can’t ask customers to predict the future – even their own behavior. When asked whether they wanted to see their home videos on their TV, almost half the respondents clicked yes. But clicking a Yes box is a far cry from actually purchasing a thousand dollars of equipment and installing it into your home theater. When it came to actually connecting a computer to the home theater, very few were willing to take that step in order to watch their videos or photos. Predicting the future is always risky business – this survey was just asking for trouble.

Yet, there is some truth to this data. Consumers clearly did want a better way of viewing their home photos. But when compared with the daunting task of getting computer content onto their TV, most took the sensible path of a digital photo frame – much easier, with almost the same result. Surveys are a great way to learn about your customer – but not a great way to learn what they will do.

Question 21.1.2.1

There is also the issue of the rapidly growing survey. Since it’s easy to ask 5 questions, why not 10? 20? Or, in my favorite “Bad Survey” example, why not 45?

This survey is by one of my favorite retailers. But it is a poster child for bad survey design, featuring a total of 45 questions, 40 of which are required. There’s even a question “21.1.2.1!”

When you’re writing a survey, it’s tempting to include everybody’s input. Gathering input is a good idea – but every question you add results in a few more customers dropping out. Surveys require discipline: prune the non-critical items to be sure customers will give you good data on what is left.

————

Does this mean that you don’t need surveys? Of course not – well-designed surveys provide critical input. But you need to spend the time to do surveys right. Some tips to success:

  1. Start with the end in mind. What is the #1 thing you need to learn? Is the rest critical? If you need to accomplish two very different things, consider a second survey.
  2. Decide whether a survey is really the right tool. If you want to understand behavior, observational data or behavioral analytics will typically give you much better results.  Surveys are best if you want to compare data over time, or compare results from two different groups.  Just keep in mind that the specific numbers (47% want to watch videos on their TV) are almost certainly wrong.  It’s not about predicting the future – it’s about understanding customer needs.
  3. If a survey is the right tool, determine how much patience your target market has. If it’s a free survey, keep it to no more than 5-7 minutes in length.  This is especially true for satisfaction or NPS surveys – keep these focused on this specific outcome, and use other surveys for market research.
  4. Consider using an expert to help you design the questions. Poorly-phrased questions will give you data – but sometimes customers answer different questions than you think you’re asking. If you cannot afford an expert, at least use an outsider to review what you develop.
  5. Test out the survey first. Have people outside the development team take the survey, and talk to them as they go through it – make sure their understanding of the question matches yours.

While the Internet makes it cheap and easy to do a survey, it also makes it cheap and easy to do crappy work. But if you take the time to do them right, surveys can be an excellent view into the Heart of Your Customer!

– Jim Tincher, Heart of the Customer