Hall of Shame

Is It Really This Hard to Proofread?

by Peter Leppik on Fri, 2017-09-22 14:13

From today's Daily WTF, we have a triptych of reader-submitted surveys that maybe should have been checked just a little more carefully before asking actual customers to fill out the forms.

First up, an online course evaluation which has a rather, um, interesting version of the tried-and-true Likert scale:

I Strongly Agree with the Number 1

Next up, Best Buy has a novel approach to the classic Net Promoter question. For true survey nerds, you can keep playing "Count the Mistakes" even after you stop laughing at the howler:

I'm going to go with "Not at all likely."

And finally, WebEx has figured out the secret to not getting poor scores on a survey:

What better way to say, "Your Feedback is Very Important to Us"?


Spurious errors

by Peter Leppik on Wed, 2016-06-15 12:35

I consider it something of a professional responsibility to take surveys when they're offered. I don't do every single survey (I tried that once a few years ago, and it wound up consuming way too much of my time), but I try to do them when I can.

A distressing theme is the number of bugs and errors I see in customer surveys. I haven't tracked it formally, but I would guess that at least one in ten online surveys I attempt is broken in some way. It's alarming how many companies are apparently unaware that their customer surveys don't work.

Today's example comes from MicroCenter, where the survey form told me, "The errors below must be corrected before you can proceed." That would be all well and good if there had been any errors in the form, but there weren't.

So I guess MicroCenter won't benefit from my important perceptions and observations.
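I have no idea what actually broke MicroCenter's form, but phantom "fix the errors below" banners are often a one-line validation bug. Here's a hypothetical TypeScript sketch (all the names are mine, not MicroCenter's) of one classic culprit: treating an empty error list as truthy.

    type SurveyForm = Record<string, string>;

    function validate(form: SurveyForm): string[] {
      const errors: string[] = [];
      if (!form.rating) {
        errors.push("Please select a rating.");
      }
      return errors; // an empty array when the form is fine
    }

    function onSubmit(form: SurveyForm): void {
      const errors = validate(form);
      if (errors) {
        // BUG: [] is truthy in JavaScript, so this branch runs even for
        // valid forms. The correct test is `if (errors.length > 0)`.
        console.log("The errors below must be corrected before you can proceed.");
        return;
      }
      console.log("Survey submitted:", form);
    }

    onSubmit({ rating: "5" }); // shows the error banner despite a valid form

One character's worth of difference between a working survey and one that rejects everybody.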

ABRA Is Not Subtle About Survey Manipulation

by Peter Leppik on Wed, 2016-01-13 15:57

Fellow CX professional Jason Kapel told me about a recent experience at ABRA Auto Body. He had his car repaired, and while the experience itself was fine, he found the flier to the right attached to his receipt.

If there were a Hall of Fame for the Hall of Shame, this would have a good chance of winning Most Blatant Example of Survey Manipulation. Not only does it tell the customer exactly how the shop wants each question answered, but at the bottom it instructs the customer not to take the survey at all if there was a problem with the repair.

Needless to say, this survey is not likely to get much honest and unbiased feedback from customers, nor is it going to identify ways to improve the customer experience. Pretty much the only thing this survey will do is allow the manager to boast about his great survey scores and claim whatever reward (or avoid whatever punishment) results from hitting his numbers.

All of which raises the question: what's the point of doing this survey?

I have to assume that either ABRA is unaware that its survey is being blatantly manipulated, or it doesn't care. Neither possibility speaks well of the company's commitment to improving its customer experience.

And don't come back!

by Peter Leppik on Wed, 2015-09-30 16:29

Another report of a car dealership's bad behavior in a customer survey: this time, a Ford dealer banned a customer because the customer gave honest but negative feedback.

It's as though the dealer doesn't actually care about providing a good experience, just getting a good survey.

Sadly, stories like this aren't even surprising anymore. The auto industry's survey processes are so heavily abused that it's almost more surprising if a dealer doesn't try to manipulate the survey.

I've written in the past that in situations like this, the company should just stop doing surveys rather than trying to fix such a broken process. The survey clearly isn't providing an honest measure of customer satisfaction, and all the cheating and manipulation is actively making the customer experience worse.

The Ford dealer who banned a customer for a bad survey is a great example of a company which has fallen into the "Metric-Centric Trap." The Metric-Centric Trap catches companies which, in an effort to become more customer-focused, become so caught up in measuring the customer experience that they lose sight of the actual customer.

Companies caught in the Metric-Centric Trap tend to focus their energies on gathering and improving their metrics rather than trying to understand and improve the customer experience. The problem with being Metric-Centric is that people are extremely complicated, and there is no way to directly measure all aspects of the customer experience. So any set of metrics is, at best, an approximate measurement of what's really going on.

Metric-Centric companies also tend to put heavy incentives on employees to improve their metrics. That can have the perverse effect of encouraging employees to focus on the specific activities being measured and ignore the things that aren't. And if it happens to be easier to "fire" an upset customer than to train employees to do a better job, you get the situation at the Ford dealership.

Breaking out of the Metric-Centric Trap is not easy, and requires a significant cultural change. But companies caught in this situation often waste considerable time and money spinning their wheels on customer experience, and may even be making things worse as a result of the effort.

So we circulated a Word doc...

by Peter Leppik on Fri, 2015-08-14 14:47

So someone emailed around a Word doc with the survey design, and someone else edited it, then forwarded it to another person who copy-pasted it into the survey software, and the first person said it was good to launch, so a fourth person uploaded the customer list and sent the invitations, and... wait, wasn't the Word doc already signed off? Why do we need to proofread it again?

Via The Daily WTF

Dear Xcel Energy: Here's Why Nobody Takes Your Survey

by Peter Leppik on Mon, 2015-06-15 17:20

Browsing the Xcel Energy website recently, I was accosted by one of those ubiquitous popup surveys. You know the kind: the ones that ask if you'll take a survey about your browsing experience.

These surveys typically have an abominable response rate, and it's not hard to see why. Should you agree to take the survey, you'll be asked to answer screen after screen after screen of questions: 32 in all, of which 25 are required.

Clearly someone didn't get the memo about survey length.

What if you happen to notice a problem with Xcel's website, maybe a broken link somewhere, and you want to be a good Samaritan and point it out to them?

Good luck with that. If you just write a short note in the comment box about the broken link and hit "Submit," the form refuses to accept your feedback until you've answered every required question.

So much for being helpful (this is one of the reasons why Mandatory Survey Questions are Evil). If you're not interested in rating the "balance of text and graphics" on the site, or providing no fewer than three different ratings of the search function, Xcel doesn't want to hear from you.
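If Xcel actually wanted that broken-link report, the fix would be simple: stop making every rating mandatory and accept any response that contains some feedback. A minimal sketch of what that might look like (the field names are my invention, not Xcel's):

    interface SurveyResponse {
      textGraphicsBalance?: number; // 1-5 rating, optional
      searchRatings?: number[];     // ratings of the search function, optional
      comments?: string;            // free-form feedback
    }

    // Accept the response if the visitor answered anything at all;
    // never block a submission just because a rating was skipped.
    function isSubmittable(r: SurveyResponse): boolean {
      return (
        r.textGraphicsBalance !== undefined ||
        (r.searchRatings?.length ?? 0) > 0 ||
        (r.comments?.trim().length ?? 0) > 0
      );
    }

    // A comment-only submission sails through:
    console.log(isSubmittable({ comments: "The outage-map link is broken." })); // true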

Not that it would have mattered anyway: notice the text at the very bottom of the survey, "Please note you will not receive a response from us based on your survey comments."

To the customer that means, "Nobody will read the comments you write."

Xcel might as well skip the bother of putting up a survey and just have a page that says, "We don't actually want your feedback, but thanks anyway." It would be a lot less trouble.

Can you spot the survey mistakes?

by Peter Leppik on Fri, 2015-03-27 15:02

Here's an amusing/horrifying story about a customer survey process gone horribly wrong:

Me: “Sir. Why are you giving us bad grades on the survey? You said everything was good.”

Customer: “Oh. Everything was good. I just didn’t like the movie. It was confusing.”

Me: “Sir, the surveys are not for the film itself. They’re for the theater and our staff.”

Customer: “Oh, but I want the studios to know I didn’t like the movie.”

Me: “That’s not how these surveys work. We don’t make the films; we just show them. The surveys are for customers to give feedback on how we performed. It’s a common mistake people make, but I’m going to strongly encourage you not to submit that survey.”

Customer: “Why not?”

Read the full story. Can you spot all the things this company is doing wrong in its survey process? Here's a partial list of mistakes I saw:

  1. The customer has to ask the staff for a survey form.
  2. The survey isn't designed to deal with the (apparently common) problem of customers reviewing the movie rather than the theater.
  3. At least some customers think the survey goes to the studio, not the theater chain.
  4. Customers can fill out the form with staff watching, and the staff can apparently try to talk the customer out of submitting it.
  5. Despite the flaws in the process, the survey is apparently used to fire and promote people.
  6. Even a single bad survey is enough to cause serious problems for the theater staff.

For extra credit: how would you design a feedback process for a movie theater which actually works for its intended purpose?

We Value Your Feedback!

by Peter Leppik on Wed, 2014-11-26 13:50

Via the Daily WTF, it seems that Adobe has a unique way to show customers just how seriously the company takes its customer surveys.

I mean, c'mon guys, I'm sure your QA department is pretty busy testing all your software products, but would it kill you to proofread the survey once in a while?

Explosion of Really Bad Surveys

by Peter Leppik on Fri, 2014-10-24 14:28

Local newspaper columnist James Lileks takes some well-deserved (and hilarious) potshots today at bad surveys.

He also reveals that, back in college, he did a turn as a telephone interviewer. So he at least has some sympathy for what it's like to be in the survey biz.

It does seem like there's been an explosion of really bad surveys over the past several years. Personally I blame the confluence of several factors:

  1. Online surveys have gotten really cheap and easy. This means organizations do more surveys but at the same time put less care and effort into designing the survey. Gone are the days when doing anything but the smallest survey meant hiring a market research company (for a minimum of $50K). It's distressingly common to see surveys riddled with typos, nonsensical questions, and other problems which make it clear nobody could be bothered to do a good job.
  2. Yet the long-form survey style somehow persists. When surveys were rare and expensive it made sense to ask every imaginable question because you needed to squeeze every possible insight from each participant. Today this mindset continues, even though surveys are cheap and common, and it's not unusual for a consumer to be asked to respond to literally hundreds of questions about a single three-day trip.
  3. And consumers are refusing to respond to bad surveys. Across the industry you hear people complaining that response rates are down on email surveys. But instead of asking the sensible question ("Why don't people want to take our survey?"), many companies respond by simply sending more survey invitations. To the same badly-designed, overly-long survey that 98% (or more!) of their customers won't fill out.

These problems won't be easy to solve, mostly because the root cause is that most organizations don't care as much about the customer experience as they say they do. This has always been the case--when it comes to customer service most companies talk the talk much better than they walk the walk--but the difference is that today it's easy to just do a survey instead of doing something.