Hall of Shame

This Is Why Survey Design Is Hard

by Peter Leppik on Fri, 2017-12-15 15:52

Not everyone will immediately spot the problems I saw with this online survey I got from Discover Card this week. But some people will, especially those with some knowledge of user interface design.

Screen capture from Discover Card survey

When designing this survey, someone thought it would be cute to have the selection buttons shade from red to green. I have no idea where the idea came from, but it seems like the sort of thing a graphic designer might propose, and that gets put in place without consulting anyone who understands user interface or survey design.

The first problem that jumped out at me is that the grey in the middle for "Neutral" makes the "Neutral" button look disabled (in the screenshot, "Neutral" is selected, which is why its circle is filled in rather than open). Greying out a control is the standard user interface convention for marking it disabled, so at first glance some users might think Neutral isn't actually an allowed option on this survey.

That's something that could affect the outcome of the survey at the margins. Does it? I have no idea--and I'm guessing that Discover Card didn't calibrate the survey to see whether its color choices make a difference. But it's certainly plausible, which is one reason survey design is hard. So many things can affect the results of a survey that you need to either understand the impact of each design choice or make sure the analysis and decision-making process is robust in the face of subtle survey biases.
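For what it's worth, a calibration like this isn't exotic: randomly serve two versions of the survey and test whether the answer distributions differ. Here's a minimal sketch in Python with invented counts (none of this is Discover's data), using a standard chi-square test of independence:

```python
# A minimal sketch of calibrating a survey design choice: randomly split
# respondents between the colored-button version and a plain version,
# then test whether the answer distributions differ. Counts are invented.
from scipy.stats import chi2_contingency

# Rows: survey version; columns: "Very Dissatisfied" through "Very Satisfied"
counts = [
    [48, 61, 122, 214, 155],  # red-to-green buttons (hypothetical data)
    [45, 70, 151, 203, 131],  # plain buttons        (hypothetical data)
]

chi2, p, dof, _ = chi2_contingency(counts)
print(f"chi2={chi2:.2f}, dof={dof}, p={p:.3f}")
if p < 0.05:
    print("The color scheme plausibly shifts the answer distribution.")
else:
    print("No detectable difference at this sample size.")
```

A few hundred responses per variant is usually enough to detect the kind of shift that would matter, which makes "we never checked" a hard excuse to accept.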

There was another problem I immediately spotted with this survey, one which most people won't notice but which 7%-10% of the male population will see right away (or, in this case, not see). The particular shades of red and green used here are hard to distinguish for people with the most common form of color blindness. So for me, and for a significant minority of the population, whatever Discover meant to communicate through the button colors is completely lost, because we can't easily tell the difference. Colorblind-safe color palettes are readily available, and this is another detail good user interface designers know to watch for. So I'm again led to the conclusion that Discover didn't think very carefully about its color scheme before dropping it into the survey.
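A rough way to catch this kind of problem before it ships is to check whether two colors differ in anything besides hue, since hue is exactly the cue red-green colorblind users lose. Here's a sketch using the WCAG relative-luminance formula; the hex values are stand-ins, not the actual colors from Discover's survey:

```python
# Rough heuristic: for red-green colorblind users the hue difference is lost,
# so two colors with nearly equal luminance become nearly indistinguishable.
# Uses the WCAG 2.x relative-luminance and contrast-ratio formulas.

def relative_luminance(hex_color: str) -> float:
    """WCAG relative luminance of an sRGB hex color like '#cc4444'."""
    def linearize(c: float) -> float:
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (int(hex_color.lstrip("#")[i:i + 2], 16) / 255 for i in (0, 2, 4))
    return 0.2126 * linearize(r) + 0.7152 * linearize(g) + 0.0722 * linearize(b)

def contrast_ratio(c1: str, c2: str) -> float:
    lighter, darker = sorted(
        (relative_luminance(c1), relative_luminance(c2)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)

# Stand-in values for a mid red and a mid green, not sampled from the survey
ratio = contrast_ratio("#cc4444", "#44aa44")
print(f"contrast ratio: {ratio:.2f}:1")
if ratio < 3.0:  # WCAG's 3:1 minimum for non-text elements
    print("Nearly equal luminance: red-green colorblind users may not be "
          "able to tell these apart.")
```

It's a blunt heuristic, and no substitute for testing with an actual colorblind-safe palette, but a check like this takes five minutes and would have flagged a pair like this before launch.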

(Those with normal color vision will probably be surprised to read that before I wrote this article, I actually used Photoshop to verify that the colors in the image really are red and green. It would be embarrassing to get such an important detail wrong.)

I sometimes like to say that survey design isn't hard; what's hard is dealing with the people who think it's easy. It's not hard to design a reasonable survey, but there are a number of details which can change the outcome. And because designing a survey looks easy, people often want to make changes without thinking through the implications. This survey is a great example of a seemingly trivial design choice which might actually affect the data, and which clearly isn't necessary.

Is It Really This Hard to Proofread?

by Peter Leppik on Fri, 2017-09-22 14:13

From today's Daily WTF, we have a triptych of reader-submitted surveys that maybe should have been checked just a little more carefully before asking actual customers to fill out the forms.

First up, an online course evaluation which has a rather, um, interesting version of the tried-and-true Likert scale:

I Strongly Agree with the Number 1

Next up, Best Buy has a novel approach to the classic Net Promoter question. For true survey nerds, you can keep playing "Count the Mistakes" even after you stop laughing at the howler:

I'm going to go with "Not at all likely."

And finally, WebEx has figured out the secret to not getting poor scores on a survey:

What better way to say, "Your Feedback is Very Important to Us"?


Spurious errors

by Peter Leppik on Wed, 2016-06-15 12:35

I consider it something of a professional responsibility to take surveys when they're offered. I don't do every single survey (I tried that once a few years ago, and it wound up consuming way too much of my time), but I try to do them when I can.

A distressing theme is the number of bugs and errors I see in customer surveys. I haven't tracked it formally, but I would guess that at least one in ten online surveys I attempt is broken in some way. It's alarming how many companies are apparently unaware that their customer surveys don't work.

Today's example comes from MicroCenter, where the survey form told me, "The errors below must be corrected before you can proceed." Which would be all well and good if there had been any errors in the form, but there weren't.
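I can only guess at the cause, but one classic way to ship a bug like this is to check whether the error list exists rather than whether it contains anything. A hypothetical sketch (none of this is MicroCenter's actual code):

```python
# A hypothetical reconstruction of the "errors with no errors" bug:
# the form blocks submission whenever the validator returns an error
# object at all, instead of checking whether it actually contains errors.

def validate(form: dict) -> list[str]:
    errors = []
    if not form.get("overall_rating"):
        errors.append("Please rate your overall experience.")
    return errors  # an empty list when the form is fine

def handle_submit_buggy(form: dict) -> str:
    errors = validate(form)
    if errors is not None:  # BUG: always true; an empty list is not None
        return "The errors below must be corrected before you can proceed."
    return "Thanks for your feedback!"

def handle_submit_fixed(form: dict) -> str:
    errors = validate(form)
    if errors:  # correct: only block when there is at least one error
        return "The errors below must be corrected before you can proceed."
    return "Thanks for your feedback!"

form = {"overall_rating": 5}
print(handle_submit_buggy(form))  # rejects a perfectly valid form
print(handle_submit_fixed(form))  # accepts it
```

One test submission of a valid form would have caught this, which is rather the point.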

So I guess MicroCenter won't benefit from my important perceptions and observations.

ABRA Is Not Subtle About Survey Manipulation

by Peter Leppik on Wed, 2016-01-13 15:57

Fellow CX professional Jason Kapel told me about a recent experience at ABRA Auto Body. He had his car repaired, and while the experience itself was fine, he found the flier to the right attached to his receipt.

If there were a Hall of Fame for the Hall of Shame, this would have a good chance of winning Most Blatant Example of Survey Manipulation. Not only does the flier tell the customer exactly how ABRA wants each question answered, at the bottom it instructs the customer not to take the survey at all if there was some problem with the repair.

Needless to say, this survey is not likely to get much honest and unbiased feedback from customers, nor is it going to identify ways to improve the customer experience. Pretty much the only thing this survey will do is allow the manager to boast about his great survey scores and claim whatever reward (or avoid whatever punishment) results from hitting his numbers.

All of which raises the question: what's the point of doing this survey?

I have to assume that either ABRA is unaware that its survey is being blatantly manipulated, or it doesn't care. Neither possibility speaks well of the company's commitment to improving its customer experience.

And don't come back!

by Peter Leppik on Wed, 2015-09-30 16:29

Another report of a car dealership's bad behavior in a customer survey: this time, a Ford dealer banned a customer because the customer gave honest but negative feedback.

It's as though the dealer doesn't actually care about providing a good experience, just getting a good survey.

Sadly, stories like this aren't even surprising anymore. The auto industry's survey processes are so heavily abused that it's almost more surprising if a dealer doesn't try to manipulate the survey.

I've written in the past that in situations like this, the company should just stop doing surveys rather than trying to fix such a broken process. The survey clearly isn't providing an honest measure of customer satisfaction, and all the cheating and manipulation is actively making the customer experience worse.

The Ford dealer who banned a customer for a bad survey is a great example of a company which has fallen into the "Metric-Centric Trap." The Metric-Centric Trap catches companies which, in an effort to become more customer-focused, become so caught up in measuring the customer experience that they lose sight of the actual customer.

Companies caught in the Metric-Centric Trap tend to focus their energies on gathering and improving their metrics rather than trying to understand and improve the customer experience. The problem with being Metric-Centric is that people are extremely complicated, and there is no way to directly measure all aspects of the customer experience. So any set of metrics is, at best, an approximate measurement of what's really going on.

Metric-Centric companies also tend to put heavy incentives on employees to improve their metrics. That can have the perverse effect of encouraging employees to focus on the specific activities being measured and ignore the things which aren't. And if it happens to be easier to "fire" an upset customer than to train employees to do a better job, you get the situation at the Ford dealership.

Breaking out of the Metric-Centric Trap is not easy, and requires a significant cultural change. But companies caught in this situation often waste considerable time and money spinning their wheels on customer experience, and may even be making things worse as a result of the effort.

So we circulated a Word doc...

by Peter Leppik on Fri, 2015-08-14 14:47

So someone emailed around a Word doc with the survey design, and someone else edited it, then forwarded it to another person who copy-pasted it into the survey software, and the first person said it was good to launch so a fourth person uploaded the customer list and sent the invitations, and... wait, wasn't the Word doc already signed off? Why do we need to proofread it again?

Via The Daily WTF

Dear Xcel Energy: Here's Why Nobody Takes Your Survey

by Peter Leppik on Mon, 2015-06-15 17:20

Browsing the Xcel Energy website recently I was accosted by one of those ubiquitous popup surveys. You know the kind: one which asks if you'll take a survey about your browsing experience.

These surveys typically have an abominable response rate, and it's not hard to see why. Should you agree to take the survey you'll be asked to answer screen:

after screen:

after screen:

after screen:

after screen:

of questions. Thirty-two questions in all, of which 25 are required.

Clearly someone didn't get the memo about survey length.

What if you happen to notice a problem with Xcel's website, maybe a broken link somewhere, and you want to be a good Samaritan and point it out to them?

Good luck with that. Because if you just write a short note in the comment box about the broken link and hit "Submit," here's what you get:

So much for being helpful (this is one of the reasons why Mandatory Survey Questions are Evil). If you're not interested in rating the "balance of text and graphics" on the site, or providing no fewer than three separate ratings of the search function, Xcel doesn't want to hear from you.
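The mechanics are easy to picture: when every rating is marked required, the validator can't tell an abandoned survey from a customer who only came to report a bug. A hypothetical sketch of the kind of rule that produces this dead end (field names invented, not Xcel's actual form):

```python
# A hypothetical sketch of why mandatory questions block useful feedback:
# the validator demands all 25 required ratings, so a submission that
# contains only a helpful comment is rejected. Field names are invented.

REQUIRED_RATINGS = [f"q{i}" for i in range(1, 26)]  # 25 required questions

def validate(submission: dict) -> list[str]:
    """Return the list of required questions left unanswered."""
    return [q for q in REQUIRED_RATINGS if submission.get(q) is None]

helpful = {"comments": "The link to your outage map is broken."}
missing = validate(helpful)
print(f"{len(missing)} required questions unanswered -- feedback rejected")
```

Making the comment box a valid submission on its own would cost nothing and capture exactly the feedback Xcel is currently turning away.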

Not that it would have mattered anyway: notice the text at the very bottom of the survey, "Please note you will not receive a response from us based on your survey comments."

To the customer that means, "Nobody will read the comments you write."

Xcel might as well save itself the bother of putting up a survey and just post a page that says, "We don't actually want your feedback, but thanks anyway." It seems like a lot less trouble.

Can you spot the survey mistakes?

by Peter Leppik on Fri, 2015-03-27 15:02

Here's an amusing/horrifying story about a customer survey process gone horribly wrong:

Me: “Sir. Why are you giving us bad grades on the survey? You said everything was good.”

Customer: “Oh. Everything was good. I just didn’t like the movie. It was confusing.”

Me: “Sir, the surveys are not for the film itself. They’re for the theater and our staff.”

Customer: “Oh, but I want the studios to know I didn’t like the movie.”

Me: “That’s not how these surveys work. We don’t make the films; we just show them. The surveys are for customers to give feedback on how we performed. It’s a common mistake people make, but I’m going to strongly encourage you not to submit that survey.”

Customer: “Why not?”

Read the full story. Can you spot all the things this company is doing wrong in its survey process? Here's a partial list of mistakes I saw:

  1. The customer has to ask for a survey form, from the staff.
  2. The survey is designed in a way that doesn't deal with the (apparently common) problem of customers reviewing the movie rather than the theater.
  3. At least some customers think the survey goes to the studio, not the theater chain.
  4. Customers can fill out the form with staff watching, and the staff can apparently try to talk the customer out of the survey.
  5. Despite the flaws in the process, the survey is apparently used to fire and promote people.
  6. Even a single bad survey is enough to cause serious problems for the theater staff.

For extra credit: how would you design a feedback process for a movie theater which actually works for its intended purpose?

We Value Your Feedback!

by Peter Leppik on Wed, 2014-11-26 13:50

Via the Daily WTF, it seems that Adobe has a unique way to show customers just how seriously the company takes its customer surveys (click the image for a bigger version which is actually legible):

I mean, c'mon guys, I'm sure your QA department is pretty busy testing all your software products, but would it kill you to proofread the survey once in a while?