The Customer Service Survey

Vocalabs' Blog


Issue #71 of Quality Times is published

We just published Issue #71 of our mostly-monthly newsletter, Quality Times.

In this issue we discuss the National Customer Service Survey results for the first half of 2013. We published midyear updates for both the Communications and Banking reports. The Communications report is the old mobile phone report, which we are in the process of expanding to include wireline phones, Internet service, and cable/satellite TV services. Included in this midyear update is some preliminary data on Comcast and DirecTV.

As always I hope you find this useful and informative. I welcome any comments and suggestions.

SpeechTEK 2013 Wrap-Up

SpeechTEK 2013 is over (except for a few die-hards). I presented in two sessions and spent a lot of time talking to clients (present and future). Here are some of my favorite moments, embellished and redacted as I see fit:

  1. In a Tuesday morning panel about mobile devices, AI, and speech, the discussion had turned to the "always listening" model for cloud-based personal assistants when the moderator basically asked, "Don't you think it's a little creepy that some big company is going to be listening to everything we say?" Maybe it's a sign of the moment, but there was actually a pretty good discussion of privacy and security implications, instead of just dismissing the concerns.
  2. A Wednesday session where a presenter bravely admitted that, because of an obscure glitch, their phone system had been hanging up on a certain population of callers. Every. Single. Time. Because the problem was outside the testing parameters, they didn't discover it until it was written up in the newspaper. Ouch.
  3. Tuesday's networking reception: the food didn't run out after the first hour (like it has in some prior years), and people stuck around and networked. Huzzah!

 

Tracking Bad Service with a Bad Survey

After my rant about Whirlpool a couple days ago, it seems only fitting that Whirlpool also provided me with an excellent case study in how not to do a customer service survey.

Friends, if you are looking for accurate, actionable, and meaningful customer feedback, don't do the following:

  1. Call back 36 hours after a customer service call for a survey.... [To make sure I've forgotten important details]
  2. ....with an automated IVR survey.... [Are you trying to make me mad?]
  3. ....using an unconventional question scale.... [1-5 with 1 being best, 5 being worst, opposite of the usual]
  4. ....with no opportunity to provide free response feedback.... [I sure had things to say. Too bad I couldn't say them]
  5. ....and no follow-through with the customer. [I gave very low scores. What happened? Nothing]

When I called Whirlpool to get my brand-new drier fixed, the company sent me the very clear and unambiguous message that they didn't care about my problem and would not do anything more than the absolute bare legally-required minimum to help me.

Whirlpool's customer service survey manages to reinforce that message. This is not listening to customers, it's a kabuki dance of customer feedback designed to create the form of a survey with none of the substance (all while, not incidentally, spending as little money as possible).

And I'm not alone. Read about the numbingly legalistic runaround a New York Times reader got over a defective microwave oven.

Postscript: After I tweeted my earlier rant to Whirlpool, they called me in response. Their message? That the earliest they could get a technician to look at the drier would be in a week. I'm starting to think that the only actual humans at Whirlpool are mind-controlled slaves of the SAP database which makes all the decisions. Evidently, nobody who works there is allowed empathy, good judgment, or common sense.

Dear Whirlpool, When I Spend A Couple Grand on Appliances, I Expect Prompt Repair Service When Something Breaks a Week Later

A week ago, I had a new Whirlpool refrigerator and clothes drier installed. The drier replaced a Whirlpool my wife and I bought when we were first married over 20 years ago.

This new drier has not been nearly as trouble-free as its predecessor, however. What started out as a slight rattle after a few days of use has developed into an awful grinding noise which sounds like a piece of industrial equipment trying to destroy itself.

Even if this noise is harmless (which I doubt), it was time to call for repair. This morning I called the appliance store where I bought it, hoping they might be able to come out today to look at it. My family is going to the North Woods in a few days, and as soon as we get back from that I'm going on a business trip. We really need to be able to get our laundry done. The store's service department was booked, but they gave me the number of another local Whirlpool specialist.

The other repair company was also booked for today, but they told me that even if they had time available, they would not be allowed to fix my drier unless I went through Whirlpool's main customer service line. This is apparently a new rule Whirlpool recently put in place. But since Whirlpool can dispatch any repair shop in the city, maybe they would be able to find an earlier appointment.

Unfortunately, Whirlpool did not live up to my expectations.

After I navigated their phone menus and waited on hold, Whirlpool's service line took my details and politely informed me that the earliest they could get someone out to look at my broken drier would be next Tuesday. I explained that this really wasn't acceptable because of our upcoming vacation, and the friendly customer service representative told me that there was absolutely no way they could get anyone out any earlier. That extra mile is apparently just too far to go.

So I (reluctantly) accepted an appointment for after our Trip Up North, almost two weeks in the future. But like the cable guy, Whirlpool also apparently can't schedule with any more precision than "between 8 AM and noon." That will require my wife to take a half-day off work (I will be on a business trip). And there is absolutely no way they can give us the first slot of the day or do anything else to try to make it easier on us. We are expected to do whatever it takes to accommodate the whims of Whirlpool's dispatch system.

[As an aside--I wonder if Whirlpool's call center allows its agents to take a half-day off work to stay home for the appliance repair guy?]

I fully appreciate that this is a First World Problem, but when I spend well over $2,000 on new appliances, and one of them starts sounding like a drill press on "auto-destruct" mode after only a week, I expect the manufacturer to fix it.

Promptly.

Meaning today, or maybe tomorrow if they're backed up. Making me wait a week says that they're (a) too cheap to pay some overtime to clear the backlog, (b) too disorganized to hold standby and reserve slots in case of emergencies or cancellations, and (c) not concerned enough about me as a customer to do more than the bare minimum.

NCSS Banking Midyear Update is published

We just published a midyear update for the National Customer Service Survey on Banking customer service. The Executive Summary is available from our website (along with lots of other research).

In addition to updating the key metrics for Bank of America, Chase, Citi, and Wells Fargo, we also took a closer look at two common issues in customer service: having to be transferred to a different agent, and spending too much time on hold.

Related Research

>> Get a copy of the National Customer Service Survey on Banking, midyear update 2013

>> Get a copy of the Agile Customer Feedback Whitepaper

>> Browse other Vocalabs Research

Credibility

The power of a customer service survey comes from its credibility.

Credibility is what convinces customer-facing employees that they need to listen when a customer says he wasn't served well. Credibility is what gives executives the confidence to make decisions based on the customer feedback.

Unfortunately, many customer survey programs lack credibility. Without credibility, negative feedback (which is normally a gift!) is easy to ignore. Some common things which undermine credibility in a survey are:

  • Survey results aren't available until weeks or months after a customer interaction.
  • Obvious flaws in the survey process.
  • Manipulation of which customers take the survey, or even outright cheating and misreporting results.
  • No way to connect an individual employee's actions with the results of the survey.

I have personally observed: call center employees only allowing "good" calls to go to the survey, an executive who misreported survey scores in order to hit his target, survey scores delivered three months after a customer experience, and survey questions so confusing that nobody knew what they were supposed to mean. When employees see these things happening, it gives them license to ignore the feedback process because they perceive it as no longer meaningful or relevant.

Here are some tips for making a survey process more credible:

  1. Collect feedback as close to real-time as is appropriate and practical.
  2. Connect individual surveys to the customer experience which the survey is about.
  3. Deliver data in real time to all levels of the customer service operation.
  4. Update survey questions regularly to keep them relevant to current business needs.
  5. Get input from throughout the organization about how to improve the survey.
  6. Monitor for signs of manipulation and cheating (one simple way to check for this is sketched below).
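
As a rough illustration of that last tip (this is my own sketch, not a prescribed method), one simple check is to compare how often each agent's calls reach the survey against the team as a whole and flag unusually large gaps:

    def flag_possible_survey_gating(calls_by_agent, surveys_by_agent,
                                    min_calls=50, ratio=1.5):
        # Agents whose calls reach the survey far more (or less) often than the
        # team average may be steering only "good" calls to the survey.
        total_calls = sum(calls_by_agent.values())
        total_surveys = sum(surveys_by_agent.values())
        overall_rate = total_surveys / float(total_calls)
        flagged = []
        for agent, calls in calls_by_agent.items():
            if calls < min_calls:
                continue  # too little data to judge this agent
            rate = surveys_by_agent.get(agent, 0) / float(calls)
            if rate > overall_rate * ratio or rate < overall_rate / ratio:
                flagged.append((agent, rate))
        return flagged

It won't catch every form of manipulation, but a simple, routinely-run check like this makes it much harder for gaming of the survey process to go unnoticed.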

NCSS Communications Midyear Update

We just released the 2013 Midyear Update for the National Customer Service Survey report on Communications Services.

This is the report which we used to call "Mobile Phones," but beginning this year we are collecting data on customer service at other consumer communications services companies, like landline phones, cable/satellite TV, and high speed Internet service.

We don't yet have enough data to provide full statistics on the new companies, but we have included a "sneak peek" at some of our preliminary results. You can get a copy of the report for free on our website.

Related Resources

>> Get a copy of the NCSS Communications Services 2013 Midyear Update

>> Get a copy of the NCSS Mobile Phone 2012 Executive Summary

>> Read about the 2012 NCSS data

Customer Experience and Home Repair

It's not just big companies which provide bad customer service. Just ask any homeowner. Contractors are notorious for providing inconsistent-to-terrible customer service.

In my home, we just finished an insulation upgrade project. Of the first four contractors I reached out to last fall, two never responded at all. The other two came and looked at our project and promised to send us bids, but we never heard from either of them again (this is sales 101, guys, c'mon).

So we found two more contractors through Angie's List; they were both more responsive and did write estimates for us, but one of them didn't respond promptly when we had questions about his bid, and wasn't able to provide good answers.

The contractor we eventually chose did everything right from a customer service perspective: they scheduled the work well in advance, showed up right on time, explained what they were doing and answered questions, finished on time and cleaned up after themselves.

But the amazing thing is that five of the six contractors eliminated themselves not on the basis of price or the quality of their work, but because they couldn't handle basic communicating with the customer (i.e. me). And I don't think my experience is all that unusual, either.

So why does this happen? And how do companies like this stay in business?

Some issues are just unique to the world of home repair and contracting: schedules slip, weather causes delays, employees are sometimes unreliable, etc.

But I think there's something else going on. Most home contractors are small businesses. Sometimes the owner has the customer service gene, but often he's a tradesman first and sales/service come far behind.

And there will always be customers who hire the first guy they can reach, or who have an emergency and will put up with almost anything to get the job done quickly. So without a brand to protect, a contractor can get by even if he doesn't respond to every sales inquiry or communicate well with customers. He may not build a big business, but that might not be what he wants.

That's why homeowners like me have to call a half-dozen different companies in order to find the one who is easy to do business with.

Survey Fail

This picture (which comes to me from Failblog) nicely encapsulates two of the key challenges in collecting customer feedback:

First, you need to make sure your customers can actually take the survey. Many survey methodologies systematically exclude some segment of your customer base. Unless you account for that, there's a risk you could be overlooking something important.

Second, you need to make sure the survey itself doesn't bias the responses, either through poor questions, or because the process (like in this case) annoys your customers.

For what it's worth, I had my own problems with an AT&T text message survey last year.

Related Resources

>> This Three Question Survey Takes Three Hours (and really has four questions)

>> Process Bias

>> Agile Customer Feedback

Easy Ideas for Getting More from your Customer Service Survey

A few weeks ago we released the Customer Service Survey Maturity Model, and along with it, an online self-assessment tool to find out where your feedback program fits into the maturity spectrum.

Maybe I should have expected this, but I was surprised to see several people use the self-assessment tool as a way to generate some ideas for improving their survey programs. In the self-assessment we ask about a few dozen different "pretty good practices," and that got some mental gears turning.

The great thing about using the tool this way is that you can take the self-assessment as many times as you want. So you can preview what kind of feedback program you would have if you adopted certain practices.

So I want to encourage everyone to give this a try. It's quick, easy, and free, and you'll probably come out of it with some things you can be doing to get more from your customer feedback program.

>> Take the Customer Service Survey Maturity Self-Assessment

Three Ideas to Get Started with Customer Service Feedback

I often get asked how to get started with a customer feedback program, when the organization doesn't have any surveys or other feedback processes around the customer experience. Our Customer Service Survey Maturity Model is a useful tool for understanding what's possible and how to get the most out of a feedback program, but most companies don't want to dive right into a maturity level four program on day one. That's a lot of commitment (and culture change), and it usually makes sense to start smaller and work up.

So what's the best way to get started?

I see a lot of companies which decide they want to start a voice of the customer program, but don't give much consideration as to why or what they hope to accomplish.

In keeping with the principles of Agile Customer Feedback, even a "starter program" should address current business needs, respect and listen to customers, and be designed to tell you something you don't already know.

There are lots of ways to get your feet wet in customer feedback. Here are three ways you can start collecting customer feedback in a customer service environment which are likely to be easy and show a quick return on your investment of money and effort.

Idea 1: Start asking a "Question of the Week" on customer service calls.

Customer feedback can take many forms, and it doesn't have to be a formal survey program. You can start by having customer service reps ask a "question of the week" at the end of calls and making a note of the answer. For example:

  • "One last thing before we hang up. We're looking for ways to improve our customer service. Do you have any suggestions?"
  • "Before we go, we're looking for ideas to improve our website. Is there anything you wish our website did better?"
  • "One quick question. We're trying to improve the automated part of our calls. Did you have any trouble getting through to a person?"

This is not going to be a scientific survey with anything like statistically valid data. But it will do two things: give you some ideas of what your customers would like to see improved, and get your employees into the habit of listening to customer feedback.

Tips:

  1. Don't have the customer service reps ask for feedback on themselves. You won't get honest responses, and it will be very awkward.
  2. Don't expect a random sample, since CSRs won't want to ask for feedback from the unhappy customers. This is an idea-generating exercise, not a science experiment.
  3. As a team activity, have the CSRs talk about the feedback they got and what they think it means. This helps build the idea that everyone should be listening to the customer.

Idea 2: Do a survey of your callers who don't talk to a person.

In an earlier article I talked about how the IVR is a big blind spot for many companies' customer feedback programs. So doing a survey of callers who stayed in the IVR is almost guaranteed to tell you something you don't know. These should be follow-up surveys (preferably phone interviews) conducted as soon as possible after the customer hangs up (but not on the same call, since that biases the sample). Getting 500-1,000 surveys will give you a good statistical sample and the ability to get a good understanding of your different caller populations, but even 100 surveys will be enough to generate some new ideas for improving the customer experience.
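
For a rough sense of why those sample sizes are reasonable, here is a small sketch (my own illustration, not part of the original post) of the worst-case margin of error for a proportion at roughly 95% confidence:

    import math

    def margin_of_error(n, p=0.5, z=1.96):
        # Worst-case (p = 0.5) margin of error for a proportion at ~95% confidence
        return z * math.sqrt(p * (1 - p) / n)

    for n in (100, 500, 1000):
        print("n = %4d: +/- %.1f percentage points" % (n, margin_of_error(n) * 100))
    # n =  100: +/- 9.8 percentage points
    # n =  500: +/- 4.4 percentage points
    # n = 1000: +/- 3.1 percentage points

In other words, a few hundred interviews is plenty to see the big patterns, even if it won't resolve one-point differences between caller segments.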

Some things to ask about are:

  • Did the caller actually self-serve in the IVR, or hang up in frustration?
  • What tasks should callers be able to do in the IVR but often struggle with? What tasks do callers want to do in the IVR but aren't supported?
  • What are the barriers to either self-service (when appropriate) or getting to the right person (when self-service isn't an option)?

Many companies have the general idea that their IVR systems don't work well for customers, based on the complaints and other feedback they get. This one-time survey will quantify that, help identify which customers are having what sorts of problems, and point out ways to make it work better. Often small changes, like updating the options in a menu, can have a large effect.

Idea 3: Add some feedback to your training program.

Hearing the voice of the customer can be a powerful training tool, and the most effective way to deliver this is through the literal voice of the customer--that is, the recording of a follow-up interview with the customer.

Many contact centers already do training sessions where supervisors listen to calls with the customer service reps and offer feedback. It's easy to add customer feedback to this process. You will want to call a small sample of customers back right after their call, and ask for general feedback and suggestions. Play the recording of this interview for the CSR when reviewing the same call for training purposes.

Some tips for implementing this:

  1. You can start very small and informal, and scale it up as appropriate. To begin with it may be as informal as having a supervisor call the customer back and ask, "I'm going to be training Alice in a few minutes, do you have any suggestions?" This can grow into a program with specific questions, defined metrics, and a statistical sample.
  2. Tempting as it may be, don't make the CSR part of the call back to the customer. This will just be awkward. Don't try to get feedback during the customer service call, since you'll only get the happy customers. Call back.
  3. Have the CSR listen to the interview recording first, and then listen to the original call. That puts the agent in the shoes of the customer, and makes him or her more sensitive to how the customer viewed the call.

Pretty Good Practice: Survey About the IVR

The overwhelming majority of contact center surveys completely ignore a huge and important group of customers: those customers who never spoke to an agent.

Now, there are good reasons for focusing surveys on agent-handled calls. Often, a key purpose of the survey is measuring and training the agents, and the agent experience is likely to be less consistent than the IVR. And some contact center surveys do include a question or two asking about the automated part of the call.

But what about those customers who:

  • Never needed to speak to an agent because they could self-serve in the IVR
  • Wanted to get to an agent, but couldn't figure out how
  • Gave up waiting to speak to a person
  • Had the IVR hang up on them (yes it happens, and more often than you think)
  • Would have been willing to use the IVR, but found it confusing

Those people are your customers, too, and in many companies there are more of them than the people who actually spoke to a customer service representative.

Chances are, the experience of those customers who never left the IVR is having more of an impact on the customer satisfaction and operational cost of the call center than the customers who spoke to a person.

And most companies are ignoring these customers completely in their survey process.

This is a huge blind spot, and an enormous opportunity to improve the customer experience and save money.

Over Scripting

According to a report from Consumerist, some Best Buy customer service reps are not happy that they are being given new scripts and aren't being allowed to deviate.

Over-scripting is a problem in the customer service business, since there's a tendency for management to think they can anticipate every customer inquiry and write the perfect answer. Not true, and my observation is that most contact centers have been moving to give more flexibility to CSRs, not less.

Over-scripting causes a lot of problems:

  • It makes it much harder to establish a rapport with the customer and understand their problems.
  • It's usually obvious that the CSR is reading from a script, and customers hate it.
  • CSRs don't like being micro-managed about every word which comes out of their mouths.
  • It tends to lead the company to focus more on compliance and less on service.

Over-scripting is a really easy trap to fall into in the survey business. When performing a structured interview it really is important to be completely consistent about every word in the question. Even small word choices can shade the meaning of a question and bias the results.

As a result, many companies which do phone interviews create extremely rigid scripts. I've seen instances where the interviewer is literally not allowed to deviate even one word during the course of the entire survey--even if the participant has questions or problems (responses to the participants' questions are also scripted, of course).

That's wrong. The whole point of doing an interview (as opposed to an automated survey) is to establish a rapport with the customer, go in-depth, and leave the customer feeling that you actually listened. Over-scripting makes it impossible to do these things.

On the other hand, we really do need to have consistent questions in order to get consistent and useful data. It's not helpful to go to the other extreme--like one phone survey I encountered which consisted of just a single open-ended question: "How was your experience?" That format might be useful for getting some high-level qualitative feedback for training, but it's not going to yield meaningful statistics for decision-making or tracking.

The key is to strike a balance between being consistent on the survey and letting the interviewer be human. We clearly define which parts of the script need to be read exactly as written, and where an interviewer can go off-script as needed. We recognize that some questions don't need to be strictly scripted (for example, when we want to know the reason behind some answer), and we allow the interviewer to acknowledge that the survey questions are scripted (which the customer knows anyway).

This allows us to have a more meaningful dialogue with customers and leave them feeling that we really listened to them, while still getting consistent metrics for tracking and analysis by our clients.

Issue 70 of Vocalabs' Newsletter

Issue 70 of Vocalabs' newsletter, Quality Times, has been published. E-mail subscribers should be receiving it shortly.

In this issue we announce the Customer Service Survey Maturity Model, as well as an online Maturity Self-Assessment Tool. This will help you understand how mature your ongoing customer service survey efforts are, in the context of the principles of Agile Customer Feedback. The Self-Assessment tool will also offer some suggestions for practices you can adopt to get more useful, actionable customer feedback.

We also discuss collecting customer feedback in a business-to-business context--it's not as different as many people assume.

As always I hope you find this useful and informative. I welcome any comments and suggestions.

The Customer Service Survey Maturity Model

Based on our years of experience helping different companies build effective customer feedback programs, we have developed the Customer Service Survey Maturity Model. Today we published a white paper outlining the model, as well as a simple online self-assessment tool which anyone can use to gain an understanding of where their customer feedback practices stand, using the principles of Agile Customer Feedback as a guide.

The maturity model and the self-assessment tool are a quick way to get a sense of what's possible and what changes to your customer survey process are likely to yield the most bang for your buck. The model evaluates the three functional areas of any customer service survey process: Data Collection, Reporting, and Process. In each area, the company's maturity is evaluated based on which practices the company uses.

Since Data Collection, Reporting, and Process all support each other, the most fertile ground for improvement is usually where the company is least mature.

We're hoping this is a useful tool for our clients and the community in general. I welcome any comments or feedback.

Extreme Reactions

Just how extreme of a reaction can you get from poor customer service?

Snopes just published a story about a bank which lost a million-dollar account and a 30-year customer because it refused to validate a $0.50 parking coupon.

The incident happened in the late 1980's (when a million bucks was real money) at the Old National Bank in Spokane, WA (now part of U.S. Bank). John Barrier was in the construction business and was dressed in shabby clothes when he went to the bank to cash a check.

Both the teller and the bank manager, apparently failing to recognize the importance of the customer, refused to validate Barrier's parking coupon to save him the $0.50 it cost to park at the bank. As a result, Barrier closed his account the next day and moved to a different bank down the street.

Why such a move? We can guess it probably wasn't the fifty cents per se. From the quotes John Barrier gave in a newspaper article at the time, it sounds like he felt the bank didn't give him the basic respect he deserved as a customer: speaking of the bank manager, Barrier said, "He looked me up and down and stood back and gave me one of those kinds of looks." He was also quoted as saying, "If you have $1 in a bank or $1 million, I think they owe you the courtesy of stamping your parking ticket."

The lesson from this is that customers will sometimes have an extreme reaction to a single poor experience. If a customer feels he's been disrespected, then it becomes about something much more basic than a financial transaction. It's about dignity, social standing, and respect.

Business-to-Business Customer Feedback

The overwhelming majority of customer service surveys happen in the business-to-consumer world.

Which, if you think about it, is a little strange given that the business case for an effective customer feedback program is much stronger in the business-to-business world. In B2B, the value of each customer is usually higher, the acquisition costs higher, and the length of the relationship longer. Reputation and word-of-mouth are just as important in B2B, but they're less likely to be visible through social media and other public channels.

We have a number of very successful B2B customer feedback programs in place, and our clients have found it to be a powerful tool. Many of the tools and techniques we use in B2C customer feedback are just as useful in B2B, such as real-time interviews, delivering data to the front lines in real-time, and a robust service recovery process.

There are a few things to be aware of, though, when implementing a B2B customer feedback program:

  • Reaching the right person can be more of a challenge than in B2C. It's common to have multiple contacts for each customer and a lot of out-of-date information.
  • B2B customers are very willing to talk to an interviewer for a few minutes--refusal rates are typically very low. However, you need to respect their time, and automated surveys are not likely to be successful.
  • B2B customers expect a high level of service, and if they have a complaint they expect it to be addressed. A proper service recovery process is not optional.

So don't listen to the people who think you can't do customer feedback in B2B--our clients know it isn't true.

How Not to Do a Phone Interview

I'm a big believer in using phone interviews to get real-time in-depth feedback on customer service. There's simply no better way to have an actual conversation with a customer, get him or her to open up, and find out how the experience was.

But there's a right way and a wrong way.

Over the weekend I was asked to do a survey about a phone call I made to a nurse line two weeks ago. My wife had had a bicycle accident, and I needed help deciding whether she needed to go back to the doctor to look at a possibly-infected wound. She's doing much better now, thanks.

The interview was practically a case study in how not to do a customer service survey:

  • The interview took place almost two weeks after the original call. Pretty Good Practice: The faster you can reach the customer the better--calling back within a few minutes is best, though in this particular situation waiting several hours may be more appropriate. People's memories start to degrade very quickly, and after two weeks I remember very few details of my call other than the fact that it took place. That means that my answers are not going to be very detailed or specific, and I won't be able to give meaningful answers to a lot of questions.
  • At no time did the interviewer tell me how long the survey would take. Pretty Good Practice: Always inform customers how long the survey will take before asking if they'll participate. Seriously. They're doing you a favor, the least you can do is have a little respect.
  • The survey was 14 minutes long (according to the timer on my phone). Pretty Good Practice: Keep phone interviews under five minutes if at all possible. Fourteen minutes is way too long, and likely indicates the people designing the survey didn't have the discipline to cut questions which aren't useful or necessary. On the other hand, this explains why the interviewer didn't tell me how long it would take. Again, have some respect for customers and their time.
  • Many of the questions were slight variations of each other. Pretty Good Practice: Make sure different questions are really asking different things. This is where forcing yourself to write a five-minute interview script is a useful discipline. It quickly becomes obvious that you don't need to have a question about whether the nurse was "caring" and then another question about whether she was "helpful." Pick the two or three things which are most important and leave the rest off.
  • Some of the questions were confusing or didn't apply to the situation. The interviewer was not allowed to do anything but repeat the question. Pretty Good Practice: Adjust questions to make sure they make sense, and give the interviewer some latitude to help the customer understand the question. We all write bad survey questions from time to time, I get that, and you don't want interviewers going off the rails and re-interpreting the survey. But don't sacrifice accuracy on the altar of consistency, either. One of the great advantages of an interview over an automated survey is the interviewer can help a confused customer understand the intent of the question. Here's my slightly embellished recollection of how one of the questions on my interview went:

    Interviewer: How satisfied are you with the wait?
    Me: What do you mean by the wait? The wait to speak to a person?
    Interviewer: I'm only allowed to read the question, you can interpret it however you want.

    Seriously? Would it have killed them to let the interviewer tell me that the question was about how long it took for someone to answer my call? Knowing what I do about call centers and the likely purpose of the survey, I could infer that this was what the question was about. But someone else might have interpreted it to be about the wait at a clinic or something completely different, and that data would be useless. And fix the question, while you're at it.

  • There was hardly any free response on the survey. Pretty Good Practice: On a phone interview, have plenty of follow-up questions and opportunities for customers to give more detail. Another great advantage of the interview format is you can do this. Especially on such a long survey, not taking advantage of the customer's willingness to go beyond the scripted questions is a terrible waste of everyone's time.

  • The interviewer sounded bored and monotone. Pretty Good Practice: Phone interviewers should sound engaged and try to establish a rapport with customers. I seriously did start to wonder if the interviewer was a recording, so every now and then I would toss in a question of my own to make sure I was still talking to a person. A shorter, better-designed script would have helped a lot. It also helps to hire interviewers who are naturally engaging. Who knows--this interviewer may actually sparkle most of the time; she's just been beaten into submission by the awful dullness of it all.

  • Use of "Which of these statements best fits..." type questions on a phone interview. Pretty Good Practice: Don't use these on phone interviews. Really. Just don't do it. Any time I see a question like this on an interview script, I know it was written by someone who never had to actually do an interview. These questions are suitable for written surveys, where the participant can read the choices, quickly re-read the most relevant ones, and doesn't have to remember all the options. In a phone interview, it takes a long time to read the choices, and the participant can only remember so much at once. If the customer has forgotten one of the choices, the interviewer will need to re-read the option, which takes even longer. Find some other way to get the data you're looking for.

  • Asking demographic questions on a customer service survey. Pretty Good Practice: Get the demographic data somewhere else, or do without. Some researchers love their demographic questions, but experience has shown that they rarely lead to insights on a customer service survey. Surveys for other purposes are a different matter, but if you're trying to learn about a specific customer experience the demographics of the customer are usually less important than other factors like who the customer spoke to, the reason for the call, etc. There are many reasons not to ask for demographics: it takes time (in what should be a short survey), customers often find the questions intrusive, and there's a decent chance you can get the same information elsewhere.

Vocalabs Newsletter #69 is Published

Issue 69 of Quality Times, Vocalabs' newsletter on customer service surveys, has been published. This is the Compliance Issue, with two articles on upcoming regulatory changes relevant to companies collecting customer feedback. One set of changes is the new FCC regulations about robocalling, which will be relevant to anyone performing outbound IVR surveys. The other is updates to HIPAA security and privacy rules, which will impact anyone collecting customer feedback in the healthcare arena.

As always, I hope you find this interesting and informative. You can subscribe to receive newsletters via e-mail using the form on the newsletter page.

Warranty service on an Epson projector

Over the weekend my home theater projector, an Epson 8350, died. It was about 18 months old, and didn't last nearly as long as I expected. But it was a good projector, so I wasn't too upset.

When I took the projector down, I noticed a sticker with a phone number for technical support. In this day and age, I've pretty much resigned myself to the fact that almost any electronic item more than a couple months old which breaks is probably just going to get thrown away and replaced.

But on a whim I called the support number on the projector. I reached a technician with flawless English (other than an odd habit of addressing me as Sir Peter) who asked a few troubleshooting questions and took my serial number.

Then to my astonishment he told me my projector is still under warranty, and he would FedEx a replacement unit. I could send the broken one back in the same shipping box, with shipping prepaid.

So today I have a new (refurbished) projector, and a new appreciation for Epson. Thank you for standing behind your products.

More than a metaphor?

"Brand terrorist" is a phrase you sometimes hear in the customer experience world. It describes a customer who dislikes a company and actively spreads bad word-of-mouth.

I've always felt this metaphor is way over-the-top. "Terrorist" is far too extreme a word to describe someone who merely says mean things about you. Maybe others feel the same, which would explain why it hasn't really caught on.

On the other hand....When someone smashes a $400,000 sports car with a sledgehammer in public in order to complain about bad service, "brand terrorist" starts to feel about right.

Spring!

In the Twin Cities, we finally just got to the end of a long, cold, drawn-out late winter/early spring. Two weeks ago, there was a foot of fresh snow on the ground just south of here.

Yesterday it topped 100 degrees in a few places.

Today (and the next several days) are supposed to be perfect: sunny, mid-70's, and low humidity. The kind of day that makes you wish the office building had a retractable roof.

Here's my Big Idea for the day. When the weather is really bad--around here that means well over a foot of snow--we get a "snow day." I'm told that in some other parts of the country they get "hurricane days," but we don't have those here. Having that impromptu vacation should be fun (just ask the kids), but the problem is that as adults we can't enjoy it.

So on the first really nice day of Spring here, we should cancel school and work. Everyone could go outside and enjoy the fresh air and sunshine, get a month's worth of vitamin D, and get that day off when you can really take advantage of it.

So whaddaya think?

FCC Rule Changes and Outbound IVR Surveys

The FCC has made some rule changes, going into effect in October 2013, which add new restrictions on the use of outbound IVR (aka "robocalling"). Some of these changes will affect outbound IVR surveys, so if you use this technique you should be aware of what's going on.

I'm not a lawyer, but I've been researching these changes. My summary of how this impacts outbound IVR surveys may be helpful to get the lay of the land, but if you think this may affect your business you should get legal advice from an actual lawyer.

The background is simply that consumers hate robocalls, and they have been complaining to the FCC. A lot. And when you annoy enough voters, politicians and bureaucrats tend to notice.

The old rules had some significant loopholes (aka "safe harbors") which made it fairly easy for companies to legally robocall consumers. The biggest loophole was the "established business relationship" exemption, which basically said that the rules didn't apply if there was an established business relationship. That is now gone. There is also a new, and very strict, definition of the kind of consent you need for "telemarketing" calls made using an autodialer or outbound IVR.

Under the new rules, you need:

  • Express written consent from the recipient before making a telemarketing call using a robocall or autodialer. Express Written Consent has a specific definition, and is a real hurdle: basically the consumer has to sign something in writing specifically authorizing robocalls, in a way which makes it clear that's what the consumer meant to do.
  • Express consent from the recipient before making any robocall or autodialed call to a mobile phone. Express Consent isn't specifically defined in the rules, but the implication (both in the rules and in the discussion in the FCC's report) seems to be that it's supposed to be just as unambiguous to the consumer as Express Written Consent, but you can be a little more flexible about how you get it--for example, by asking over the phone. But, as I interpret the rules, you can't bury something on page 19 of your Terms of Service which would give consent for the customer to be robocalled (both because it's not prominent enough, and also because you can't make Express Consent a requirement for selling the customer any product or service).

In addition, all robocalls must identify the company placing the call right at the beginning of the call (using the company's legal name) and provide a phone number during the call.

The implications for outbound IVR surveys, as I read the rules, are as follows:

  • An outbound IVR survey to a business or landline doesn't need prior consent from the person you're calling.
  • An outbound IVR survey to a mobile phone requires express consent, even if there's an established business relationship.
  • You can be flexible about how you get consent for an IVR survey, as long as it's clear to the consumer that they're agreeing to be robocalled and you keep a record. For example, any of these would be OK:
    • A recording of the customer service rep asking the customer, "Can we have your permission to call you for an automated survey after this call?"
    • A recording of your (inbound) IVR asking the customer, "We would like to call you after this call for an automated survey. Please press one if we have your permission," followed by the DTMF digit "one" from the customer.
    • A web form where the customer checked the box next to the statement, "You may call me with an automated customer survey."
    • A written contract where the customer initialed a box next to the statement, "You may call me with an automated customer survey," as long as the customer's consent was optional.
  • Any outbound IVR survey must begin with a statement like, "We would like you to take this automated survey from XYZ Incorporated," and at some point have a statement like, "You can reach XYZ Incorporated at 800-555-5555."
  • You need to be very careful that your survey doesn't include a marketing message and therefore become "telemarketing." What's a marketing message? I don't know. If you give the customer a coupon code for taking the survey, to you that might just be a way of boosting response. But to the FCC, distributing coupons might be "marketing." I won't be the one to litigate this question.

On the whole, these rule changes will not make outbound IVR surveys impossible for companies trying to do post-call customer surveys. But they do impose some significant headaches. No longer can you hide behind the "established business relationship" loophole. You need to get specific permission from the customer to robocall him or her. Just as importantly, you need to have a record of that permission, because if there's a dispute the burden of proof is on the company to show there was express consent.

The easiest thing to do is have the CSR ask for permission for the survey. Since many companies already record calls, that takes care of the record keeping problem. But this is bad survey practice, since it gives the agent a chance to manipulate the process.

Better is to have the (inbound) IVR offer the survey. But since almost no companies currently record IVR interactions or keep their log files for an extended period of time, it may be a significant burden to maintain the proof that the customer really did consent to the survey.

Another option is to ask customers for blanket permission to do outbound IVR surveys. To make this work, there will have to be a database entry (probably in the CRM system) showing which customers consented, and IVR calls can only be placed to customers who agreed. Depending on your infrastructure, this could be simple, or it could be very complex to make it work in real-time.
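
As a purely hypothetical sketch (the field names and CRM layout here are my own assumptions, not a reference to any specific system), the dial-out check might look something like this:

    def may_place_ivr_survey_call(customer):
        # Only dial if we hold express consent AND a record of how it was given,
        # since in a dispute the burden of proof is on the company.
        consent = customer.get("ivr_survey_consent")
        if not consent or not consent.get("granted"):
            return False
        return consent.get("evidence") in (
            "call_recording",   # CSR asked for permission on a recorded call
            "ivr_dtmf_log",     # customer pressed "one" in the inbound IVR
            "web_form",         # checked an unambiguous opt-in box online
            "signed_contract",  # initialed an optional consent box in writing
        )

    customer = {
        "phone": "+1-651-555-0123",
        "ivr_survey_consent": {"granted": True, "evidence": "ivr_dtmf_log"},
    }
    if may_place_ivr_survey_call(customer):
        pass  # hand the number to the outbound IVR dialer

The point is less the code than the record keeping: the consent flag and the evidence of how it was obtained both have to travel with the customer record before any automated dialing happens.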

The one thing which is clear (and again, please get legal advice!) is that the days of "anything goes" outbound IVR surveys are gone for good.

Customers Notice Improvement

About 18 months ago we implemented a rigorous customer feedback program for one of our clients. This is a support desk for an enterprise software company, so the client is dealing with frustrated customers who spend a lot of money.

Before our arrival, the client's feedback program was very primitive. They used SurveyMonkey to send surveys to customers after a trouble ticket was closed (if someone remembered), and had a poor response rate with feedback that wasn't very actionable.

We redesigned the process to use an immediate phone interview, integrated into the ticketing system on Salesforce.com. Our interviewers asked about several elements of the support process, with a lot of follow-up questions. We evolved the survey questions as we learned more about the customers' hot-button issues and helped the client narrow down on the root causes of customer complaints.

Just as important, we deliver the feedback directly to the support team in real-time, so they will see a customer's feedback in as little as a few minutes after closing the ticket. They can listen to the interview recording and analyze their statistics by product, problem type, and other data from Salesforce.

Based on the customer feedback the client made some process changes, and also used the data to hold one of its business partners accountable for some of the partner's problems which were showing up in the feedback.

We know all this is working, as the client's survey scores are improving (and their business is growing). But it was especially gratifying when one of the customers made a point of telling our interviewer yesterday that he noticed that the service has "improved a lot," the support engineer was "phenomenal," and that he has noticed a clear change over the past several incidents.

Customers do notice when things improve.

Getting rid of the toll-free number?

According to the LA Times, Spirit Airlines, the aggressively customer-unfriendly purveyor of cheap airfare and expensive extras, has gotten rid of its toll-free customer service numbers. Customers will now have to pay long-distance rates to Salt Lake City.

I'm not a fan of Spirit's pricing tactics. The company goes so far as to quote airfare and fuel separately, as though fuel was an optional extra and you could shop around for a better price. And according to Spirit's data, the company collects an average of $54 per flight in "optional" extras (like $35-$50 for a carry-on bag). Add $108 to a round-trip ticket, and Spirit doesn't seem like such a bargain anymore.

That said, there is some real logic to this. We no longer live in an era of $0.25/minute long distance, and many consumers get unlimited long distance bundled with their mobile phone or landline service. So why should big companies continue to pick up the tab?

On the other hand, Spirit probably pays next to nothing for long distance anyway, so why not? That's what makes me suspect this move by Spirit is more marketing artifice than actual cost savings. Like the "warehouse" grocery store which packs pallets to the ceiling to make you think you're getting a great deal, Spirit seems to go out of its way to create the illusion that its prices are lower than they actually are. Part of that illusion involves stripping out every possible customer convenience. There will always be some customers willing to do almost anything in the name of a deal, even if that deal turns out not to be such a big deal after all.
