The Customer Service Survey

Vocalabs' Blog


Extreme Reactions

Just how extreme of a reaction can you get from poor customer service?

Snopes just published a story about a bank which lost a million-dollar account and a 30-year customer because it refused to validate a $0.50 parking coupon.

The incident happened in the late 1980s (when a million bucks was real money) at the Old National Bank in Spokane, WA (now part of U.S. Bank). John Barrier was in the construction business and was dressed in shabby clothes when he went to the bank to cash a check.

Both the teller and the bank manager, apparently failing to recognize the importance of the customer, refused to validate Barrier's parking coupon to save him the $0.50 it cost to park at the bank. As a result, Barrier closed his account the next day and moved to a different bank down the street.

Why such a move? We can guess it probably wasn't the fifty cents per se. From the quotes John Barrier gave in a newspaper article at the time, it sounds like he felt the bank didn't give him the basic respect he deserved as a customer: speaking of the bank manager, Barrier said, "He looked me up and down and stood back and gave me one of those kinds of looks." He was also quoted as saying, "If you have $1 in a bank or $1 million, I think they owe you the courtesy of stamping your parking ticket."

The lesson from this is that customers will sometimes have an extreme reaction to a single poor experience. If a customer feels he's been disrespected, then it becomes about something much more basic than a financial transaction. It's about dignity, social standing, and respect.

Business-to-Business Customer Feedback

The overwhelming majority of customer service surveys happen in the business-to-consumer world.

Which, if you think about it, is a little strange given that the business case for an effective customer feedback program is much stronger in the business-to-business world. In B2B, the value of each customer is usually higher, the acquisition costs higher, and the length of the relationship longer. Reputation and word-of-mouth are just as important in B2B, but they're less likely to be visible through social media and other public channels.

We have a number of very successful B2B customer feedback programs in place, and our clients have found it to be a powerful tool. Many of the tools and techniques we use in B2C customer feedback are just as useful in B2B, such as real-time interviews, delivering data to the front lines in real-time, and a robust service recovery process.

There are a few things to be aware of, though, when implementing a B2B customer feedback program:

  • Reaching the right person can be more of a challenge than in B2C. It's common to have multiple contacts for each customer and a lot of out-of-date information.
  • B2B customers are very willing to talk to an interviewer for a few minutes--refusal rates are typically very low. However, you need to respect their time, and automated surveys are not likely to be successful.
  • B2B customers expect a high level of service, and if they have a complaint they expect it to be addressed. A proper service recovery process is not optional.

So don't listen to the people who think you can't do customer feedback in B2B--our clients know it isn't true.

How Not to Do a Phone Interview

I'm a big believer in using phone interviews to get real-time in-depth feedback on customer service. There's simply no better way to have an actual conversation with a customer, get him or her to open up, and find out how the experience was.

But there's a right way and a wrong way.

Over the weekend I was asked to do a survey about a phone call I made to a nurse line two weeks ago. My wife had a bicycle accident, and I needed help deciding whether she needed to go back to the doctor to look at a possibly-infected wound. She's doing much better now, thanks.

The interview was practically a case study in how not to do a customer service survey:

  • The interview took place almost two weeks after the original call. Pretty Good Practice: The faster you can reach the customer the better--calling back within a few minutes is best, though in this particular situation waiting several hours may be more appropriate. People's memories start to degrade very quickly, and after two weeks I remember very few details of my call other than the fact that it took place. That means that my answers are not going to be very detailed or specific, and I won't be able to give meaningful answers to a lot of questions.
  • At no time did the interviewer tell me how long the survey would take. Pretty Good Practice: Always inform customers how long the survey will take before asking if they'll participate. Seriously. They're doing you a favor, the least you can do is have a little respect.
  • The survey was 14 minutes long (according to the timer on my phone). Pretty Good Practice: Keep phone interviews under five minutes if at all possible. Fourteen minutes is way too long, and likely indicates the people designing the survey didn't have the discipline to cut questions which aren't useful or necessary. On the other hand, this explains why the interviewer didn't tell me how long it would take. Again, have some respect for customers and their time.
  • Many of the questions were slight variations of each other. Pretty Good Practice: Make sure different questions are really asking different things. This is where forcing yourself to write a five-minute interview script is a useful discipline. It quickly becomes obvious that you don't need to have a question about whether the nurse was "caring" and then another question about whether she was "helpful." Pick the two or three things which are most important and leave the rest off.
  • Some of the questions were confusing or didn't apply to the situation. The interviewer was not allowed to do anything but repeat the question. Pretty Good Practice: Adjust questions to make sure they make sense, and give the interviewer some latitude to help the customer understand the question. We all write bad survey questions from time to time, I get that, and you don't want interviewers going off the rails and re-interpreting the survey. But don't sacrifice accuracy on the altar of consistency, either. One of the great advantages of an interview over an automated survey is the interviewer can help a confused customer understand the intent of the question. Here's my slightly embellished recollection of how one of the questions on my interview went:

    Interviewer: How satisfied are you with the wait?
    Me: What do you mean by the wait? The wait to speak to a person?
    Interviewer: I'm only allowed to read the question, you can interpret it however you want.

    Seriously? Would it have killed them to let the interviewer tell me that the question was about how long it took for someone to answer my call? Knowing what I do about call centers and the likely purpose of the survey, I could infer that this was what the question was about. But someone else might have interpreted it to be about the wait at a clinic or something completely different, and that data would be useless. And fix the question, while you're at it.

  • There was hardly any free response on the survey. Pretty Good Practice: On a phone interview, have plenty of follow-up questions and opportunities for customers to give more detail. Another great advantage of the interview format is you can do this. Especially on such a long survey, not taking advantage of the customer's willingness to go beyond the scripted questions is a terrible waste of everyone's time.

  • The interviewer sounded bored and monotone. Pretty Good Practice: Phone interviewers should sound engaged and try to establish a rapport with customers. I seriously did start to wonder if the interviewer was a recording, so every now and then I would toss in a question of my own to make sure I was still talking to a person. A shorter, better-designed script would have helped a lot. It also helps to hire interviewers who are naturally engaging. Who knows--this interviewer may actually sparkle most of the time, she's just been beaten into submission by the awful dullness of it all.

  • Use of "Which of these statements best fits..." type questions on a phone interview. Pretty Good Practice: Don't use these on phone interviews. Really. Just don't do it. Any time I see a question like this on an interview script, I know it was written by someone who never had to actually do an interview. These questions are suitable for written surveys, where the participant can read the choices, quickly re-read the most relevant ones, and doesn't have to remember all the options. In a phone interview, it takes a long time to read the choices, and the participant can only remember so much at once. If the customer has forgotten one of the choices, the interviewer will need to re-read the option, which takes even longer. Find some other way to get the data you're looking for.

  • Asking demographic questions on a customer service survey. Pretty Good Practice: Get the demographic data somewhere else, or do without. Some researchers love their demographic questions, but experience has shown that they rarely lead to insights on a customer service survey. Surveys for other purposes are a different matter, but if you're trying to learn about a specific customer experience the demographics of the customer are usually less important than other factors like who the customer spoke to, the reason for the call, etc. There are many reasons not to ask for demographics: it takes time (in what should be a short survey), customers often find the questions intrusive, and there's a decent chance you can get the same information elsewhere.

Vocalabs Newsletter #69 is Published

Issue 69 of Quality Times, Vocalabs' newsletter on customer service surveys, has been published. This is the Compliance Issue, with two articles on upcoming regulatory changes relevant to companies collecting customer feedback. One set of changes is the new FCC regulations about robocalling, which will be relevant to anyone performing outbound IVR surveys. The other is updates to HIPAA security and privacy rules, which will impact anyone collecting customer feedback in the healthcare arena.

As always, I hope you find this interesting and informative. You can subscribe to receive newsletters via e-mail using the form on the newsletter page.

Warranty service on an Epson projector

Over the weekend my home theater projector, an Epson 8350, died. It was about 18 months old, and didn't last nearly as long as I expected. But it was a good projector, so I wasn't too upset.

When I took the projector down, I noticed a sticker with a phone number for technical support. In this day and age, I've pretty much resigned myself to the fact that almost any electronic item more than a couple months old which breaks is probably just going to get thrown away and replaced.

But on a whim I called the support number on the projector. I reached a technician with flawless English (other than an odd habit of addressing me as Sir Peter) who asked a few troubleshooting questions and took my serial number.

Then to my astonishment he told me my projector is still under warranty, and he would FedEx a replacement unit. I could send the broken one back in the same shipping box, with shipping prepaid.

So today I have a new (refurbished) projector, and a new appreciation for Epson. Thank you for standing behind your products.

More than a metaphor?

"Brand terrorist" is a phrase you sometimes hear in the customer experience world. It describes a customer who dislikes a company and actively spreads bad word-of-mouth.

I've always felt this metaphor is way over-the-top. "Terrorist" is far too extreme a word to describe someone who merely says mean things about you. Maybe others feel the same, which would explain why it hasn't really caught on.

On the other hand....When someone smashes a $400,000 sports car with a sledgehammer in public in order to complain about bad service, "brand terrorist" starts to feel about right.


In the Twin Cities, we finally just got to the end of a long, cold, drawn-out late winter/early spring. Two weeks ago, there was a foot of fresh snow on the ground just south of here.

Yesterday it topped 100 degrees in a few places.

Today (and the next several days) are supposed to be perfect: sunny, mid-70's, and low humidity. The kind of day that makes you wish the office building had a retractable roof.

Here's my Big Idea for the day. When the weather is really bad--around here that means well over a foot of snow--we get a "snow day." I'm told that in some other parts of the country they get "hurricane days," but we don't have those here. Having that impromptu vacation should be fun (just ask the kids), but the problem is that as adults we can't enjoy it.

So on the first really nice day of Spring here, we should cancel school and work. Everyone could go outside and enjoy the fresh air and sunshine, get a month's worth of vitamin D, and get that day off when you can really take advantage of it.

So whaddaya think?

FCC Rule Changes and Outbound IVR Surveys

The FCC has made some rule changes, effective October 2013, which add new restrictions on the use of outbound IVR (aka "robocalling"). Some of these changes will affect outbound IVR surveys, so if you use this technique you should be aware of what's going on.

I'm not a lawyer, but I've been researching these changes. My summary of how this impacts outbound IVR surveys may be helpful to get the lay of the land, but if you think this may affect your business you should get legal advice from an actual lawyer.

The background is simply that consumers hate robocalls, and they have been complaining to the FCC. A lot. And when you annoy enough voters, politicians and bureaucrats tend to notice.

The old rules had some significant loopholes (aka "safe harbors") which made it fairly easy for companies to legally robocall consumers. The biggest loophole was the "established business relationship" exemption, which basically said that the rules didn't apply if there was an established business relationship. That is now gone. There is also a new, and very strict, definition of the kind of consent you need for "telemarketing" calls made using an autodialer or outbound IVR.

Under the new rules, you need:

  • Express written consent from the recipient before a telemarketing call using a robocall or autodialer. Express Written Consent has a specific definition, and is a real hurdle: basically the consumer has to sign something in writing specifically authorizing robocalls, in a way which makes it clear that's what the consumer meant to do.
  • Express consent from the recipient before making any robocall or autodialed call to a mobile phone. Express Consent isn't specifically defined in the rules, but the implication (both in the rules and in the discussion in the FCC's report) seems to be that it's supposed to be just as unambiguous to the consumer as Express Written Consent, but you can be a little more flexible about how you get it--for example, by asking over the phone. But, as I interpret the rules, you can't bury something on page 19 of your Terms of Service which would give consent for the customer to be robocalled (both because it's not prominent enough, and also because you can't make Express Consent a requirement for selling the customer any product or service).

In addition, all robocalls must identify the company placing the call right at the beginning of the call (using the company's legal name) and provide a phone number during the call.

The implications for outbound IVR surveys, as I read the rules, are as follows:

  • An outbound IVR survey to a business or landline doesn't need prior consent from the person you're calling.
  • An outbound IVR survey to a mobile phone requires express consent, even if there's an established business relationship.
  • You can be flexible about how you get consent for an IVR survey, as long as it's clear to the consumer that they're agreeing to be robocalled and you keep a record. For example, any of these would be OK:
    • A recording of the customer service rep asking the customer, "Can we have your permission to call you for an automated survey after this call?"
    • A recording of your (inbound) IVR asking the customer, "We would like to call you after this call for an automated survey. Please press one if we have your permission," followed by the DTMF digit "one" from the customer.
    • A web form where the customer checked the box next to the statement, "You may call me with an automated customer survey."
    • A written contract where the customer initialed a box next to the statement, "You may call me with an automated customer survey," as long as the customer's consent was optional.
  • Any outbound IVR survey must begin with a statement like, "We would like you to take this automated survey from XYZ Incorporated," and at some point have a statement like, "You can reach XYZ Incorporated at 800-555-5555."
  • You need to be very careful that your survey doesn't include a marketing message and therefore become "telemarketing." What's a marketing message? I don't know. If you give the customer a coupon code for taking the survey, to you that might just be a way of boosting response. But to the FCC, distributing coupons might be "marketing." I won't be the one to litigate this question.

On the whole, these rule changes will not make outbound IVR surveys impossible for companies trying to do post-call customer surveys. But they do impose some significant headaches. No longer can you hide behind the "established business relationship" loophole. You need to get specific permission from the customer to robocall him or her. Just as importantly, you need to have a record of that permission, because if there's a dispute the burden of proof is on the company to show there was express consent.

The easiest thing to do is have the CSR ask for permission for the survey. Since many companies already record calls, that takes care of the record keeping problem. But this is bad survey practice, since it gives the agent a chance to manipulate the process.

Better is to have the (inbound) IVR offer the survey. But since almost no companies currently record IVR interactions or keep their log files for an extended period of time, it may be a significant burden to maintain the proof that the customer really did consent to the survey.

Another option is to ask customers for blanket permission to do outbound IVR surveys. To make this work, there will have to be a database entry (probably in the CRM system) showing which customers consented, and IVR calls can only be placed to customers who agreed. Depending on your infrastructure, this could be simple, or it could be very complex to make it work in real-time.
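The gating logic described above--only placing outbound IVR calls to customers with a recorded consent--can be sketched in a few lines. This is a hypothetical illustration, not a real CRM schema or legal advice: the `Customer` fields, the consent-source labels, and the dialer filter are all invented for the example.

```python
# Hypothetical sketch of gating an outbound IVR survey dialer on
# recorded express consent. All field names are illustrative.
from dataclasses import dataclass
from datetime import datetime
from typing import List, Optional


@dataclass
class Customer:
    phone: str
    is_mobile: bool
    consent_granted: bool = False
    consent_source: Optional[str] = None      # e.g. "ivr_dtmf", "csr_recording"
    consent_timestamp: Optional[datetime] = None


def may_robocall(c: Customer) -> bool:
    """Under my reading of the rules: a survey call to a landline needs no
    prior consent, but a call to a mobile number requires express consent,
    and you should be able to point to a record of how you got it."""
    if not c.is_mobile:
        return True
    return c.consent_granted and c.consent_source is not None


def build_call_list(customers: List[Customer]) -> List[str]:
    """Filter the outbound IVR queue down to numbers we may legally dial."""
    return [c.phone for c in customers if may_robocall(c)]
```

The key design point is that the consent *record* (source and timestamp) travels with the customer entry, so if there's ever a dispute the company can show when and how consent was captured.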

The one thing which is clear (and again, please get legal advice!) is that the days of "anything goes" outbound IVR surveys are gone for good.

Customers Notice Improvement

About 18 months ago we implemented a rigorous customer feedback program for one of our clients. This is a support desk for an enterprise software company, so the client is dealing with frustrated customers who spend a lot of money.

Before our arrival, the client's feedback program was very primitive. They used SurveyMonkey to send surveys to customers after a trouble ticket was closed (if someone remembered), and had a poor response rate with feedback that wasn't very actionable.

We redesigned the process to use an immediate phone interview, integrated into the client's ticketing system. Our interviewers asked about several elements of the support process, with a lot of follow-up questions. We evolved the survey questions as we learned more about the customers' hot-button issues and helped the client narrow down the root causes of customer complaints.

Just as important, we deliver the feedback directly to the support team in real-time, so they can see a customer's feedback within minutes of closing the ticket. They can listen to the interview recording and analyze their statistics by product, problem type, and other data from Salesforce.

Based on the customer feedback the client made some process changes, and also used the data to hold one of its business partners accountable for some of the partner's problems which were showing up in the feedback.

We know all this is working, as the client's survey scores are improving (and their business is growing). But it was especially gratifying when one of the customers made a point of telling our interviewer yesterday that he noticed that the service has "improved a lot," the support engineer was "phenomenal," and that he has noticed a clear change over the past several incidents.

Customers do notice when things improve.

Getting rid of the toll-free number?

According to the LA Times, Spirit Airlines, the aggressively customer-unfriendly purveyor of cheap airfare and expensive extras, has gotten rid of its toll-free customer service numbers. Customers will now have to pay long-distance rates to call Salt Lake City.

I'm not a fan of Spirit's pricing tactics. The company goes so far as to quote airfare and fuel separately, as though fuel was an optional extra and you could shop around for a better price. And according to Spirit's data, the company collects an average of $54 per flight in "optional" extras (like $35-$50 for a carry-on bag). Add $108 to a round-trip ticket, and Spirit doesn't seem like such a bargain anymore.

That said, there is some real logic to this. We no longer live in an era of $0.25/minute long distance, and many consumers get unlimited long distance bundled with their mobile phone or landline service. So why should big companies continue to pick up the tab?

On the other hand, Spirit probably pays next to nothing for long distance anyway, so why not? That's what makes me suspect this move by Spirit is more marketing artifice than actual cost savings. Like the "warehouse" grocery store which packs pallets to the ceiling to make you think you're getting a great deal, Spirit seems to go out of its way to create the illusion that its prices are lower than they actually are. Part of that illusion involves stripping out every possible customer convenience. There will always be some customers willing to do almost anything in the name of a deal, even if that deal turns out not to be such a big deal after all.

Newsletter #68 is published

Issue 68 of Vocalabs' newsletter, Quality Times, has been published and is going out to e-mail subscribers now.

In this issue we discuss our just-published white paper: Agile Customer Feedback: Pretty Good Practices. We also muse a little on the changing role of the CSR.

As always, I hope you find this interesting and informative.

Give Us Five Stars for Discounted Pizza

Consumerist has the latest example in a long string of companies bribing customers for good survey scores. A Pizza Hut franchise taped a note to a pizza box offering customers a $1 discount if they bring in proof that they took the survey, and noting that "Only 5's count."

[To be fair to the Pizza Hut--it's not clear that customers would only get the discount for giving the restaurant perfect scores. But the flier could certainly be interpreted that way, and I'm sure many customers did.]

This kind of thing will continue to happen as long as companies keep setting up incentives to improve survey scores without also having strong controls on the survey process. There are a zillion ways to manipulate customer surveys, from blatant (telling customers that "only 5's count") to subtle (selectively reminding the happy customers to take the survey).

Paying employees to deliver better survey scores is almost always going to give better surveys scores--but it won't necessarily give happier customers.

Being Virtual

Last night I got around a foot of snow at my home. Unusual for this late in the season, but not impossible for Minnesota. School started two hours late (what, you think we'd cancel school over a mere 12" of snow?), so I worked from home for part of the morning.

And that's about all that happened.

At Vocalabs, we don't have a physical call center for our phone interviews. Instead, we contract with interviewers all around the country. This lets us work with some of the best interviewers around and be extremely flexible when our clients' needs change.

Just as important, though, it means that days like today are pretty much non-events. Our clients count on us to be collecting customer feedback every day, so they can use it for training, setting goals, and managing their customer experience.

A day like today would have been a mess for any company relying on a bricks-and-mortar call center. People would show up late (or not at all), schedules would be out of whack, client projects would be set behind, and we'd likely be pulling out a big 3-ring binder labeled "Snow Day Contingency Plan."

Instead....ho-hum. As long as the data center is physically intact with power and connectivity, we can keep doing interviews. Since most of the staff can also work remotely, about the only difference is it takes longer to get the mail (postal that is).

I really have nothing more profound to offer, just an observation of how the world has changed in the last 10-15 years.

The Assembled Grill

Assembling a new gas grill can be a pain, and for many customers, it's a definite plus when the store will assemble it for you. Many stores now offer free assembly as a way to sell more gas grills and provide a better customer experience.

But free assembly isn't what every customer wants. The problem with gas grills in particular is that once assembled, the grill is a lot bigger than it was in the box. So some customers quite reasonably prefer to take the grill home unassembled and put it together themselves, because the assembled product won't fit in their car.

Unfortunately, that's not how some stores view things. As several Consumerist articles from 2013, 2012, and 2011 attest, shoppers at Sears, Home Depot, and Sears (again) have all encountered situations where they needed to buy a grill still in the box, but every single grill in the store's inventory had already been assembled. The store had no unassembled grills to sell.

In this situation, what was supposed to be a benefit to the customer can be transformed into a major headache. Worse, in the case of the Home Depot customer, the customer feels like the store is making a sneaky ploy to push truck rental.

These stories illustrate the pitfall of assuming that every customer wants the same thing. In each case, the store assumed that every customer would prefer the new grill fully assembled, and didn't anticipate that some customers would have different needs. They had what they thought was the right impulse--better customer service!--but managed to turn it into an expensive liability.

The lesson should be obvious: different customers have different needs, and the customer experience which delights one customer may be the same experience which annoys another. Your customer experience has to serve different populations of customers, pay attention to what each customer needs, and have the flexibility to adjust when customers require something unexpected.

Packing my bags for Mobile Voice

I'll be presenting at the Mobile Voice conference next week, as I have done for the past several years. If you're planning to attend (or just in the San Francisco area) please stop by and say Hi!

HIPAA Compliance and Customer Feedback

Vocalabs has several healthcare-related clients, so we are used to dealing with the privacy and security requirements of HIPAA. Some recent changes to the regulations will mean significant new requirements for what a company like us needs to do to remain HIPAA compliant after September 2013.

Since Vocalabs itself is not a healthcare company, we are not what's called a "covered entity" under the regulations. Rather, we are a "business associate," which is basically any company which a covered entity hires to perform some work which may require sharing protected health information.

Many non-healthcare companies hired by a covered entity would also be considered business associates--for example: accountants, IT services, lawyers, business process consultants, etc.

Under the old rules, a business associate had to sign a contract with the covered entity that basically promised to keep protected health information private and secure. Business associates had to maintain the same level of privacy and security as the covered entity, but did not have to go through the formal documentation and review process.

After September, though, business associates have to follow all the same security rules as a covered entity (at least insofar as they can reasonably be applied) and produce the same formal documentation and policies. What's more, to the extent that a business associate subcontracts to a third party which may also receive protected health information, that subcontractor also has to comply with all the policy and documentation requirements.

These new requirements can potentially be a big problem for some survey companies. At Vocalabs, our existing policies and processes are already consistent with HIPAA requirements, so for us it will be mostly a matter of documenting and formalizing what we already do. But at companies which aren't as security-minded, the HIPAA changes could require large investments in infrastructure, training, and compliance.

So how does all this apply to Customer Feedback?

Keep in mind that the HIPAA rules only apply to "protected health information," which has a very specific legal definition. It's basically health- and care-related information created by a healthcare company (doctor, hospital, insurance company, etc.) which can be tied to a specific, identifiable patient. Customer feedback is not, by itself, protected health information.

But sometimes we need to have protected health information in order to gather useful feedback. For example, we need to know the patient's phone number to call him or her, and that phone number combined with the fact that there had been a hospital visit could arguably qualify as "protected health information." So to be on the safe side, we will treat it as PHI. For analysis purposes, we may also want to know the doctor's name, hospital, or other details which can really help understand how to improve the patient's experience but which clearly need to be protected.

So between now and September we will be updating our security and privacy policies, revising contracts, and doing everything we need to do to remain fully HIPAA compliant under the new rules. And anyone else collecting customer feedback around healthcare will need to do the same.

Some. Passengers. We. Just. Can't. Move.

Airlines have inflicted so many annoyances on their customers--intentionally or through incompetence--that it's almost refreshing to read about a situation which wasn't actually caused by the airline.

The story is that a passenger flying business-class internationally with his wife on United Airlines did everything right to make sure he got the seats he wanted: he booked well in advance, got seats together, checked in early, and had super-elite status on United.

As they were boarding, though, the wife was pulled aside and given a new seat assignment so the couple would no longer be seated together. Naturally annoyed, the customer asked for an explanation and was only told that "there are some passengers we can't move."

The flight attendants also seemed confused by the situation, but wouldn't provide any explanation other than "Some passengers we just can't move."

Eventually the passenger took the hint and figured out that (spoiler alert!) the wife's seat had been claimed by an air marshal, and of course the crew isn't supposed to reveal the presence of an air marshal on board. And the couple did manage to arrange a seat swap so they could sit together.

But even though the airline didn't create the situation and wasn't allowed to explain it, it seems that pulling the wife out of the boarding line was about the worst possible way to handle this. At a minimum the gate agent could have paged them before boarding and--without explaining the reason--told them that it was necessary to reseat the wife. Even better, the gate agent could have made some effort to arrange it so they could still sit together.

The lesson, I think, is that even when a company is placed in an unusual and difficult position, there is still a choice about how you want to treat your customers.

We Can Record You, but You Can't Record Us

Via Techdirt today, a mildly amusing recording of a customer taunting a Time Warner Cable CSR by saying he's recording the call.

In the recording the customer begins by telling the CSR that he's recording. The CSR, no doubt following TWC's written policies, says he doesn't consent to the recording. The customer asks how that can be, given that TWC is itself recording the call.

Unfortunately the CSR is caught in the middle--as everyone (except maybe TWC's lawyers) understands, the policy is absurd. But the CSR isn't allowed to deviate, and can't think of a rational reason why the customer shouldn't record the call, and there you go.

What this really points out, though, is the sheer nuttiness of these "We will record you, but not give permission for you to record us" policies. Anecdotally, I know that many large companies have these policies. My guess is that the underlying reason, more than anything else, is a vague discomfort with the general idea of being recorded without permission (dressed up in language about "respecting the CSR's privacy" and/or "protecting us from liability").

But let's consider just how many different kinds of crazy this policy is:

  1. It treats the customer as implicitly untrustworthy, and not deserving of the same rights the company claims for itself.
  2. Withholding consent probably has no legal effect. Most states permit people to record phone calls without the consent of the other party; and even in states which require consent of both parties, the company has arguably consented to recording by collecting its own recording.
  3. It makes it seem that the company has something to hide.
  4. The only real downside to allowing the customer to record the call is that the company's incompetence (or even misconduct) might be exposed. See #3 above.
  5. It implicitly assumes that companies have a greater right to privacy than consumers. Most people assume the opposite should be true.

So what should a CSR do when a customer says the call is being recorded?

How about this: "Very good, and thanks for letting me know. How can I help you today?"

Vocalabs Newsletter #67 Published

Issue 67 of Vocalabs' newsletter, Quality Times, has been published. In this issue we discuss a couple of topics around closed-loop customer feedback: how the survey itself needs to be part of the closed loop, and how the full process is more important than the metrics.

As always, I hope you find this interesting and informative. E-mail subscribers should be receiving their copies shortly.

Making Surveys Predictive

There's a simple but powerful technique I think should be part of every customer survey to make it much more valuable for business decisions: every customer survey should be linked to a record of the customer's buying behavior.

Most companies already have this data available, and some companies are making a significant effort on "big data" analysis projects to try to tease out what it all means.

Taking the small extra step of including this data in the customer survey report makes use of the fact that, if you have a customer survey, your customers are already telling you how they feel about you. In many ways that's a lot easier than hunting for subtle statistical clues in a tsunami of behavioral data.

For example, one of our clients found that, compared to "Very Satisfied" customers, customers who were "Somewhat Satisfied" or worse with a customer service call were about 4x more likely to take their business elsewhere within the next six months.

That's not a small difference. Those customers are telling you directly that they are not loyal. Chances are, if you dig even a little you will find that they also told you (directly, in response to your survey question) why.
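A comparison like the one above falls out directly once survey responses are joined to behavior records. A minimal sketch, with invented customer IDs and field names, of computing the churn rate for each satisfaction level:

```python
# Hypothetical sketch: churn rate by survey answer.
# Records pair a survey response with whether that customer
# left within the following six months. Data is invented.

from collections import defaultdict

# (customer_id, satisfaction, churned_within_6_months)
survey_with_behavior = [
    ("c1", "Very Satisfied", False),
    ("c2", "Very Satisfied", False),
    ("c3", "Very Satisfied", True),
    ("c4", "Somewhat Satisfied", True),
    ("c5", "Somewhat Satisfied", False),
    ("c6", "Dissatisfied", True),
]

def churn_rate_by_satisfaction(records):
    """Group customers by survey answer and compute the share who churned."""
    totals = defaultdict(int)
    churned = defaultdict(int)
    for _, satisfaction, did_churn in records:
        totals[satisfaction] += 1
        if did_churn:
            churned[satisfaction] += 1
    return {answer: churned[answer] / totals[answer] for answer in totals}

rates = churn_rate_by_satisfaction(survey_with_behavior)
```

Dividing the churn rate of the less-satisfied groups by that of the "Very Satisfied" group gives the kind of "4x more likely to leave" multiplier described above.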

Are you listening?

Closing the Loop in Closed Loop Customer Feedback

Closed Loop is a popular buzzword for customer feedback programs today. It means implementing a formal process in which customer feedback drives change in the company, and the results of those changes show up in future feedback.

This kind of cycle is a powerful tool--in fact, I would argue that it's pretty much the only way to build an effective and useful customer feedback process. After all, if the customer feedback isn't being used to drive change in the company, then what's it good for?

But most advocates of a closed loop process aren't really closing the loop. That's because most supposedly closed loop customer feedback processes don't formally consider the customer survey as part of the "loop" to be "closed." In other words, the company never reconsiders whether the survey is asking the right questions of the right people. It is just assumed that the survey should keep doing the same thing no matter what.

In fact, some of the proponents of closed loop processes actually advocate for very rigid and inflexible customer surveys: specific questions, with very particular sampling methods, applied in exactly the same way for every company.

That kind of rigidity (proponents would call it "consistency") is important for cross-company and cross-industry benchmarking, but it's not very helpful when trying to improve a particular customer experience at a particular company. It even sounds a little crazy to suggest that the customer survey which will yield useful, actionable feedback from a customer logging on to a bank website is the exact same customer survey which gives useful, actionable feedback from a customer buying a used car.

Instead, the customer survey itself needs to be part of the closed loop process. As companies examine their feedback they should constantly be asking questions like:

  • Are we asking the right questions?
  • Are we targeting the right customers for the survey?
  • Is this survey yielding useful data?
  • Are there issues we should be exploring in more depth on the survey?
  • Are the right people getting the right data from the survey to make the best use of it?

The questions to ask, customers to talk to, and how to use the survey data will all change constantly as the company and its business environment evolve. The survey which works today isn't likely to be optimal a year from now, and five years from now it may seem close to irrelevant.

So when you build your closed loop customer feedback process, remember that the customer feedback also needs to change with the company.

Can't Buy Me Like

Bob Garfield, NPR media guru, has a new book out called "Can't Buy Me Like." I haven't read it, but my guess is it's worth $12.99 just for the title alone.

Rather than plug the book, though, I'm going to plug the 25-minute segment he did yesterday for Talk of the Nation. It doesn't break any new ground (at least for those of us who have been around long enough to remember the Cluetrain Manifesto), but he makes an excellent argument for why companies need to focus more on providing genuine customer experiences and less on mass media advertising.

Plus, you can listen to the interview for free.

Metrics are Less Important than Process

I see a lot of people spend a lot of time and effort making sure they focus on the "right" metric in their customer feedback program. Net Promoter? Customer Satisfaction? Customer Effort? Something else? All of the above?

There's often a lot less thought going into what to do with the survey data: who should get it, how often, how will it be used, how will people be coached and compensated, how will you follow up with customers, and so forth.

This is a big mistake, since most of the value of a successful customer feedback program comes from all the things that happen after the data is collected. In my not-very-scientific estimation, I would say that 75% of the value is in the process, with only 25% from the metrics.

I could be wrong, though: it wouldn't surprise me if as much as 90% of the value is in the process.

Even a mediocre metric is going to allow a lot of improvement in a well-designed closed-loop feedback process. But no metric, no matter how good, is going to drive change if there's no follow-through.

So why so much attention to the metric, and not everything else?

I think it's because deciding what metrics to include on a customer survey feels important but doesn't require much hard work. Everyone can advocate for their favorite survey questions, there can be a lively discussion, and an executive can weigh in to break the tie. You get to feel like you had a productive day.

But getting everything else right takes a lot more time, effort, and attention. A successful customer feedback program needs to be actively managed, it requires ongoing executive attention and support, and it requires constant tweaking to adjust to the changing dynamics of the business. You can't just throw money at it.

Many people assume the scarcest resource at a company is money. That's often true in small companies, but most large companies have plenty of money to invest in the things they think are most important. In big companies, often the hardest resource to obtain is attention--especially executive attention.

So it's easier to pretend that the customer feedback program doesn't need any ongoing attention, that you can make the big decisions once and be done with them.

But that's not a recipe for success. Success requires focusing on the rest of the customer feedback process, and making it an ongoing priority. That's hard. And that's why people spend so much time worrying about the metric.

Rethinking the CSR

Traditionally, the job of the Customer Service Representative (CSR) was to take requests and orders from customers and generally handle transactions as cost-effectively as possible.

Today, most routine transactions are handled through self-service. Most customers prefer to use a company's website instead of calling on the phone, especially for simple tasks. Increasingly, the CSR handles complicated transactions, situations where self-service didn't work, and cases where the customer needs to be doubly certain that his problem will be taken care of.

What will the job of the CSR be like in the future, when nearly all customers take care of their business online? CSRs will be left with nothing but the more complex and high-stakes problems. I think this will lead to the job becoming less the traditional CSR, and more like a Customer Advocate.

The difference is that where the CSR represents the company to the customer, the role of a Customer Advocate is to represent the customer within the company. For example:

Types of Interactions

  • Customer Service Representative: Most transactions are routine, and this is the customer's first attempt to solve the problem.
  • Customer Advocate: Most transactions are exceptions to the normal process, and the customer has already tried other ways to solve the problem.

When Multiple Calls Are Required

  • Customer Service Representative: Different CSRs will handle the customer's multiple calls. Each CSR will have to take time to become familiar with the customer's case.
  • Customer Advocate: The same Customer Advocate will keep working with the customer until the problem is solved (or it becomes clear no resolution is possible).

Measuring Performance

  • Customer Service Representative: Measured on how efficiently the CSR can handle a large volume of transactions.
  • Customer Advocate: Measured on how effectively the Customer Advocate finds solutions which are acceptable to both the customer and the company.

When a Customer Wants Special Treatment

  • Customer Service Representative: Will generally enforce company policy, but may be empowered to make limited exceptions.
  • Customer Advocate: Will explain the policy to the customer, and help argue the customer's case for an exception. Authority to enforce policy and make exceptions resides elsewhere in the organization.

General Role

  • Customer Service Representative: Interfaces between the customer and the company's internal business processes (order entry, billing, etc.).
  • Customer Advocate: Interfaces between the customer and the company's internal structure and decision-making processes (management, other organizational silos, etc.).

It's still going to be a long time before the call center's job is primarily handling these more complex interactions, but it is starting to happen today. I'm seeing more companies moving away from efficiency-based metrics (like calls per hour) and towards outcome-based metrics (resolution, satisfaction, and related metrics). I'm also seeing more companies questioning the assumption that most calls are coming from customers who haven't attempted self-service.

So the role of the CSR is clearly shifting, whether the job description is or not.

Our Valued Customers

Never publicly acknowledged, it can be found in many, if not most, call centers. Passed furtively from employee to employee over the years, it might get dragged out when the office party gets really wild.

Yes, I'm talking about the secret recording of the "best" customer calls. And by "best," of course, I really mean "entertainingly worst."

Our Valued Customers takes it a step further. Tim Chamberlain works at a comic book store, and he takes it upon himself to illustrate and save for posterity many of the weird and wacky things he hears customers saying.

Just one more reason to always be polite, just like your mother taught you.
