The Customer Service Survey

Vocalabs' Blog

Survey Fail

This picture (which comes to me from Failblog) nicely encapsulates two of the key challenges in collecting customer feedback:

First, you need to make sure your customers can actually take the survey. Many survey methodologies systematically exclude some segment of your customer base. Unless you account for that, you risk overlooking something important.

Second, you need to make sure the survey itself doesn't bias the responses, either through poor questions or because the process (as in this case) annoys your customers.

For what it's worth, I had my own problems with an AT&T text message survey last year.

Related Resources

>> This Three Question Survey Takes Three Hours (and really has four questions)

>> Process Bias

>> Agile Customer Feedback

Easy Ideas for Getting More from your Customer Service Survey

A few weeks ago we released the Customer Service Survey Maturity Model, and along with it, an online self-assessment tool to find out where your feedback program fits into the maturity spectrum.

Maybe I should have expected this, but I was surprised to see several people use the self-assessment tool as a way to generate some ideas for improving their survey programs. In the self-assessment we ask about a few dozen different "pretty good practices," and that got some mental gears turning.

The great thing about using the tool this way is that you can take the self-assessment as many times as you want. So you can preview what kind of feedback program you would have if you adopted certain practices.

So I want to encourage everyone to give this a try. It's quick, easy, and free, and you'll probably come out of it with some things you can be doing to get more from your customer feedback program.

>> Take the Customer Service Survey Maturity Self-Assessment

Three Ideas to Get Started with Customer Service Feedback

I often get asked how to get started with a customer feedback program, when the organization doesn't have any surveys or other feedback processes around the customer experience. Our Customer Service Survey Maturity Model is a useful tool for understanding what's possible and how to get the most out of a feedback program, but most companies don't want to dive right into a maturity level four program on day one. That's a lot of commitment (and culture change), and it usually makes sense to start smaller and work up.

So what's the best way to get started?

I see a lot of companies which decide they want to start a voice of the customer program, but don't give much consideration as to why or what they hope to accomplish.

In keeping with the principles of Agile Customer Feedback, even a "starter program" should address current business needs, respect and listen to customers, and be designed to tell you something you don't already know.

There are lots of ways to get your feet wet in customer feedback. Here are three ways you can start collecting customer feedback in a customer service environment which are likely to be easy and show a quick return on your investment of money and effort.

Idea 1: Start asking a "Question of the Week" on customer service calls.

Customer feedback can take many forms, and it doesn't have to be a formal survey program. You can start by having customer service reps ask a "question of the week" at the end of calls and make a note of the answer. For example:

  • "One last thing before we hang up. We're looking for ways to improve our customer service. Do you have any suggestions?"
  • "Before we go, we're looking for ideas to improve our website. Is there anything you wish our website did better?"
  • "One quick question. We're trying to improve the automated part of our calls. Did you have any trouble getting through to a person?"

This is not going to be a scientific survey with anything like statistically valid data. But it will do two things: give you some ideas of what your customers would like to see improved, and get your employees into the habit of listening to customer feedback.

Tips:

  1. Don't have the customer service reps ask for feedback on themselves. You won't get honest responses, and it will be very awkward.
  2. Don't expect a random sample, since CSRs won't want to ask for feedback from the unhappy customers. This is an idea-generating exercise, not a science experiment.
  3. As a team activity, have the CSRs talk about the feedback they got and what they think it means. This helps build the idea that everyone should be listening to the customer.

Idea 2: Do a survey of your callers who don't talk to a person.

In an earlier article I talked about how the IVR is a big blind spot for many companies' customer feedback programs. So doing a survey of callers who stayed in the IVR is almost guaranteed to tell you something you don't know. These should be follow-up surveys (preferably phone interviews) conducted as soon as possible after the customer hangs up (but not on the same call, since that biases the sample). Getting 500-1,000 surveys will give you a solid statistical sample and a good understanding of your different caller populations, but even 100 surveys will be enough to generate some new ideas for improving the customer experience.
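
For a rough sense of why those sample sizes matter, here's a back-of-the-envelope sketch (mine, not part of any survey tool) of the worst-case margin of error for a simple yes/no question at a 95% confidence level--roughly plus-or-minus 10% at 100 responses, 4% at 500, and 3% at 1,000. Real surveys have other sources of error (non-response bias, for one), so treat these as rough guides:

    import math

    def margin_of_error(n, p=0.5, z=1.96):
        """Worst-case 95% margin of error for a proportion estimated from n responses."""
        return z * math.sqrt(p * (1 - p) / n)

    for n in (100, 500, 1000):
        print(f"n={n}: +/- {margin_of_error(n):.1%}")
    # n=100: +/- 9.8%
    # n=500: +/- 4.4%
    # n=1000: +/- 3.1%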

Some things to ask about are:

  • Did the caller actually self-serve in the IVR, or hang up in frustration?
  • What tasks should callers be able to do in the IVR but often struggle with? What tasks do callers want to do in the IVR but aren't supported?
  • What are the barriers to either self-service (when appropriate) or getting to the right person (when self-service isn't an option)?

Many companies have the general idea that their IVR systems don't work well for customers, based on the complaints and other feedback they get. This one-time survey will quantify that, help identify which customers are having what sorts of problems, and point out ways to make it work better. Often small changes, like updating the options in a menu, can have a large effect.

Idea 3: Add some feedback to your training program.

Hearing the voice of the customer can be a powerful training tool, and the most effective way to deliver this is through the literal voice of the customer--that is, the recording of a follow-up interview with the customer.

Many contact centers already do training sessions where supervisors listen to calls with the customer service reps and offer feedback. It's easy to add customer feedback to this process. You will want to call a small sample of customers back right after their call, and ask for general feedback and suggestions. Play the recording of this interview for the CSR when reviewing the same call for training purposes.

Some tips for implementing this:

  1. You can start very small and informal, and scale it up as appropriate. To begin with it may be as informal as having a supervisor call the customer back and ask, "I'm going to be training Alice in a few minutes, do you have any suggestions?" This can grow into a program with specific questions, defined metrics, and a statistical sample.
  2. Tempting as it may be, don't make the CSR part of the call back to the customer. This will just be awkward. Don't try to get feedback during the customer service call, since you'll only get the happy customers. Call back.
  3. Have the CSR listen to the interview recording first, and then listen to the original call. That puts the agent in the shoes of the customer, and makes him or her more sensitive to how the customer viewed the call.

Pretty Good Practice: Survey About the IVR

The overwhelming majority of contact center surveys completely ignore a huge and important group of customers: those customers who never spoke to an agent.

Now, there are good reasons for focusing surveys on agent-handled calls. Often, a key purpose of the survey is measuring and training the agents, and the agent experience is likely to be less consistent than the IVR. And some contact center surveys do include a question or two asking about the automated part of the call.

But what about those customers who:

  • Never needed to speak to an agent because they could self-serve in the IVR
  • Wanted to get to an agent, but couldn't figure out how
  • Gave up waiting to speak to a person
  • Had the IVR hang up on them (yes it happens, and more often than you think)
  • Would have been willing to use the IVR, but found it confusing

Those people are your customers, too, and in many companies there are more of them than the people who actually spoke to a customer service representative.

Chances are, the experience of those customers who never left the IVR is having more of an impact on customer satisfaction and the operational cost of the call center than the experience of the customers who spoke to a person.

And most companies are ignoring these customers completely in their survey process.

This is a huge blind spot, and an enormous opportunity to improve the customer experience and save money.

Over Scripting

According to a report from Consumerist, some Best Buy customer service reps are not happy that they are being given new scripts and aren't being allowed to deviate.

Over-scripting is a problem in the customer service business, since there's a tendency for management to think they can anticipate every customer inquiry and write the perfect answer. Not true, and my observation is that most contact centers have been moving to give more flexibility to CSRs, not less.

Over-scripting causes a lot of problems:

  • It makes it much harder to establish a rapport with the customer and understand their problems.
  • It's usually obvious that the CSR is reading from a script, and customers hate it.
  • CSRs don't like being micro-managed about every word which comes out of their mouths.
  • It tends to lead the company to focus more on compliance and less on service.

Over-scripting is a really easy trap to fall into in the survey business. When performing a structured interview it really is important to be completely consistent about every word in the question. Even small word choices can shade the meaning of a question and bias the results.

As a result, many companies which do phone interviews create extremely rigid scripts. I've seen instances where the interviewer is literally not allowed to deviate even one word during the course of the entire survey--even if the participant has questions or problems (responses to the participants' questions are also scripted, of course).

That's wrong. The whole point of doing an interview (as opposed to an automated survey) is to establish a rapport with the customer, go in-depth, and leave the customer feeling that you actually listened. Over-scripting makes it impossible to do these things.

On the other hand, we really do need to have consistent questions in order to get consistent and useful data. It's not helpful to go to the other extreme--like one phone survey I encountered that consisted of a single open-ended question: "How was your experience?" That format might be useful for getting some high-level qualitative feedback for training, but it's not going to yield meaningful statistics for decision-making or tracking.

The key is to strike a balance between being consistent on the survey and letting the interviewer be human. We clearly define which parts of the script need to be read exactly as written, and where an interviewer can go off-script as needed. We recognize that some questions don't need to be strictly scripted (for example, when we want to know the reason behind some answer), and we allow the interviewer to acknowledge that the survey questions are scripted (which the customer knows anyway).

This allows us to have a more meaningful dialogue with customers and leave them feeling that we really listened to them, while still getting consistent metrics for tracking and analysis by our clients.

Issue 70 of Vocalabs' Newsletter

Issue 70 of Vocalabs' newsletter, Quality Times, has been published. E-mail subscribers should be receiving it shortly.

In this issue we announce the Customer Service Survey Maturity Model, as well as an online Maturity Self-Assessment Tool. This will help you understand how mature your ongoing customer service survey efforts are, in the context of the principles of Agile Customer Feedback. The Self-Assessment tool will also offer some suggestions for practices you can adopt to get more useful, actionable customer feedback.

We also discuss collecting customer feedback in a business-to-business context--it's not as different as many people assume.

As always I hope you find this useful and informative. I welcome any comments and suggestions.

The Customer Service Survey Maturity Model

Based on our years of experience with different companies and helping them build effective customer feedback programs, we have developed the Customer Service Survey Maturity Model. Today we published a white paper outlining the model, as well as a simple online self-assessment tool which anyone can use to gain an understanding of where their customer feedback practices stand, using the principles of Agile Customer Feedback as a guide.

The maturity model and the self-assessment tool are a quick way to get a sense of what's possible and what changes to your customer survey process are likely to yield the most bang for your buck. The model evaluates the three functional areas of any customer service survey process: Data Collection, Reporting, and Process. In each area, the company's maturity is evaluated based on which practices the company uses.

Since Data Collection, Reporting, and Process all support each other, the most fertile ground for improvement is usually where the company is least mature.

We're hoping this is a useful tool for our clients and the community in general. I welcome any comments or feedback.

Extreme Reactions

Just how extreme of a reaction can you get from poor customer service?

Snopes just published a story about a bank which lost a million-dollar account and 30-year customer because it refused to validate a $0.50 parking coupon.

The incident happened in the late 1980's (when a million bucks was real money) at the Old National Bank in Spokane, WA (now part of U.S. Bank). John Barrier was in the construction business and was dressed in shabby clothes when he went to the bank to cash a check.

Both the teller and the bank manager, apparently failing to recognize the importance of the customer, refused to validate Barrier's parking coupon to save him the $0.50 it cost to park at the bank. As a result, Barrier closed his account the next day and moved to a different bank down the street.

Why such a move? We can guess it probably wasn't the fifty cents per se. From the quotes John Barrier gave in a newspaper article at the time, it sounds like he felt the bank didn't give him the basic respect he deserved as a customer: speaking of the bank manager, Barrier said, "He looked me up and down and stood back and gave me one of those kinds of looks." He was also quoted as saying, "If you have $1 in a bank or $1 million, I think they owe you the courtesy of stamping your parking ticket."

The lesson from this is that customers will sometimes have an extreme reaction to a single poor experience. If a customer feels he's been disrespected, then it becomes about something much more basic than a financial transaction. It's about dignity, social standing, and respect.

Business-to-Business Customer Feedback

The overwhelming majority of customer service surveys happen in the business-to-consumer world.

Which, if you think about it, is a little strange given that the business case for an effective customer feedback program is much stronger in the business-to-business world. In B2B, the value of each customer is usually higher, the acquisition costs higher, and the length of the relationship longer. Reputation and word-of-mouth are just as important in B2B, but they're less likely to be visible through social media and other public channels.

We have a number of very successful B2B customer feedback programs in place, and our clients have found it to be a powerful tool. Many of the tools and techniques we use in B2C customer feedback are just as useful in B2B, such as real-time interviews, delivering data to the front lines in real-time, and a robust service recovery process.

There are a few things to be aware of, though, when implementing a B2B customer feedback program:

  • Reaching the right person can be more of a challenge than in B2C. It's common to have multiple contacts for each customer and a lot of out-of-date information.
  • B2B customers are very willing to talk to an interviewer for a few minutes--refusal rates are typically very low. However, you need to respect their time, and automated surveys are not likely to be successful.
  • B2B customers expect a high level of service, and if they have a complaint they expect it to be addressed. A proper service recovery process is not optional.

So don't listen to the people who think you can't do customer feedback in B2B--our clients know it isn't true.

How Not to Do a Phone Interview

I'm a big believer in using phone interviews to get real-time in-depth feedback on customer service. There's simply no better way to have an actual conversation with a customer, get him or her to open up, and find out how the experience was.

But there's a right way and a wrong way.

Over the weekend I was asked to do a survey about a phone call I made to a nurse line a couple weeks ago. My wife had a bicycle accident, and I needed help deciding whether she needed to go back to the doctor to look at a possibly-infected wound. She's doing much better now, thanks.

The interview was practically a case study in how not to do a customer service survey:

  • The interview took place almost two weeks after the original call. Pretty Good Practice: The faster you can reach the customer the better--calling back within a few minutes is best, though in this particular situation waiting several hours may be more appropriate. People's memories start to degrade very quickly, and after two weeks I remember very few details of my call other than the fact that it took place. That means that my answers are not going to be very detailed or specific, and I won't be able to give meaningful answers to a lot of questions.
  • At no time did the interviewer tell me how long the survey would take. Pretty Good Practice: Always inform customers how long the survey will take before asking if they'll participate. Seriously. They're doing you a favor, the least you can do is have a little respect.
  • The survey was 14 minutes long (according to the timer on my phone). Pretty Good Practice: Keep phone interviews under five minutes if at all possible. Fourteen minutes is way too long, and likely indicates the people designing the survey didn't have the discipline to cut questions which aren't useful or necessary. On the other hand, this explains why the interviewer didn't tell me how long it would take. Again, have some respect for customers and their time.
  • Many of the questions were slight variations of each other. Pretty Good Practice: Make sure different questions are really asking different things. This is where forcing yourself to write a five-minute interview script is a useful discipline. It quickly becomes obvious that you don't need to have a question about whether the nurse was "caring" and then another question about whether she was "helpful." Pick the two or three things which are most important and leave the rest off.
  • Some of the questions were confusing or didn't apply to the situation. The interviewer was not allowed to do anything but repeat the question. Pretty Good Practice: Adjust questions to make sure they make sense, and give the interviewer some latitude to help the customer understand the question. We all write bad survey questions from time to time, I get that, and you don't want interviewers going off the rails and re-interpreting the survey. But don't sacrifice accuracy on the altar of consistency, either. One of the great advantages of an interview over an automated survey is the interviewer can help a confused customer understand the intent of the question. Here's my slightly embellished recollection of how one of the questions on my interview went:

    Interviewer: How satisfied are you with the wait?
    Me: What do you mean by the wait? The wait to speak to a person?
    Interviewer: I'm only allowed to read the question, you can interpret it however you want.

    Seriously? Would it have killed them to let the interviewer tell me that the question was about how long it took for someone to answer my call? Knowing what I do about call centers and the likely purpose of the survey, I could infer that this was what the question was about. But someone else might have interpreted it to be about the wait at a clinic or something completely different, and that data would be useless. And fix the question, while you're at it.

  • There was hardly any free response on the survey. Pretty Good Practice: On a phone interview, have plenty of follow-up questions and opportunities for customers to give more detail. Another great advantage of the interview format is you can do this. Especially on such a long survey, not taking advantage of the customer's willingness to go beyond the scripted questions is a terrible waste of everyone's time.

  • The interviewer sounded bored and monotone. Pretty Good Practice: Phone interviewers should sound engaged and try to establish a rapport with customers. I seriously did start to wonder if the interviewer was a recording, so every now and then I would toss in a question of my own to make sure I was still talking to a person. A shorter, better-designed script would have helped a lot. It also helps to hire interviewers who are naturally engaging. Who knows--this interviewer may actually sparkle most of the time; she's just been beaten into submission by the awful dullness of it all.

  • Use of "Which of these statements best fits..." type questions on a phone interview. Pretty Good Practice: Don't use these on phone interviews. Really. Just don't do it. Any time I see a question like this on an interview script, I know it was written by someone who never had to actually do an interview. These questions are suitable for written surveys, where the participant can read the choices, quickly re-read the most relevant ones, and doesn't have to remember all the options. In a phone interview, it takes a long time to read the choices, and the participant can only remember so much at once. If the customer has forgotten one of the choices, the interviewer will need to re-read the option, which takes even longer. Find some other way to get the data you're looking for.

  • Asking demographic questions on a customer service survey. Pretty Good Practice: Get the demographic data somewhere else, or do without. Some researchers love their demographic questions, but experience has shown that they rarely lead to insights on a customer service survey. Surveys for other purposes are a different matter, but if you're trying to learn about a specific customer experience the demographics of the customer are usually less important than other factors like who the customer spoke to, the reason for the call, etc. There are many reasons not to ask for demographics: it takes time (in what should be a short survey), customers often find the questions intrusive, and there's a decent chance you can get the same information elsewhere.

Vocalabs Newsletter #69 is Published

Issue 69 of Quality Times, Vocalabs' newsletter on customer service surveys, has been published. This is the Compliance Issue, with two articles on upcoming regulatory changes relevant to companies collecting customer feedback. One set of changes is the new FCC regulations about robocalling, which will be relevant to anyone performing outbound IVR surveys. The other is updates to HIPAA security and privacy rules, which will impact anyone collecting customer feedback in the healthcare arena.

As always, I hope you find this interesting and informative. You can subscribe to receive newsletters via e-mail using the form on the newsletter page.

Warranty service on an Epson projector

Over the weekend my home theater projector, an Epson 8350, died. It was about 18 months old, and didn't last nearly as long as I expected. But it was a good projector, so I wasn't too upset.

When I took the projector down, I noticed a sticker with a phone number for technical support. In this day and age, I've pretty much resigned myself to the fact that almost any electronic item more than a couple months old which breaks is probably just going to get thrown away and replaced.

But on a whim I called the support number on the projector. I reached a technician with flawless English (other than an odd habit of addressing me as Sir Peter) who asked a few troubleshooting questions and took my serial number.

Then to my astonishment he told me my projector is still under warranty, and he would FedEx a replacement unit. I could send the broken one back in the same shipping box, with shipping prepaid.

So today I have a new (refurbished) projector, and a new appreciation for Epson. Thank you for standing behind your products.

More than a metaphor?

"Brand terrorist" is a phrase you sometimes hear in the customer experience world. It describes a customer who dislikes a company and actively spreads bad word-of-mouth.

I've always felt this metaphor is way over-the-top. "Terrorist" is far too extreme a word to describe someone who merely says mean things about you. Maybe others feel the same, which would explain why it hasn't really caught on.

On the other hand... when someone smashes a $400,000 sports car with a sledgehammer in public in order to complain about bad service, "brand terrorist" starts to feel about right.

Spring!

In the Twin Cities, we finally just got to the end of a long, cold, drawn-out late winter/early spring. Two weeks ago, there was a foot of fresh snow on the ground just south of here.

Yesterday it topped 100 degrees in a few places.

Today (and the next several days) are supposed to be perfect: sunny, mid-70's, and low humidity. The kind of day that makes you wish the office building had a retractable roof.

Here's my Big Idea for the day. When the weather is really bad--around here that means well over a foot of snow--we get a "snow day." I'm told that in some other parts of the country they get "hurricane days," but we don't have those here. Having that impromptu vacation should be fun (just ask the kids), but the problem is that as adults we can't enjoy it.

So on the first really nice day of Spring here, we should cancel school and work. Everyone could go outside and enjoy the fresh air and sunshine, get a month's worth of vitamin D, and get that day off when you can really take advantage of it.

So whaddaya think?

FCC Rule Changes and Outbound IVR Surveys

The FCC has made some rule changes which go into effect October 2013 which add new restrictions on the use of outbound IVR (aka "robocalling"). Some of these changes will affect outbound IVR surveys, so if you use this technique you should be aware of what's going on.

I'm not a lawyer, but I've been researching these changes. My summary of how this impacts outbound IVR surveys may be helpful to get the lay of the land, but if you think this may affect your business you should get legal advice from an actual lawyer.

The background is simply that consumers hate robocalls, and they have been complaining to the FCC. A lot. And when you annoy enough voters, politicians and bureaucrats tend to notice.

The old rules had some significant loopholes (aka "safe harbors") which made it fairly easy for companies to legally robocall consumers. The biggest loophole was the "established business relationship" exemption, which basically said that the rules didn't apply if there was an established business relationship. That is now gone. There is also a new, and very strict, definition of the kind of consent you need for "telemarketing" calls made using an autodialer or outbound IVR.

Under the new rules, you need:

  • Express written consent from the recipient before a telemarketing call using a robocall or autodialer. Express Written Consent has a specific definition, and is a real hurdle: basically the consumer has to sign something in writing specifically authorizing robocalls, in a way which makes it clear that's what the consumer meant to do.
  • Express consent from the recipient before making any robocall or autodialed call to a mobile phone. Express Consent isn't specifically defined in the rules, but the implication (both in the rules and in the discussion in the FCC's report) seems to be that it's supposed to be just as unambiguous to the consumer as Express Written Consent, but you can be a little more flexible about how you get it--for example, by asking over the phone. But, as I interpret the rules, you can't bury something on page 19 of your Terms of Service which would give consent for the customer to be robocalled (both because it's not prominent enough, and also because you can't make Express Consent a requirement for selling the customer any product or service).

In addition, all robocalls must identify the company placing the call right at the beginning of the call (using the company's legal name) and provide a phone number during the call.

The implications for outbound IVR surveys, as I read the rules, are as follows:

  • An outbound IVR survey to a business or landline doesn't need prior consent from the person you're calling.
  • An outbound IVR survey to a mobile phone requires express consent, even if there's an established business relationship.
  • You can be flexible about how you get consent for an IVR survey, as long as it's clear to the consumer that they're agreeing to be robocalled and you keep a record. For example, any of these would be OK:
    • A recording of the customer service rep asking the customer, "Can we have your permission to call you for an automated survey after this call?"
    • A recording of your (inbound) IVR asking the customer, "We would like to call you after this call for an automated survey. Please press one if we have your permission," followed by the DTMF digit "one" from the customer.
    • A web form where the customer checked the box next to the statement, "You may call me with an automated customer survey."
    • A written contract where the customer initialed a box next to the statement, "You may call me with an automated customer survey," as long as the customer's consent was optional.
  • Any outbound IVR survey must begin with a statement like, "We would like you to take this automated survey from XYZ Incorporated," and at some point have a statement like, "You can reach XYZ Incorporated at 800-555-5555."
  • You need to be very careful that your survey doesn't include a marketing message and therefore become "telemarketing." What's a marketing message? I don't know. If you give the customer a coupon code for taking the survey, to you that might just be a way of boosting response. But to the FCC, distributing coupons might be "marketing." I won't be the one to litigate this question.

On the whole, these rule changes will not make outbound IVR surveys impossible for companies trying to do post-call customer surveys. But they do impose some significant headaches. No longer can you hide behind the "established business relationship" loophole. You need to get specific permission from the customer to robocall him or her. Just as importantly, you need to have a record of that permission, because if there's a dispute the burden of proof is on the company to show there was express consent.

The easiest thing to do is have the CSR ask for permission for the survey. Since many companies already record calls, that takes care of the record keeping problem. But this is bad survey practice, since it gives the agent a chance to manipulate the process.

Better is to have the (inbound) IVR offer the survey. But since almost no companies currently record IVR interactions or keep their log files for an extended period of time, it may be a significant burden to maintain the proof that the customer really did consent to the survey.

Another option is to ask customers for blanket permission to do outbound IVR surveys. To make this work, there will have to be a database entry (probably in the CRM system) showing which customers consented, and IVR calls can only be placed to customers who agreed. Depending on your infrastructure, this could be simple, or it could be very complex to make it work in real-time.
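
To make the record-keeping concrete, here's a minimal sketch (hypothetical field names and a plain dictionary standing in for the CRM record--not any particular vendor's API) of the consent check and audit trail an outbound IVR dialer would need under this reading of the rules:

    from datetime import datetime, timezone

    def may_place_ivr_survey_call(customer):
        """Decide whether we can robocall this customer with a survey.

        Assumes `customer` is a dict pulled from the CRM, for example:
        {"phone": "+16125550123", "is_mobile": True,
         "ivr_survey_consent": True, "consent_evidence": "call-recording-1234"}
        """
        if not customer.get("is_mobile"):
            # Per the interpretation above, a non-marketing survey call to a
            # business or landline doesn't need prior consent.
            return True
        # Mobile numbers need express consent, plus evidence we can produce
        # later, since the burden of proof is on the company.
        return bool(customer.get("ivr_survey_consent")) and bool(customer.get("consent_evidence"))

    def log_call_decision(customer, audit_log):
        """Record why the call was or wasn't placed, in case of a dispute."""
        audit_log.append({
            "phone": customer["phone"],
            "allowed": may_place_ivr_survey_call(customer),
            "evidence": customer.get("consent_evidence"),
            "checked_at": datetime.now(timezone.utc).isoformat(),
        })

The hard part, of course, isn't the check itself--it's populating that consent flag and evidence pointer from the CSR's recorded question, the inbound IVR log, or the web form, and keeping those records long enough to survive a dispute.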

The one thing which is clear (and again, please get legal advice!) is that the days of "anything goes" outbound IVR surveys are gone for good.

Customers Notice Improvement

About 18 months ago we implemented a rigorous customer feedback program for one of our clients. This is a support desk for an enterprise software company, so the client is dealing with frustrated customers who spend a lot of money.

Before our arrival, the client's feedback program was very primitive. They used SurveyMonkey to send surveys to customers after a trouble ticket was closed (if someone remembered), and had a poor response rate with feedback that wasn't very actionable.

We redesigned the process to use an immediate phone interview, integrated into the ticketing system on Salesforce.com. Our interviewers asked about several elements of the support process, with a lot of follow-up questions. We evolved the survey questions as we learned more about the customers' hot-button issues and helped the client narrow down the root causes of customer complaints.

Just as important, we deliver the feedback directly to the support team in real-time, so they will see a customer's feedback in as little as a few minutes after closing the ticket. They can listen to the interview recording and analyze their statistics by product, problem type, and other data from Salesforce.

Based on the customer feedback the client made some process changes, and also used the data to hold one of its business partners accountable for some of the partner's problems which were showing up in the feedback.

We know all this is working, as the client's survey scores are improving (and their business is growing). But it was especially gratifying when one of the customers made a point of telling our interviewer yesterday that he noticed that the service has "improved a lot," the support engineer was "phenomenal," and that he has noticed a clear change over the past several incidents.

Customers do notice when things improve.

Getting rid of the toll-free number?

According to the LA Times, Spirit Airlines, the aggressively customer-unfriendly purveyor of cheap airfare and expensive extras, has gotten rid of its toll-free customer service numbers. Customers will now have to pay long-distance rates to call Salt Lake City.

I'm not a fan of Spirit's pricing tactics. The company goes so far as to quote airfare and fuel separately, as though fuel was an optional extra and you could shop around for a better price. And according to Spirit's data, the company collects an average of $54 per flight in "optional" extras (like $35-$50 for a carry-on bag). Add $108 to a round-trip ticket, and Spirit doesn't seem like such a bargain anymore.

That said, there is some real logic to this. We no longer live in an era of $0.25/minute long distance, and many consumers get unlimited long distance bundled with their mobile phone or landline service. So why should big companies continue to pick up the tab?

On the other hand, Spirit probably pays next to nothing for long distance anyway, so why not? That's what makes me suspect this move by Spirit is more marketing artifice than actual cost savings. Like the "warehouse" grocery store which packs pallets to the ceiling to make you think you're getting a great deal, Spirit seems to go out of its way to create the illusion that its prices are lower than they actually are. Part of that illusion involves stripping out every possible customer convenience. There will always be some customers willing to do almost anything in the name of a deal, even if that deal turns out not to be such a big deal after all.

Newsletter #68 is published

Issue 68 of Vocalabs' newsletter, Quality Times, has been published and is going out to e-mail subscribers now.

In this issue we discuss our just-published white paper: Agile Customer Feedback: Pretty Good Practices. We also muse a little on the changing role of the CSR.

As always, I hope you find this interesting and informative.

Give Us Five Stars for Discounted Pizza

Consumerist has the latest example in a long string of companies bribing customers for good survey scores. A Pizza Hut franchise taped a note to a pizza box offering customers a $1 discount if they bring in proof that they took the survey, and noting that "Only 5's count."

[To be fair to the Pizza Hut--it's not clear that customers would only get the discount for giving the restaurant perfect scores. But the flier could certainly be interpreted that way, and I'm sure many customers did.]

This kind of thing will continue to happen as long as companies keep setting up incentives to improve survey scores without also having strong controls on the survey process. There are a zillion ways to manipulate customer surveys, from blatant (telling customers that "only 5's count") to subtle (selectively reminding the happy customers to take the survey).

Paying employees to deliver better survey scores is almost always going to give better survey scores--but it won't necessarily give happier customers.

Being Virtual

Last night I got around a foot of snow at my home. Unusual for this late in the season, but not impossible for Minnesota. School started two hours late (what, you think we'd cancel school over a mere 12" of snow?), so I worked from home for part of the morning.

And that's about all that happened.

At Vocalabs, we don't have a physical call center for our phone interviews. Instead, we contract with interviewers all around the country. This lets us work with some of the best interviewers around and be extremely flexible when our clients' needs change.

Just as important, though, it means that days like today are pretty much non-events. Our clients count on us to be collecting customer feedback every day, so they can use it for training, setting goals, and managing their customer experience.

A day like today would have been a mess for any company relying on a bricks-and-mortar call center. People would show up late (or not at all), schedules would be out of whack, client projects would be set behind, and we'd likely be pulling out a big 3-ring binder labeled "Snow Day Contingency Plan."

Instead... ho-hum. As long as the data center is physically intact with power and connectivity, we can keep doing interviews. Since most of the staff can also work remotely, about the only difference is it takes longer to get the mail (postal, that is).

I really have nothing more profound to offer, just an observation of how the world has changed in the last 10-15 years.

The Assembled Grill

Assembling a new gas grill can be a pain, and for many customers it's a definite plus when the store will assemble it for you. As a result, many stores now offer free assembly as a way to sell more gas grills and provide a better customer experience.

But free assembly isn't what every customer wants. The problem with gas grills in particular is that once assembled, the grill is a lot bigger than it was in the box. So some customers quite reasonably prefer to take the grill home unassembled and put it together themselves, because the assembled product won't fit in their car.

Unfortunately, that's not how some stores view things. As several Consumerist articles from 2013, 2012, and 2011 attest, shoppers at Sears, Home Depot, and Sears have all encountered situations where they needed to buy a grill still in the box, but every single grill in the store's inventory had already been assembled. The store had no unassembled grills to sell.

In this situation, what was supposed to be a benefit to the customer turns into a major headache. Worse, in the case of the Home Depot customer, the customer felt the store was making a sneaky ploy to push truck rentals.

These stories illustrate the pitfall of assuming that every customer wants the same thing. The three stores all assumed that every customer would prefer their new grill to be fully assembled, and didn't anticipate that some customers would have different needs. They had what they thought was the right impulse--better customer service!--but managed to turn it into an expensive liability.

The lesson should be obvious: different customers have different needs, and the customer experience which delights one customer may be the same experience which annoys another. Your customer experience has to serve different populations of customers, pay attention to what each customer needs, and have the flexibility to adjust when customers require something unexpected.

Packing my bags for Mobile Voice

I'll be presenting at the Mobile Voice conference next week, as I have done for the past several years. If you're planning to attend (or just in the San Francisco area) please stop by and say Hi!

HIPAA Compliance and Customer Feedback

Vocalabs has several healthcare-related clients, so we are used to dealing with the privacy and security requirements of HIPAA. Some recent changes to the regulations will mean significant new requirements for what a company like us needs to do to remain HIPAA compliant after September 2013.

Since Vocalabs itself is not a healthcare company, we are not what's called a "covered entity" under the regulations. Rather, we are a "business associate," which is basically any company which a covered entity hires to perform some work which may require sharing protected health information.

Many non-healthcare companies hired by a covered entity would also be considered business associates--for example: accountants, IT services, lawyers, business process consultants, etc.

Under the old rules, a business associate had to sign a contract with the covered entity that basically promised to keep protected health information private and secure. Business associates had to maintain the same level of privacy and security as the covered entity, but did not have to go through the formal documentation and review process.

After September, though, business associates have to follow all the same security rules as a covered entity (at least insofar as they can reasonably be applied) and produce the same formal documentation and policies. What's more, to the extent that a business associate subcontracts to a third party which may also receive protected health information, that subcontractor also has to comply with all the policy and documentation requirements.

These new requirements can potentially be a big problem for some survey companies. At Vocalabs, our existing policies and processes are already consistent with HIPAA requirements, so for us it will be mostly a matter of documenting and formalizing what we already do. But at companies which aren't as security-minded, the HIPAA changes could require large investments in infrastructure, training, and compliance.

So how does all this apply to Customer Feedback?

Keep in mind that the HIPAA rules only apply to "protected health information," which has a very specific legal definition. It's basically health- and care-related information created by a healthcare company (doctor, hospital, insurance company, etc.) which can be tied to a specific, identifiable patient. Customer feedback is not, by itself, protected health information.

But sometimes we need to have protected health information in order to gather useful feedback. For example, we need to know the patient's phone number to call him or her, and that phone number combined with the fact that there had been a hospital visit could arguably qualify as "protected health information." So to be on the safe side, we will treat it as PHI. For analysis purposes, we may also want to know the doctor's name, hospital, or other details which can really help understand how to improve the patient's experience but which clearly need to be protected.

So between now and September we will be updating our security and privacy policies, revising contracts, and doing everything we need to do to remain fully HIPAA compliant under the new rules. And anyone else collecting customer feedback around healthcare will need to do the same.

Some. Passengers. We. Just. Can't. Move.

Airlines have inflicted so many annoyances on their customers--intentionally or through incompetence--that it's almost refreshing to read about a situation which wasn't actually caused by the airline.

The story is that a passenger flying business-class internationally with his wife on United Airlines did everything right to make sure he got the seats he wanted: he booked well in advance, got seats together, checked in early, and had super-elite status on United.

As they were boarding, though, the wife was pulled aside and given a new seat assignment so the couple would no longer be seated together. Naturally annoyed, the customer asked for an explanation and was only told that "there are some passengers we can't move."

The flight attendants also seemed confused by the situation, but wouldn't provide any explanation other than "Some passengers we just can't move."

Eventually the passenger took the hint and figured out that (spoiler alert!) the wife's seat had been claimed by an air marshal, and of course the crew isn't supposed to reveal the presence of an air marshal on board. And the couple did manage to arrange a seat swap so they could sit together.

But even though the airline didn't create the situation and wasn't allowed to explain it, it seems that pulling the wife out of the boarding line was about the worst possible way to handle this. At a minimum the gate agent could have paged them before boarding and--without explaining the reason--told them that it was necessary to reseat the wife. Even better, the gate agent could have made some effort to arrange it so they could still sit together.

The lesson, I think, is that even when a company is placed in an unusual and difficult position, there is still a choice about how you want to treat your customers.

We Can Record You, but You Can't Record Us

Via Techdirt today, a mildly amusing recording of a customer taunting a Time Warner Cable CSR by saying he's recording the call.

In the recording the customer begins by telling the CSR that he's recording. The CSR, no doubt following TWC's written policies, says he doesn't consent to the recording. The customer asks how that can be given that TWC is itself recording the call.

Unfortunately the CSR is caught in the middle--as everyone (except maybe TWC's lawyers) understands, the policy is absurd. But the CSR isn't allowed to deviate, and can't think of a rational reason why the customer shouldn't record the call, and there you go.

What this really points out, though, is the sheer nuttiness of these "We will record you, but not give permission for you to record us" policies. Anecdotally, I know that many large companies have these policies. My guess is that the underlying reason, more than anything else, is a vague discomfort with the general idea of being recorded without permission (dressed up in language about "respecting the CSR's privacy" and/or "protecting us from liability").

But let's consider just how many different kinds of crazy this policy is:

  1. It treats the customer as implicitly untrustworthy, and not deserving of the same rights the company claims for itself.
  2. Withholding consent probably has no legal effect. Most states permit people to record phone calls without the consent of the other party; and even in states which require consent of both parties, the company has arguably consented to recording by collecting its own recording.
  3. It makes it seem that the company has something to hide.
  4. The only real downside to allowing the customer to record the call is that the company's incompetence (or even misconduct) might be exposed. See #3 above.
  5. It implicitly assumes that companies have a greater right to privacy than consumers. Most people assume the opposite should be true.

So what should a CSR do when a customer says the call is being recorded?

How about this: "Very good, and thanks for letting me know. How can I help you today?"
