The Customer Service Survey

Vocalabs' Blog


Customers Notice Improvement

About 18 months ago we implemented a rigorous customer feedback program for one of our clients. This is a support desk for an enterprise software company, so the client is dealing with frustrated customers who spend a lot of money.

Before our arrival, the client's feedback program was very primitive. They used SurveyMonkey to send surveys to customers after a trouble ticket was closed (if someone remembered), and had a poor response rate with feedback that wasn't very actionable.

We redesigned the process to use an immediate phone interview, integrated into the client's ticketing system. Our interviewers asked about several elements of the support process, with plenty of follow-up questions. We evolved the survey questions as we learned more about the customers' hot-button issues, and helped the client narrow down the root causes of customer complaints.

Just as important, we deliver the feedback directly to the support team in real time, so they can see a customer's feedback within a few minutes of the ticket closing. They can listen to the interview recording and analyze their statistics by product, problem type, and other data from Salesforce.

Based on the customer feedback the client made some process changes, and also used the data to hold one of its business partners accountable for some of the partner's problems which were showing up in the feedback.

We know all this is working, as the client's survey scores are improving (and their business is growing). But it was especially gratifying when one of the customers made a point of telling our interviewer yesterday that he noticed that the service has "improved a lot," the support engineer was "phenomenal," and that he has noticed a clear change over the past several incidents.

Customers do notice when things improve.

Getting rid of the toll-free number?

According to the LA Times, Spirit Airlines, the aggressively customer-unfriendly purveyor of cheap airfare and expensive extras, has gotten rid of its toll-free customer service numbers. Customers will now have to pay long-distance rates to Salt Lake City.

I'm not a fan of Spirit's pricing tactics. The company goes so far as to quote airfare and fuel separately, as though fuel were an optional extra and you could shop around for a better price. And according to Spirit's own data, the company collects an average of $54 per flight in "optional" extras (like $35-$50 for a carry-on bag). Add $108 to a round-trip ticket, and Spirit doesn't seem like such a bargain anymore.

That said, there is some real logic to this. We no longer live in an era of $0.25/minute long distance, and many consumers get unlimited long distance bundled with their mobile phone or landline service. So why should big companies continue to pick up the tab?

On the other hand, Spirit probably pays next to nothing for long distance anyway, so why not? That's what makes me suspect this move by Spirit is more marketing artifice than actual cost savings. Like the "warehouse" grocery store which packs pallets to the ceiling to make you think you're getting a great deal, Spirit seems to go out of its way to create the illusion that its prices are lower than they actually are. Part of that illusion involves stripping out every possible customer convenience. There will always be some customers willing to do almost anything in the name of a deal, even if that deal turns out not to be such a big deal after all.

Newsletter #68 is published

Issue 68 of Vocalabs' newsletter, Quality Times, has been published and is going out to e-mail subscribers now.

In this issue we discuss our just-published white paper: Agile Customer Feedback: Pretty Good Practices. We also muse a little on the changing role of the CSR.

As always, I hope you find this interesting and informative.

Give Us Five Stars for Discounted Pizza

Consumerist has the latest example in a long string of companies bribing customers for good survey scores. A Pizza Hut franchise taped a note to a pizza box offering customers a $1 discount if they bring in proof that they took the survey, and noting that "Only 5's count."

[To be fair to the Pizza Hut--it's not clear that customers would only get the discount for giving the restaurant perfect scores. But the flier could certainly be interpreted that way, and I'm sure many customers did.]

This kind of thing will continue to happen as long as companies keep setting up incentives to improve survey scores without also having strong controls on the survey process. There are a zillion ways to manipulate customer surveys, from blatant (telling customers that "only 5's count") to subtle (selectively reminding the happy customers to take the survey).

Paying employees to deliver better survey scores is almost always going to give better surveys scores--but it won't necessarily give happier customers.

Being Virtual

Last night I got around a foot of snow at my home. Unusual for this late in the season, but not impossible for Minnesota. School started two hours late (what, you think we'd cancel school over a mere 12" of snow?), so I worked from home for part of the morning.

And that's about all that happened.

At Vocalabs, we don't have a physical call center for our phone interviews. Instead, we contract with interviewers all around the country. This lets us work with some of the best interviewers around and be extremely flexible when our clients' needs change.

Just as important, though, it means that days like today are pretty much non-events. Our clients count on us to be collecting customer feedback every day, so they can use it for training, setting goals, and managing their customer experience.

A day like today would have been a mess for any company relying on a bricks-and-mortar call center. People would show up late (or not at all), schedules would be out of whack, client projects would fall behind, and we'd likely be pulling out a big 3-ring binder labeled "Snow Day Contingency Plan."

Instead... ho-hum. As long as the data center is physically intact with power and connectivity, we can keep doing interviews. Since most of the staff can also work remotely, about the only difference is that it takes longer to get the mail (postal mail, that is).

I really have nothing more profound to offer, just an observation of how the world has changed in the last 10-15 years.

The Assembled Grill

Assembling a new gas grill can be a pain, and for many customers, it's a definite plus when the store will assemble it for you. Many stores now offer free assembly as a way to sell more gas grills and provide a better customer experience.

But free assembly isn't what every customer wants. The problem with gas grills in particular is that once assembled, the grill is a lot bigger than it was in the box. So some customers quite reasonably prefer to take the grill home unassembled and put it together themselves, because they can't fit the assembled product in their car.

Unfortunately, that's not how some stores view things. As several Consumerist articles from 2013, 2012, and 2011 attest, shoppers at Sears, Home Depot, and Sears have all encountered situations where they needed to buy a grill still in the box, but every single grill in the store's inventory had already been assembled. The store had no unassembled grills to sell.

In this situation, what was supposed to be a benefit to the customer turned into a major headache. Worse, in the case of the Home Depot customer, it felt like the store was making a sneaky ploy to push truck rentals.

These stories illustrate the pitfall of assuming that every customer wants the same thing. The three stores all assumed that every customer would prefer their new grill to be fully assembled, and didn't anticipate that some customers would have different needs. They had what they thought was the right impulse--better customer service!--but managed to turn it into an expensive liability.

The lesson should be obvious: different customers have different needs, and the customer experience which delights one customer may be the same experience which annoys another. Your customer experience has to serve different populations of customers, pay attention to what each customer needs, and have the flexibility to adjust when customers require something unexpected.

Packing my bags for Mobile Voice

I'll be presenting at the Mobile Voice conference next week, as I have done for the past several years. If you're planning to attend (or just in the San Francisco area) please stop by and say Hi!

HIPAA Compliance and Customer Feedback

Vocalabs has several healthcare-related clients, so we are used to dealing with the privacy and security requirements of HIPAA. Some recent changes to the regulations will mean significant new requirements for what a company like us needs to do to remain HIPAA compliant after September 2013.

Since Vocalabs itself is not a healthcare company, we are not what's called a "covered entity" under the regulations. Rather, we are a "business associate," which is basically any company which a covered entity hires to perform some work which may require sharing protected health information.

Many non-healthcare companies hired by a covered entity would also be considered business associates--for example: accountants, IT services, lawyers, business process consultants, etc.

Under the old rules, a business associate had to sign a contract with the covered entity that basically promised to keep protected health information private and secure. Business associates had to maintain the same level of privacy and security as the covered entity, but did not have to go through the formal documentation and review process.

After September, though, business associates have to follow all the same security rules as a covered entity (at least insofar as they can reasonably be applied) and produce the same formal documentation and policies. What's more, to the extent that a business associate subcontracts to a third party which may also receive protected health information, that subcontractor also has to comply with all the policy and documentation requirements.

These new requirements can potentially be a big problem for some survey companies. At Vocalabs, our existing policies and processes are already consistent with HIPAA requirements, so for us it will be mostly a matter of documenting and formalizing what we already do. But at companies which aren't as security-minded, the HIPAA changes could require large investments in infrastructure, training, and compliance.

So how does all this apply to Customer Feedback?

Keep in mind that the HIPAA rules only apply to "protected health information," which has a very specific legal definition. It's basically health- and care-related information created by a healthcare company (doctor, hospital, insurance company, etc.) which can be tied to a specific, identifiable patient. Customer feedback is not, by itself, protected health information.

But sometimes we need to have protected health information in order to gather useful feedback. For example, we need to know the patient's phone number to call him or her, and that phone number combined with the fact that there had been a hospital visit could arguably qualify as "protected health information." So to be on the safe side, we will treat it as PHI. For analysis purposes, we may also want to know the doctor's name, hospital, or other details which can really help understand how to improve the patient's experience but which clearly need to be protected.

So between now and September we will be updating our security and privacy policies, revising contracts, and doing everything we need to do to remain fully HIPAA compliant under the new rules. And anyone else collecting customer feedback around healthcare will need to do the same.

Some. Passengers. We. Just. Can't. Move.

Airlines have inflicted so many annoyances on their customers--intentionally or through incompetence--that it's almost refreshing to read about a situation which wasn't actually caused by the airline.

The story is that a passenger flying business-class internationally with his wife on United Airlines did everything right to make sure he got the seats he wanted: he booked well in advance, got seats together, checked in early, and had super-elite status on United.

As they were boarding, though, the wife was pulled aside and given a new seat assignment so the couple would no longer be seated together. Naturally annoyed, the customer asked for an explanation and was only told that "there are some passengers we can't move."

The flight attendants also seemed confused by the situation, but wouldn't provide any explanation other than "Some passengers we just can't move."

Eventually the passenger took the hint and figured out that (spoiler alert!) the wife's seat had been claimed by an air marshal, and of course the crew isn't supposed to reveal the presence of an air marshal on board. And the couple did manage to arrange a seat swap so they could sit together.

But even though the airline didn't create the situation and wasn't allowed to explain it, it seems that pulling the wife out of the boarding line was about the worst possible way to handle this. At a minimum the gate agent could have paged them before boarding and--without explaining the reason--told them that it was necessary to reseat the wife. Even better, the gate agent could have made some effort to arrange it so they could still sit together.

The lesson, I think, is that even when a company is placed in an unusual and difficult position, there is still a choice about how you want to treat your customers.

We Can Record You, but You Can't Record Us

Via Techdirt today, a mildly amusing recording of a customer taunting a Time Warner Cable CSR by saying he's recording the call.

In the recording the customer begins by telling the CSR that he's recording. The CSR, no doubt following TWC's written policies, says he doesn't consent to the recording. The customer asks how that can be given that TWC is itself recording the call.

Unfortunately the CSR is caught in the middle--as everyone (except maybe TWC's lawyers) understands, the policy is absurd. But the CSR isn't allowed to deviate, and can't think of a rational reason why the customer shouldn't record the call, and there you go.

What this really points out, though, is the sheer nuttiness of these "We will record you, but not give permission for you to record us" policies. Anecdotally, I know that many large companies have these policies. My guess is that the underlying reason, more than anything else, is a vague discomfort with the general idea of being recorded without permission (dressed up in language about "respecting the CSR's privacy" and/or "protecting us from liability").

But let's consider just how many different kinds of crazy this policy is:

  1. It treats the customer as implicitly untrustworthy, and not deserving of the same rights the company claims for itself.
  2. Withholding consent probably has no legal effect. Most states permit people to record phone calls without the consent of the other party; and even in states which require consent of both parties, the company has arguably consented to recording by collecting its own recording.
  3. It makes it seem that the company has something to hide.
  4. The only real downside to allowing the customer to record the call is that the company's incompetence (or even misconduct) might be exposed. See #3 above.
  5. It implicitly assumes that companies have a greater right to privacy than consumers. Most people assume the opposite should be true.

So what should a CSR do when a customer says the call is being recorded?

How about this: "Very good, and thanks for letting me know. How can I help you today?"

Vocalabs Newsletter #67 Published

Issue 67 of Vocalabs' newsletter, Quality Times, has been published. In this issue we discuss a couple of topics around closed-loop customer feedback: how the survey itself needs to be part of the closed loop, and how the full process is more important than the metrics.

As always, I hope you find this interesting and informative. E-mail subscribers should be receiving their copies shortly.

Making Surveys Predictive

There's a simple but powerful technique I think should be part of every customer survey to make it much more valuable for business decisions: every customer survey should be linked to a record of the customer's buying behavior.

Most companies already have this data available, and some companies are making a significant effort on "big data" analysis projects to try to tease out what it all means.

Taking the small extra step of including this data in the customer survey report makes use of the fact that, if you have a customer survey, your customers are already telling you how they feel about you. In many ways that's a lot easier than hunting for subtle statistical clues in a tsunami of behavior.

For example, one of our clients found that, compared to "Very Satisfied" customers, customers who were "Somewhat Satisfied" or worse with a customer service call were about 4x more likely to take their business elsewhere within the next six months.
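
The underlying analysis is straightforward once survey responses and later buying behavior live in the same report. As an illustrative sketch (the data, field names, and satisfaction labels here are hypothetical, not our client's actual records), comparing churn rates by satisfaction level is a simple aggregation:

```python
from collections import defaultdict

# Hypothetical linked records: each survey response paired with whether
# the customer was still doing business with us six months later.
surveys = [
    {"customer_id": 1, "satisfaction": "Very Satisfied",     "retained_6mo": True},
    {"customer_id": 2, "satisfaction": "Somewhat Satisfied", "retained_6mo": False},
    {"customer_id": 3, "satisfaction": "Very Satisfied",     "retained_6mo": True},
    {"customer_id": 4, "satisfaction": "Dissatisfied",       "retained_6mo": False},
    {"customer_id": 5, "satisfaction": "Very Satisfied",     "retained_6mo": False},
    {"customer_id": 6, "satisfaction": "Somewhat Satisfied", "retained_6mo": True},
]

def churn_by_satisfaction(records):
    """Return {satisfaction level: fraction of customers lost within 6 months}."""
    totals = defaultdict(int)
    churned = defaultdict(int)
    for r in records:
        totals[r["satisfaction"]] += 1
        if not r["retained_6mo"]:
            churned[r["satisfaction"]] += 1
    return {level: churned[level] / totals[level] for level in totals}

rates = churn_by_satisfaction(surveys)
```

With real volumes, the same aggregation directly yields comparisons like "Somewhat Satisfied or worse customers defect at several times the rate of Very Satisfied customers," with no statistical modeling required.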

That's not a small difference. Those customers are telling you directly that they are not loyal. Chances are, if you dig even a little you will find that they also told you (directly, in response to your survey question) why.

Are you listening?

Closing the Loop in Closed Loop Customer Feedback

Closed Loop is a popular buzzword for customer feedback programs today. It means implementing a formal process whereby customer feedback is used to drive change in the company in order to improve customer feedback.

This kind of cycle is a powerful tool--in fact, I would argue that it's pretty much the only way to build an effective and useful customer feedback process. After all, if the customer feedback isn't being used to drive change in the company, then what's it good for?

But most advocates of a closed loop process aren't really closing the loop. That's because most supposedly closed loop customer feedback processes don't formally consider the customer survey as part of the "loop" to be "closed." In other words, the company never reconsiders whether the survey is asking the right questions of the right people. It is just assumed that the survey should keep doing the same thing no matter what.

In fact, some of the proponents of closed loop processes actually advocate for very rigid and inflexible customer surveys: specific questions, with very particular sampling methods, applied in exactly the same way for every company.

That kind of rigidity (proponents would call it "consistency") is important for cross-company and cross-industry benchmarking, but it's not very helpful when trying to improve a particular customer experience at a particular company. It even sounds a little crazy to suggest that the customer survey which will yield useful, actionable feedback from a customer logging on to a bank website is the exact same customer survey which gives useful, actionable feedback from a customer buying a used car.

Instead, the customer survey itself needs to be part of the closed loop process. As companies examine their feedback they should constantly be asking questions like:

  • Are we asking the right questions?
  • Are we targeting the right customers for the survey?
  • Is this survey yielding useful data?
  • Are there issues we should be exploring in more depth on the survey?
  • Are the right people getting the right data from the survey to make the best use of it?

The questions to ask, customers to talk to, and how to use the survey data will all change constantly as the company and its business environment evolve. The survey which works today isn't likely to be optimal a year from now, and five years from now it may seem close to irrelevant.

So when you build your closed loop customer feedback process, remember that the customer feedback also needs to change with the company.

Can't Buy me Like

Bob Garfield, NPR media guru, has a new book out called "Can't Buy me Like." I haven't read it, but my guess is it's worth $12.99 just for the title alone.

Rather than plug the book, though, I'm going to plug the 25-minute segment he did yesterday for Talk of the Nation. It doesn't break any new ground (at least for those of us who have been around long enough to remember the Cluetrain Manifesto), but he makes an excellent argument for why companies need to focus more on providing genuine customer experiences and less on mass media advertising.

Plus, you can listen to the interview for free.

Metrics are Less Important than Process

I see a lot of people spend a lot of time and effort making sure they focus on the "right" metric in their customer feedback program. Net promoter? Customer Satisfaction? Customer Effort? Something else? All of the above?

There's often a lot less thought going into what to do with the survey data: who should get it, how often, how will it be used, how will people be coached and compensated, how will you follow up with customers, and so forth.

This is a big mistake, since most of the value of a successful customer feedback program comes from all the things that happen after the data is collected. In my not-very-scientific estimation, I would say that 75% of the value is in the process, with only 25% from the metrics.

I could be wrong, though: I could easily believe that as much as 90% of the value is in the process.

Even a mediocre metric is going to allow a lot of improvement in a well-designed closed-loop feedback process. But no metric, no matter how good, is going to drive change if there's no follow-through.

So why so much attention to the metric, and not everything else?

I think it's because deciding what metrics to include on a customer survey feels important but doesn't require much hard work. Everyone can advocate for their favorite survey questions, there can be a lively discussion, and an executive can weigh in to break the tie. You get to feel like you had a productive day.

But getting everything else right takes a lot more time, effort, and attention. A successful customer feedback program needs to be actively managed, it requires ongoing executive attention and support, and it requires constant tweaking to adjust to the changing dynamics of the business. You can't just throw money at it.

Most people think that the scarcest resource at most companies is money. That's often true in small companies, but most large companies have plenty of money to invest in the things they think are most important. In big companies, often the hardest resource to obtain is attention--especially executive attention.

So it's easier to pretend that the customer feedback program doesn't need any ongoing attention, that you can make the big decisions once and be done with them.

But that's not a recipe for success. Success requires focusing on the rest of the customer feedback process, and making it an ongoing priority. That's hard. And that's why people spend so much time worrying about the metric.

Rethinking the CSR

Traditionally, the job of the Customer Service Representative (CSR) was to take requests and orders from customers and generally handle transactions as cost-effectively as possible.

Today, most routine transactions are handled through self-service. Most customers prefer to use a company's website instead of calling on the phone, especially for simple tasks. More and more, the CSR handles complicated transactions, situations where self-service didn't work, and cases where the customer needs to be doubly certain that the problem will be taken care of.

What will the job of the CSR be like in the future, when nearly all customers take care of their business online? CSRs will be left with nothing but the more complex and high-stakes problems. I think this will lead to the job becoming less the traditional CSR, and more like a Customer Advocate.

The difference is that where the CSR represents the company to the customer, the role of a Customer Advocate is to represent the customer within the company. For example:

Types of Interactions

  • Customer Service Representative: Most transactions are routine, and this is the customer's first attempt to solve the problem.
  • Customer Advocate: Most transactions are exceptions to the normal process, and the customer has already tried other ways to solve the problem.

When Multiple Calls Are Required

  • Customer Service Representative: Different CSRs will handle the customer's multiple calls. Each CSR will have to take time to become familiar with the customer's case.
  • Customer Advocate: The same Customer Advocate will keep working with the customer until the problem is solved (or it becomes clear no resolution is possible).

Measuring Performance

  • Customer Service Representative: Measured on how efficiently the CSR can handle a large volume of transactions.
  • Customer Advocate: Measured on how effectively the Customer Advocate finds solutions which are acceptable to both the customer and the company.

When a Customer Wants Special Treatment

  • Customer Service Representative: Will generally enforce company policy, but may be empowered to make limited exceptions.
  • Customer Advocate: Will explain the policy to the customer, and help argue the customer's case for an exception. Authority to enforce policy and make exceptions resides elsewhere in the organization.

General Role

  • Customer Service Representative: Interfaces between the customer and the company's internal business processes (order entry, billing, etc.).
  • Customer Advocate: Interfaces between the customer and the company's internal structure and decision-making processes (management, other organizational silos, etc.).

It's still going to be a long time before the call center's job is primarily handling these more complex interactions, but it is starting to happen today. I'm seeing more companies moving away from efficiency-based metrics (like calls per hour) and towards outcome-based metrics (resolution, satisfaction, and related metrics). I'm also seeing more companies questioning the assumption that most calls are coming from customers who haven't attempted self-service.

So the role of the CSR is clearly shifting, whether the job description is or not.

Our Valued Customers

Never publicly acknowledged, it can be found in many, if not most, call centers. Passed furtively from employee to employee over the years, it might get dragged out when the office party gets really wild.

Yes, I'm talking about the secret recording of the "best" customer calls. And by "best," of course, I really mean "entertainingly worst."

Our Valued Customers takes it a step further. Tim Chamberlain works at a comic book store, and he takes it upon himself to illustrate and save for posterity many of the weird and wacky things he hears customers saying.

Just one more reason to always be polite, just like your mother taught you.

Delta Says: Please Take This Survey, Even Though We Won't Read It

Sunday morning I flew Delta from Minneapolis to Atlanta for a conference.

Delta cares enough about my opinion of their airplanes that on Tuesday afternoon--over two days after I stepped off the flight--they e-mailed me an invitation to take a survey. Delta's invitation read, in part:

Your feedback on this experience is important to us. We thank you in advance for your input.

This was followed by an all-capital-letters notice that Delta would not respond to individual feedback.

In other words, if you want someone to actually read what you write and take some action, it's not going to happen here.

I give Delta credit for a certain level of honesty with its customers, but I really have to wonder:

  • Why would a customer do a survey which the company acknowledges up front (in all caps no less) won't result in any response?
  • Why would a company spend the money (however small) to perform a survey which they can't respond to?

The survey itself is even weirder. There were ten questions plus a comment area, and all the questions were about the cleanliness of different parts of the airplane. I was asked to rate the condition of the carpet, walls, reading lights, lavatory, etc., etc., but there is not a single question about customer service, the boarding process, or even how I feel about Delta overall.

I'm truly at a loss as to what Delta hopes to get out of this. The only theory I have so far (and it's a bit farfetched) is that Delta cut back on cleaning its airplanes, and some executive wants to know how filthy they are. Rather than doing the obvious, like asking the cabin crews, they decided to send customers an e-mail two days after the flight.

So Delta, here's my feedback, even though I know you're not really paying attention: The cabin of the plane was fine, or at least not so disgusting that I would remember it two days later. But my wife will tell you I'm not always the best person to ask about cleanliness, and anyway, I generally try to suppress my memories of commercial air travel. As for the rest of the experience, nobody was actively rude to me. On the other hand, nobody did anything to mitigate the general unpleasantness of waking up before dawn on a Sunday just to spend two hours locked in a pressurized aluminum tube, unable to move more than an inch in any direction. So let me know when you've got your priorities straightened out, and maybe I'll stop trying to avoid flying Delta.

Here's How to Lose a Customer

At home, I get my Internet service from Comcast. In my neighborhood the only other major option is CenturyLink, which is slower but cheaper. I've had this cable modem service for well over a decade (it started out as Roadrunner, but that's a different story), and since I rarely have needed to call for service or support, I am an extremely profitable customer. I would guess that over the years, I have generated in excess of $5,000 in profit and free cash flow for my cable modem provider.

Here's a set of step-by-step instructions Comcast could follow to lose my very profitable business, if they wanted to:

  1. Upgrade the network to all-digital, requiring subscribers like me to get a digital adapter for our basic cable subscriptions.
  2. Send out a letter informing me that I need to get a free digital adapter. Provide a web site and code to use.
  3. Ensure that no matter what I do, entering the code into the website generates an error.
  4. Ensure that the "online chat" for support on the website doesn't work.
  5. Send another letter a month later. Make sure the web site and chat still don't work.
  6. In fact, send out four or five letters at roughly one month intervals to make sure I have a stack of them for reference while the website remains broken.
  7. When the deadline gets uncomfortably close and I decide to brave phone support, hang up on me when the system tries to transfer me to an agent.
  8. When I persist and finally get through to an agent, make sure that the agent cannot help because the account is in my wife's name and she's not home at the moment.
  9. After hanging up with the agent, call me back for an automated survey. Make sure I have to wade through over five minutes (twice as long as the customer service call itself) of robo-questions before being allowed to leave a message describing my actual experience. Do everything possible to ensure I have zero confidence anyone will listen to my message.

So I haven't left Comcast yet (but I am exploring options, which I haven't done in years).

But things like this continue to happen at big companies all the time. These systems and processes are clearly broken, almost comically bad. Not only do they frustrate customers (and put significant amounts of revenue at risk), they also make service significantly more expensive to provide. What should have been a simple self-service transaction costing Comcast almost nothing has evolved into a lengthy multi-step process involving multiple letters, web site visits, phone calls, and now (with this blog entry) bad publicity.

The prescription for a company like this is at once simple and difficult: pay attention to your customers. I have to assume I'm not the only Comcast customer in this situation, yet there are no signs that anybody at the company is paying the slightest bit of attention to everything I've been going through. The signs are all there, if they would only choose to look.

Vocalabs Newsletter Published

Issue 66 of Quality Times, our regular online newsletter, has been published. In this issue we discuss integrating live customer interviews into a Salesforce workflow, and the hazards of forcing customers to choose an option in a survey. E-mail subscribers should be getting their copies shortly.

Integrating Customer Interviews into Salesforce Workflows

I'm pleased to announce the availability of Vocalabs' immediate live customer interviews integrated into Salesforce workflows.

This means that any event in Salesforce can trigger an immediate call from one of Vocalabs' professional interviewers to get customer feedback. You can trigger a survey when a customer calls for service, when a trouble ticket is closed, after an installation is complete, whatever you want. The survey call can happen in as little as three to five minutes, or at a later time if that's more appropriate. You will get real-time data as interviews are completed, complete with interview recordings, alerts and notifications, and our unique interactive reporting tool.

Because this is Salesforce, setup is simple: just add an outbound notification to the Salesforce workflow. We will design an interview script tailored to your unique needs, and manage the survey process from start to finish.

This gives you the simplicity and immediacy of an e-mail or IVR survey, but with the depth and human touch only a live interview can deliver.

I'm very excited about this new service we're offering. Never before has it been so simple to collect such deep, immediate feedback, and deliver it to the places in your organization where it can have the most impact. I hope you agree. Please contact us at to talk about how we can get you started today. And I really mean today.

Stop the Net Promoter Madness!

Net Promoter is a trendy way to measure how well a company is doing in its customers' eyes. It uses a single question, "How likely are you to recommend the company to your friends or colleagues?" on a zero-to-ten scale, and subtracts the percentage of respondents answering 6 or below from the percentage answering 9 or 10. Voila, the Net Promoter Score.
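The arithmetic is simple enough to sketch in a few lines of code. This is a minimal illustration of the standard NPS calculation described above (percentage of promoters minus percentage of detractors), not anything specific to one vendor's implementation:

```python
def net_promoter_score(responses):
    """Return NPS: % promoters (9-10) minus % detractors (0-6).

    Responses are 0-10 answers to the "How likely are you to
    recommend...?" question. Answers of 7-8 ("passives") count
    toward the total but neither add nor subtract.
    """
    n = len(responses)
    promoters = sum(1 for r in responses if r >= 9)
    detractors = sum(1 for r in responses if r <= 6)
    return 100.0 * (promoters - detractors) / n

# Example: 5 promoters, 3 passives, 2 detractors out of 10 responses
scores = [10, 9, 9, 10, 9, 7, 8, 7, 3, 6]
print(net_promoter_score(scores))  # 50% - 20% = 30.0
```

Note that the score can range from -100 (all detractors) to +100 (all promoters), and that passives influence it only by diluting the percentages.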

The Net Promoter Score has some good things going for it: it's highly standardized, so benchmarking is easy. It's easy to understand and gives you a single number to focus on. And its proponents claim it correlates well with useful things like customer loyalty and word-of-mouth.

But Net Promoter only really measures one thing, the customer's overall relationship with the company. And that's where things start to go off the rails. Many companies actually want to measure a lot of different things: how did that specific customer service call go, how was the salesperson, did the automated system work, and so on.

Rather than develop survey questions to measure those specific things, some companies try to adapt the Net Promoter question to fit. The results are not always pretty.

For example: How likely are you to recommend this customer service representative to your friends or colleagues? You can see the logic for the company trying to standardize on Net Promoter and measure the CSR's performance. The problem is that at most companies the customer has no choice in who they speak to when they call customer service. So there's no point in recommending the agent, and a certain percentage of customers will give a zero on the question for exactly that stated reason. That's not a problem with the CSR, that's a problem with the survey question.

Or worse: How likely are you to recommend this customer service representative's knowledge of products and services to your friends and colleagues? This question, as written, is meaningless. The intent is to rate the CSR on several different qualities (knowledge, friendliness, eagerness, etc.), but you can't recommend a person's particular skill in a vacuum. You can't say to a friend or colleague, "I recommend you talk to Sally's knowledge and Bob's friendliness, but not Sarah's efficiency."

Fortunately most people catch on to the fact that these questions should not be taken literally, and that prevents the data from being completely useless.

But if you want customers to interpret your survey question in a way which has them answering a different question than the one you asked, why not just ask the question you want them to answer?

It's much easier to interpret the answer to a straightforward question like, "Please rate the customer service representative's knowledge of products and services."

So while Net Promoter has its place, don't try to fit that round peg into every square, octagonal, or star-shaped hole you encounter. Just ask the question you want customers to answer.

Caution: Big Data Ahead

"Big Data" is a fashionable buzzword these days. It refers to the practice at many companies (especially Internet companies) of collecting insanely massive data sets by permanently storing pretty much everything. Google, for example, stores nearly everything anyone ever does on any Google website and any site which uses Google advertising or analytics. That's a lot of data.

Companies do this not to be creepy (though it certainly is that), but because they believe they can use this massive data set to tease out patterns of user behavior. More data equals more insights, right?

Nassim Taleb published an editorial in Wired a few days ago called "Beware the Big Errors of Big Data." There are a few problems with the "let's throw more data at it" approach to analysis:

  • First, no data set is perfect. Even Google's online panopticon is rife with missing data and errors, because it can't perfectly connect online actions to the individuals behind them. A recent study showed that the great-granddaddies of Big Data, the credit bureaus, have significant mistakes (i.e., errors bad enough to change someone's credit score) on 20% of records. Any large statistical analysis has to be wary that its insights reflect real patterns of human behavior, and not patterns of systematic errors in the underlying data. This can be subtle and difficult to detect.
  • Then there's the data mining problem. The beauty of statistical analysis of very large data sets is it lets us test vast quantities of hypotheses to see whether there's a relationship. The problem is that the more relationships you test, the more false positives you get because of statistical flukes.
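The data mining problem is easy to demonstrate. The sketch below (an illustration of the general multiple-comparisons effect, not of any particular study) "tests" a thousand hypotheses on pure noise: under the null hypothesis, p-values are uniformly distributed, so at a p < 0.05 threshold roughly 5% of the tests will look "significant" even though no real relationship exists:

```python
import random

random.seed(42)  # fixed seed so the sketch is reproducible

def null_p_value():
    # When there is no real effect, the p-value of a test is
    # (ideally) a uniform random draw from [0, 1].
    return random.random()

n_tests = 1000
false_positives = sum(1 for _ in range(n_tests) if null_p_value() < 0.05)
print(false_positives)  # roughly 50 of 1000 null tests pass the threshold
```

The more hypotheses you screen against a big data set, the more of these statistical flukes you harvest, unless you explicitly correct for the number of tests (e.g., with a Bonferroni adjustment).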

That's not to say that Big Data isn't useful, just that it has its limits. By themselves, large data sets only let us establish patterns of correlation between things: "If A happens, B is also likely to happen."

Correlation is the weakest possible relationship between things. It doesn't tell us whether A causes B, whether B causes A, whether A and B are both caused by some other underlying factor C, or whether it's just a coincidence. Establishing that A causes B requires a different kind of data and not just more of the same data: perhaps a randomized trial, or (better yet) a randomized trial with a theory for the underlying mechanism.

So while Big Data is good, it can only go so far. Be aware of its limits.

Great Product, When it Works

When it comes to customer service, most companies have one of two basic attitudes.

The first, and less common, is to take the attitude that customer service is an inherent part of what the company sells, and that having a great product or service requires having great customer service. For example, Apple and Zappos.

The other, more common, attitude is that customer service is an unfortunate cost of doing business, and while customers sometimes need help, everything would be much simpler if they just stopped being so demanding. Chances are your bank or mobile phone company falls into this category.

And then there's Google, which is in a category all its own. At least with some of its products, Google seems to believe that customers don't actually need any customer service. Where some companies have made it difficult to talk to a person, Google has stopped playing games and simply doesn't provide anyone to talk to.

(I should add that my opinion is based on what I've seen of Google's consumer-oriented products--I would assume that they have figured out the necessity of providing a helpdesk for enterprise services like Gmail for businesses.)

Today's dose of confirming evidence comes from Google Voice. Google Voice is a nifty service which lets you set up one phone number which will forward to multiple different phones, provide voicemail transcription, and let you set up some call routing rules. I tried it for a while several years ago, but didn't want to take the risk of porting my phone number to Google and have things go wrong.

That was a good decision, it seems, since Consumerist reported this morning that people porting their phone numbers to Google Voice have been having problems. Callers would get a message that the number has been disconnected, and this has been going on at least since Saturday.

Google Voice apparently provides no customer service options other than an online forum which is not intensively monitored by Google staff. Complaints have been stacking up for two days (including one person who reported his phone number has been out of service for a month) with no response from anyone at Google. The first official response from Google was, as near as I can tell, several hours after the Consumerist article ran. And that was just one staffer posting that he was "investigating" and would report back when he knew more.

This number porting service from Google costs money, so this is not a case of a free service being worth what you paid for it. Google Voice has left paying customers unable to receive incoming phone calls (perhaps for as long as a month), with no obvious way to complain or open a trouble ticket, and no response of any sort from the company until after the problem was written up in a major online media outlet. That's a service level which would have shocked even Ernestine the Telephone Operator.

And stories like this are why, even though I think Google has a lot of great services, I don't trust them for anything really important to me.

Issue 65 of Quality Times is Published

We just published issue 65 of Quality Times, Vocalabs' periodic newsletter. E-mail subscribers should be receiving their copies shortly. In this issue we discuss the just-released Executive Summary data for the National Customer Service Survey in 2012.

As always, I hope you find it interesting and informative.
