The Customer Service Survey

Vocalabs' Blog


Our Valued Customers

Never publicly acknowledged, it can be found in many, if not most, call centers. Passed furtively from employee to employee over the years, it might get dragged out when the office party gets really wild.

Yes, I'm talking about the secret recording of the "best" customer calls. And by "best," of course, I really mean "entertainingly worst."

Our Valued Customers takes it a step further. Tim Chamberlain works at a comic book store, and he takes it upon himself to illustrate and save for posterity many of the weird and wacky things he hears customers saying.

Just one more reason to always be polite, just like your mother taught you.

Delta Says: Please Take This Survey, Even Though We Won't Read It

Sunday morning I flew Delta from Minneapolis to Atlanta for a conference.

Delta cares enough about my opinion of their airplanes that on Tuesday afternoon--over two days after I stepped off the flight--they e-mailed me an invitation to take a survey. Delta's invitation read, in part:

Your feedback on this experience is important to us. We thank you in advance for your input.

Followed with:


In other words, if you want someone to actually read what you write and take some action, it's not going to happen here.

I give Delta credit for a certain level of honesty with its customers, but I really have to wonder:

  • Why would a customer do a survey which the company acknowledges up front (in all caps no less) won't result in any response?
  • Why would a company spend the money (however small) to perform a survey which they can't respond to?

The survey itself is even weirder. There were ten questions plus a comment area, and all the questions were about the cleanliness of different parts of the airplane. I was asked to rate the condition of the carpet, walls, reading lights, lavatory, etc., etc., but there is not a single question about customer service, the boarding process, or even how I feel about Delta overall.

I'm truly at a loss as to what Delta hopes to get out of this. The only theory I have so far (and it's a bit farfetched) is that Delta cut back on cleaning its airplanes, and some executive wants to know how filthy they are. Rather than doing the obvious, like asking the cabin crews, they decided to send customers an e-mail two days after the flight.

So Delta, here's my feedback, even though I know you're not really paying attention: The cabin of the plane was fine, or at least not so disgusting that I would remember it two days later. But my wife will tell you I'm not always the best person to ask about cleanliness, and anyway, I generally try to suppress my memories of commercial air travel. As for the rest of the experience, nobody was actively rude to me. On the other hand, nobody did anything to mitigate the general unpleasantness of waking up before dawn on a Sunday just to spend two hours locked in a pressurized aluminum tube, unable to move more than an inch in any direction. So let me know when you've got your priorities straightened out, and maybe I'll stop trying to avoid flying Delta.

Here's How to Lose a Customer

At home, I get my Internet service from Comcast. In my neighborhood the only other major option is CenturyLink, which is slower but cheaper. I've had this cable modem service for well over a decade (it started out as Roadrunner, but that's a different story), and since I rarely have needed to call for service or support, I am an extremely profitable customer. I would guess that over the years, I have generated in excess of $5,000 in profit and free cash flow for my cable modem provider.

Here's a set of step-by-step instructions Comcast could follow to lose my very profitable business, if they wanted to:

  1. Upgrade the network to all-digital, requiring subscribers like me to get a digital adapter for our basic cable subscriptions.
  2. Send out a letter informing me that I need to get a free digital adapter. Provide a web site and code to use.
  3. Ensure that no matter what I do, entering the code into the website generates an error.
  4. Ensure that the "online chat" for support on the website doesn't work.
  5. Send another letter a month later. Make sure the web site and chat still don't work.
  6. In fact, send out four or five letters at roughly one month intervals to make sure I have a stack of them for reference while the website remains broken.
  7. When the deadline gets uncomfortably close and I decide to brave phone support, hang up on me when the system tries to transfer me to an agent.
  8. When I persist and finally get through to an agent, make sure that the agent cannot help because the account is in my wife's name and she's not home at the moment.
  9. After hanging up with the agent, call me back for an automated survey. Make sure I have to wade through over five minutes (twice as long as the customer service call itself) of robo-questions before being allowed to leave a message describing my actual experience. Do everything possible to ensure I have zero confidence anyone will listen to my message.

So I haven't left Comcast yet (but I am exploring options, which I haven't done in years).

But things like this continue to happen at big companies all the time. These systems and processes are clearly broken, almost comically bad. Not only do they frustrate customers (and put significant amounts of revenue at risk), they also make service far more expensive to provide. What should have been a simple self-service transaction costing Comcast almost nothing has devolved into a lengthy multi-step process involving multiple letters, web site visits, phone calls, and now (with this blog entry) bad publicity.

The prescription for a company like this is at once simple and difficult: pay attention to your customers. I have to assume I'm not the only Comcast customer in this situation, yet there are no signs that anybody at the company is paying the slightest bit of attention to everything I've been going through. The signs are all there, if they would only choose to look.

Vocalabs Newsletter Published

Issue 66 of Quality Times, our regular online newsletter, has been published. In this issue we discuss integrating live customer interviews into a Salesforce workflow, and the hazards of forcing customers to choose an option in a survey. E-mail subscribers should be getting their copies shortly.

Integrating Customer Interviews into Salesforce Workflows

I'm pleased to announce the availability of Vocalabs' immediate live customer interviews integrated into Salesforce workflows.

This means that any event in Salesforce can trigger an immediate call from one of Vocalabs' professional interviewers to get customer feedback. You can trigger a survey when a customer calls for service, when a trouble ticket is closed, after an installation is complete, whatever you want. The survey call can happen in as little as three to five minutes, or at a later time if that's more appropriate. You will get real-time data as interviews are completed, complete with interview recordings, alerts and notifications, and our unique interactive reporting tool.

Because this is Salesforce, setup is simple: just add an outbound notification to the Salesforce workflow. We will design an interview script tailored to your unique needs, and manage the survey process from start to finish.

This gives you the simplicity and immediacy of an e-mail or IVR survey, but with the depth and human touch only a live interview can deliver.

I'm very excited about this new service we're offering. Never before has it been so simple to collect such deep, immediate feedback, and deliver it to the places in your organization where it can have the most impact. I hope you agree. Please contact us to talk about how we can get you started today. And I really mean today.

Stop the Net Promoter Madness!

Net Promoter is a trendy way to measure how well a company is doing in its customers' eyes. It uses a single question, "How likely are you to recommend the company to your friends or colleagues?" on a zero-to-ten scale, and subtracts the percentage of respondents answering 6 or below from the percentage answering 9 or 10. Voila, the Net Promoter Score.
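As a minimal sketch of that arithmetic (assuming nothing more than a list of 0-to-10 ratings):

```python
def net_promoter_score(ratings):
    """Compute the Net Promoter Score from a list of 0-10 ratings.

    Promoters answer 9 or 10, detractors answer 6 or below;
    NPS = %promoters - %detractors, expressed on a -100..+100 scale.
    """
    if not ratings:
        raise ValueError("need at least one rating")
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100.0 * (promoters - detractors) / len(ratings)

# Two promoters (10, 9) and two detractors (6, 3) out of six: NPS = 0.
print(net_promoter_score([10, 9, 8, 7, 6, 3]))  # 0.0
```

Note that the 7s and 8s ("passives") count in the denominator but cancel out of the numerator, which is why two surveys with very different passive populations can report the same score.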

The Net Promoter Score has some good things going for it: it's highly standardized, so benchmarking is easy. It's easy to understand and gives you a single number to focus on. And its proponents claim it correlates well with useful things like customer loyalty and word-of-mouth.

But Net Promoter only really measures one thing: the customer's overall relationship with the company. And that's where things start to go off the rails. Many companies actually want to measure a lot of different things: how did that specific customer service call go, how was the salesperson, did the automated system work, and so on.

Rather than develop survey questions to measure those specific things, some companies try to adapt the Net Promoter question to fit. The results are not always pretty.

For example: How likely are you to recommend this customer service representative to your friends or colleagues? You can see the logic for the company trying to standardize on Net Promoter and measure the CSR's performance. The problem is that at most companies the customer has no choice in who they speak to when they call customer service. So there's no point in recommending the agent, and a certain percentage of customers will give a zero on the question for exactly that stated reason. That's not a problem with the CSR, that's a problem with the survey question.

Or worse: How likely are you to recommend this customer service representative's knowledge of products and services to your friends and colleagues? This question, as written, is meaningless. The intent is to rate the CSR on different qualities (knowledge, friendliness, eagerness, etc.), but you can't really recommend a person's particular skill in a vacuum. You can't say to your friend or colleague, "I recommend you talk to Sally's knowledge, Bob's friendliness, and Sarah's efficiency."

Fortunately most people catch on to the fact that these questions should not be taken literally, and that prevents the data from being completely useless.

But if you want customers to interpret your survey question in a way which has them answering a different question than the one you asked, why not just ask the question you want them to answer?

It's much easier to interpret the answer to a straightforward question like, "Please rate the customer service representative's knowledge of products and services."

So while Net Promoter has its place, don't try to fit that round peg into every square, octagonal, or star-shaped hole you encounter. Just ask the question you want customers to answer.

Caution: Big Data Ahead

"Big Data" is a fashionable buzzword these days. It refers to the practice at many companies (especially Internet companies) of collecting insanely massive data sets by permanently storing pretty much everything. Google, for example, stores nearly everything anyone ever does on any Google website and any site which uses Google advertising or analytics. That's a lot of data.

Companies do this not to be creepy (though it certainly is that), but because they believe they can use this massive data set to tease out patterns of user behavior. More data equals more insights, right?

Nassim Taleb published an editorial in Wired a few days ago called "Beware the Big Errors of Big Data." There are a few problems with the "let's throw more data at it" approach to analysis:

  • First, no data set is perfect. Even Google's online panopticon is rife with missing data and errors, because it can't perfectly connect the actions of a person to the individual. A recent study showed that the great-granddaddies of Big Data, credit bureaus, have significant mistakes (i.e. bad enough to change someone's credit score) on 20% of records. Any large statistical analysis is going to have to be wary that the insights reflect real patterns of human behavior, and not patterns of systematic errors in the underlying data. This can be subtle and difficult to detect.
  • Then there's the data mining problem. The beauty of statistical analysis of very large data sets is it lets us test vast quantities of hypotheses to see whether there's a relationship. The problem is that the more relationships you test, the more false positives you get because of statistical flukes.
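A quick simulation makes the data mining problem concrete: test enough hypotheses that are false by construction, and some will still look "significant" purely by chance. (The sample sizes and thresholds below are illustrative, not from any real study.)

```python
import random

random.seed(42)  # fixed seed so the run is repeatable

def looks_significant(n=1000, threshold=1.96):
    """Test one null 'hypothesis': is this fair coin biased?

    Flips a genuinely fair coin n times and reports True when the
    z-statistic crosses the conventional 5% significance threshold.
    By construction, every True is a false positive.
    """
    heads = sum(random.random() < 0.5 for _ in range(n))
    z = (heads - n * 0.5) / (0.25 * n) ** 0.5
    return abs(z) > threshold

# Test 200 hypotheses, all of them false.
false_positives = sum(looks_significant() for _ in range(200))
print(false_positives)  # typically around 10 -- roughly 5% of 200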

That's not to say that Big Data isn't useful, just that it has its limits. By themselves, large data sets only let us establish patterns of correlation between things: "If A happens, B is also likely to happen."

Correlation is the weakest possible relationship between things. It doesn't tell us whether A causes B, whether B causes A, whether A and B are both caused by some other underlying factor C, or whether it's just a coincidence. Establishing that A causes B requires a different kind of data and not just more of the same data: perhaps a randomized trial, or (better yet) a randomized trial with a theory for the underlying mechanism.

So while Big Data is good, it can only go so far. Be aware of its limits.

Great Product, When it Works

When it comes to customer service, most companies have one of two basic attitudes.

The first, and less common, is to take the attitude that customer service is an inherent part of what the company sells, and that having a great product or service requires having great customer service. For example, Apple and Zappos.

The other, more common, attitude is that customer service is an unfortunate cost of doing business, and while customers sometimes need help, everything would be much simpler if they just stopped being so demanding. Chances are your bank or mobile phone company falls into this category.

And then there's Google, which is in a category all its own. At least with some of its products, Google seems to believe that customers don't actually need any customer service. Where some companies have made it difficult to talk to a person, Google has stopped playing games and simply doesn't provide anyone to talk to.

(I should add that my opinion is based on what I've seen of Google's consumer-oriented products--I would assume that they have figured out the necessity of providing a helpdesk for enterprise services like Gmail for businesses.)

Today's dose of confirming evidence comes from Google Voice. Google Voice is a nifty service which lets you set up one phone number which will forward to multiple different phones, provide voicemail transcription, and let you set up some call routing rules. I tried it for a while several years ago, but didn't want to take the risk of porting my phone number to Google and have things go wrong.

That was a good decision, it seems, since Consumerist reported this morning that people porting their phone numbers to Google Voice have been having problems. Callers would get a message that the number had been disconnected, and this has been going on at least since Saturday.

Google Voice apparently provides no customer service options other than an online forum which is not intensively monitored by Google staff. Complaints have been stacking up for two days (including one person who reported his phone number has been out of service for a month) with no response from anyone at Google. The first official response from Google was, as near as I can tell, several hours after the Consumerist article ran. And that was just one staffer posting that he was "investigating" and would report back when he knew more.

This number porting service from Google costs money, so this is not a case of a free service being worth what you paid for it. Google Voice has left paying customers unable to receive incoming phone calls (perhaps for as long as a month), with no obvious way to complain or open a trouble ticket, and no response of any sort from the company until after the problem was written up in a major online media outlet. That's a service level which would have shocked even Ernestine the Telephone Operator.

And stories like this are why, even though I think Google has a lot of great services, I don't trust them for anything really important to me.

Issue 65 of Quality Times is Published

We just published issue 65 of Quality Times, Vocalabs' periodic newsletter. E-mail subscribers should be receiving their copies shortly. In this issue we discuss the just-released Executive Summary data for the National Customer Service Survey in 2012.

As always, I hope you find it interesting and informative.

Why Online Customer Service is Better Than Phone

Online customer service may not actually provide better service than the phone...but when you get hung up on, you get to keep the proof.

From Consumerist.

Are IVR Polls Cheating?

Predicting the outcome of an election is one of the hardest things a commercial survey company is likely to do. It's hard to predict people's future behavior, and with election polling there's always a definite outcome at the end. Failure is public and obvious. In the last election, Gallup clearly lost some of its mystique by predicting Romney would win.

IVR (aka robocall) surveys are some of the most challenging surveys to get right. Customers hate them, it's hard to get participation, self-selection bias is high, and in some states they're illegal. Pretty much the only reason to use an IVR survey is that it's much cheaper than a live interview.

Despite these challenges, IVR surveys have had a pretty good track record predicting elections--about as good as live interview surveys, at any rate.

But some new research suggests that the companies running IVR surveys may be fudging their numbers in political polling to make them look better (via Kevin Drum). A statistical analysis of 2012 primary election surveys shows that IVR surveys had about the same error rate as human interviews--but only if there was also a live-interview survey conducted on the same race before the IVR survey. If there was no live-interview survey, the IVR surveys had much larger errors in predicting the election outcome.

The obvious interpretation of this is that the people publishing the IVR surveys were fudging their numbers to better match the previously published live-interview surveys. Or, since nobody wants to stick their neck out, they might have simply chosen to not publish surveys which were too far out of line with what other surveys were showing.

Another possible interpretation is that the contests where there was no human polling were generally not interesting enough to bother fielding large surveys--in which case a larger error would be expected.

I am not a fan of IVR surveys in general, and I would expect that they would have higher error rates than human interviewers--the more surprising result to me would be if IVR surveys were truly just as accurate as live surveys. Nevertheless, someone has some explaining to do.

Does $50M in Marketing Overcome Bad Customer Service?

Ira Kalb reports today in BusinessInsider (via Consumerist) that Time Warner Cable is spending $50 million on a marketing campaign to try to win back customers who have left the company over the past few years. Kalb thinks a big part of the reason why Time Warner was losing customers in the first place is because the company has garnered such a terrible reputation for bad customer service.

I don't know if this is true or not, but it certainly seems plausible to me. At Vocalabs we pulled the plug on Comcast for Internet service last year, and customer service was a big part of the reason. We're actually spending a couple bucks more each month to work with a different ISP, a local company lacking Comcast's poor reputation.

I've often said that the most cost-effective marketing dollar is usually spent on improving customer service, but it rarely gets demonstrated in such stark terms. $50M is a lot of money to spend winning back customers (if it even works), but Time Warner has been losing a lot of customers. Perhaps if they had spent some of that money a few years ago to improve their service, the story would be different today. 

Finding a way

Corporate policies sometimes clash in unexpected ways. A new mobile phone comes with a warranty for a short period of time which will get you a replacement phone if a manufacturing defect shows up. You also have the option of buying insurance, which will replace the phone if it's lost, stolen, or broken.

But if a manufacturing defect doesn't appear until after the warranty expires, the insurance won't help and you're stuck. This strikes most people as weird, if not completely irrational.

What to do when faced with corporate irrationality? There are creative solutions.

The 30-Day Month

Over the weekend I noticed an ad in our local newspaper for WalMart's new prepaid iPhone plans. The headline announced in 4-inch high letters and retina-searing colors that you could get a new iPhone 5 and pay just "$45/Month*" for the service. As anyone with the consumer savvy of a third-grader knows, in an ad like this the most interesting part inevitably comes after the asterisk.

The asterisk led to some tiny type at the bottom which disclosed that, for purposes of this offer, a month is 30 days.

Big deal? Maybe. It depends on how you feel about sneaky price increases.

Because the fact is that an average month is not 30 days. There are 12 months in a 365-day year, but only 360 days in 12 30-day periods. That five-day difference is equivalent to about a 1.4% price increase, because in any given year about one in six customers actually pays for 13 "months" during the 12-month period.

So on average, the "$45/month" plan actually costs about $45.62 per calendar month. Most customers will pay twelve times $45, or $540 in any given year. But some will pay $585.
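The arithmetic above is easy to check in a few lines (using the $45 price from the ad):

```python
DAYS_PER_YEAR = 365
PLAN_DAYS = 30
advertised_monthly = 45.00

# A "30-day month" plan renews 365/30 ~= 12.17 times per calendar year.
renewals_per_year = DAYS_PER_YEAR / PLAN_DAYS
yearly_cost = advertised_monthly * renewals_per_year
effective_monthly = yearly_cost / 12
hidden_increase = renewals_per_year / 12 - 1

print(effective_monthly)      # about 45.62 per true calendar month
print(100 * hidden_increase)  # about 1.4 percent
```

The same ratio, 365/360, applies to any "30-day month" plan regardless of the sticker price, which is why the hidden increase is always about 1.4%.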

Interesting, yes, and kind of an obnoxious anti-consumer move. But it made me curious, since I'm on a T-Mobile prepaid plan and T-Mobile charges for calendar months as most people would expect. My plan renews on the same day each month.

With a little research I found that T-Mobile, Virgin Mobile, and Boost Mobile all charge for calendar months. AT&T, Verizon, and WalMart's StraightTalk are all based on 30-day "months" and therefore have the hidden 1.4% price increase. On their websites, Verizon and StraightTalk describe their plans as "30-day" plans and not "monthly," so at least they technically disclose the plan even if most consumers don't do the math to figure out the implications.

The special raspberry award goes to AT&T, which describes its prepaid plans as "monthly." I had to delve into the legalese to find a clear indication that AT&T's "month" is not the same as yours and mine.

Why do they do this? Undoubtedly, AT&T, Verizon, and WalMart have some PR-friendly explanation about how short-changing customers by 1.4% is somehow in the customer's interest. I think it's much more likely that they thought they could get away with it. For a $50/month prepaid plan, this amounts to about $8/year per subscriber on average. Multiply by millions of prepaid subscribers and it starts to look like real money.

Maybe this isn't the most egregious way mobile phone companies sneak extra charges onto your bill. But it is one of the more subtle. So shame on them. And buyer beware.

Vocalabs Newsletter 64

Issue 64 of our newsletter, Quality Times, has been published. In keeping with the holiday spirit, this month we are publishing a memo leaked to us from a source at the North Pole about some important changes to the naughtiness metrics.

Forcing A Response

One of the more-abused practices in survey design is forcing a response to a particular question.

You've probably experienced this: when doing a survey (usually online), you leave a question blank and try to continue. Rather than going to the next question, the survey highlights the blank question and requires you to answer before continuing.

In theory this might be OK if you were highly confident that the list of options you gave covered every possible customer and situation. Even then, I think there are legitimate reasons why someone might decline to answer almost any survey question.

In the real world, of course, this is not what happens. All too often there's something the survey writer didn't think of, and the customer is forced to answer a question she feels she can't. The customer will do one of two things: abandon the survey (which is bad), or make something up (which is worse). Either way, the survey isn't getting the data you wanted.

I encountered this just today. I was taking a customer survey for an organization I've done business with in the past few years, and I did have some feedback to offer--specific products I would like to see them offer in the future. About the third question on the survey asked me which products I'd purchased in the past two years, and offered a list of about 20 different products. The stuff I'd actually bought wasn't on the list.

Naturally there was no option for "some other product," and the survey wouldn't let me continue without selecting something.

In this case, I abandoned the survey. In the past I've been known (to my eternal shame) to just make stuff up in order to be given the chance to give the company a piece of my mind.

This is why I think that forcing the customer to respond to a particular survey question is almost always bad practice. You may decide to discard some surveys after the fact because the customer skipped too many questions--but at least give the customer the opportunity to provide what feedback she wants.

Pretty Good Practice: Tie Feedback to Future Behavior

One Pretty Good Practice I want to see all our clients adopt is tying customer feedback to future behavior.

This involves keeping customer survey records active, and updating them as the customer makes new buying decisions. For example, when a customer who did a customer survey in the past buys a new product, closes an account, upgrades, etc., the new transaction is added to the old survey record.

That makes the customer survey not just a snapshot of the customer's opinions at one moment in time, but an ever-expanding longitudinal data set tracking how customer opinions affect future behavior.

Once this data set exists, it's straightforward to analyze it to determine strategic drivers like just how much loyalty a bad customer service experience costs, which customers might be ripe for a new purchase, and what early warning signs to look out for in a customer who is considering leaving.
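A minimal sketch of the core idea--attaching later account events to earlier survey records--using made-up field names and records:

```python
from datetime import date

# Hypothetical records: earlier survey responses and later account events.
surveys = [
    {"customer": "C1", "date": date(2012, 3, 1), "service_rating": 2},
    {"customer": "C2", "date": date(2012, 3, 5), "service_rating": 9},
]
events = [
    {"customer": "C1", "date": date(2012, 9, 1), "type": "closed_account"},
    {"customer": "C2", "date": date(2012, 8, 15), "type": "upgraded"},
]

# Attach each subsequent event to the customer's earlier survey record,
# turning a one-time snapshot into a longitudinal record.
for survey in surveys:
    survey["later_events"] = [
        e["type"] for e in events
        if e["customer"] == survey["customer"] and e["date"] > survey["date"]
    ]

# Now simple strategic questions become answerable from the data:
# did low service ratings precede account closures?
churned_ratings = [s["service_rating"] for s in surveys
                   if "closed_account" in s["later_events"]]
print(churned_ratings)  # [2]
```

A real implementation would live in a CRM or data warehouse rather than in-memory lists, but the join is the same: survey records keyed by customer, updated as new transactions arrive.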

These are things most companies can only guess and estimate. Having this hard data allows making much smarter decisions about where to allocate resources, knowing with precision what customers like to complain about and what really drives their behavior.

On This Day in History

Is your customer survey still asking the same questions as five years ago?

Five years ago, Twitter was a startup and Facebook had only recently opened to the public at large. George W. Bush was president, and the iPhone had only been on the market for a few months. Chances are your business is very different than it was five years ago.

I regularly talk to companies whose customer feedback programs have not changed at all in five or ten years--sometimes even longer. Ten years ago Google was still a startup--it's hard to believe that any business process developed then and unchanged since is meeting today's needs.

The usual excuse is that it's important to maintain continuity in the survey program. And that's true, up to a point. But if the program doesn't change and evolve with the business, it quickly becomes irrelevant.

As your business needs change, your customer feedback process has to change with it. That means adding questions as needed and removing them when irrelevant, changing who is asked to participate and where the data goes, and regularly reviewing what's working and what isn't.

This has to happen continuously. If a hurricane hits your distribution center, you need the flexibility to ask about how that event is affecting your customers and their expectations of you. When you have a product launch or promotion, the customer feedback process needs to be collecting specific, actionable data about that event.

Customer feedback should not be a passive measurement tool. Ideally, it is a dynamic business process which informs everything else in your company.

Random Randomness

On survey questions where there's a list of choices and you ask the participants to choose the option they prefer, people are somewhat more likely to pick the first or last choice on the list. So, for example, if I ask a question like:

Choose Your Favorite Color:

  • Red
  • Yellow
  • Green
  • Blue

the question would be biased towards Red and Blue being favorite colors.

It's sometimes good practice to randomize the list of choices in order to compensate for this bias. So one person might get Yellow, Blue, Green, Red; someone else might get Blue, Green, Red, Yellow; and so forth.
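In code, the randomization is little more than a shuffle of a copy of the canonical list--a sketch, assuming a survey backend that builds each participant's page separately:

```python
import random

CHOICES = ["Red", "Yellow", "Green", "Blue"]

def presented_order(choices):
    """Return a fresh random ordering for one survey participant.

    Shuffling a copy (rather than the canonical list) keeps the stored
    answer codes stable while spreading first/last-position bias
    evenly across the options.
    """
    order = choices[:]        # don't mutate the canonical list
    random.shuffle(order)
    return order

print(presented_order(CHOICES))  # e.g. ['Blue', 'Green', 'Red', 'Yellow']
```

The important detail is recording each answer against its canonical code, not its on-screen position, so responses remain comparable across participants who saw different orderings.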

Of course you don't want to do this all the time. In particular, if the choices have a "natural" order, it's best to present them in that order to make it easier to find the choice you're looking for. It turns out that in practice, most survey questions do have a natural order to the responses--we all love our Likert scales!--and so you don't often see a survey where the choices are randomized.

In particular, don't do this:

Really. Just don't do it.

(From Daily WTF)

Vocalabs Newsletter 63 is Published

Issue 63 of our newsletter, Quality Times, has been published and sent to e-mail subscribers. The Vocalabs team experienced a surprising number of customer service snafus in the past couple months, so this issue has a couple of our stories along with some of the lessons we take from them.

As always, I hope this is informative and useful.

The View From My Window

In keeping with the Thanksgiving holiday, here's a picture from a couple years ago at about this time of year, taken from my office window at Vocalabs World HQ. The picture quality is poor since all I had available at the moment was a crummy cellphone camera.

The office park we're in abuts a nature preserve, so we're used to having visitors like these wild turkeys from time to time. They are impressive birds in person, especially when they take to the air. It's hard to believe something so big can get off the ground so fast.

Enjoy your Thanksgiving!

Pretty Good Practice: Real-Time Survey Notification

Every time a customer survey is completed, notify relevant employees about the new feedback. For example, front-line supervisors may get notified any time a survey is completed about one of their employees; and a service-recovery team is notified about customers who had a bad experience.

This technique brings immediate feedback to the people who can act upon it. When the customer survey reveals something the company needs to take action on, getting the information to the right team right away helps ensure the customer is properly taken care of. For front-line employees, each survey is a coaching opportunity to either reinforce positive behavior or discourage negative behavior.

In most companies, sending an e-mail with the survey information is perfect. E-mail doesn’t require any new technology or infrastructure, and most people are already paying attention to it.
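Since e-mail is the delivery channel, the plumbing can be very thin. Here's a minimal sketch, where the survey field names, addresses, and local SMTP relay are all assumptions for illustration:

```python
import smtplib
from email.message import EmailMessage

def build_notification(survey):
    """Build the notification e-mail for one completed survey.

    The survey dict fields (supervisor_email, employee, rating,
    comments) are hypothetical names for this sketch.
    """
    msg = EmailMessage()
    msg["From"] = "surveys@example.com"
    msg["To"] = survey["supervisor_email"]
    msg["Subject"] = f"New survey about {survey['employee']}: {survey['rating']}/10"
    msg.set_content(survey["comments"])
    return msg

def notify_supervisor(survey, smtp_host="localhost"):
    """Send the notification through a local SMTP relay (assumed)."""
    with smtplib.SMTP(smtp_host) as server:
        server.send_message(build_notification(survey))
```

Calling `notify_supervisor` from whatever event fires when a survey completes is usually all the integration required--no new dashboards or infrastructure.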

The power of this technique is that it allows customer-facing employees to get their feedback right away, without having to wait for a summary report, and while the customer interaction is still as fresh as possible.

It can be overdone, though. If someone is getting more than a handful of notifications per day, it’s probably best to prune them down to the most important, and send the rest as a summary report. With too many notifications, people will either ignore them or become unproductive because of the interruptions.

Service Recovery Done Wrong

"Service Recovery" is the fancy term for making things right when they go wrong. It's a powerful tool for improving brand loyalty when done right, but done wrong it can backfire.

Last week I had to travel to Las Vegas for a conference. I stayed at the conference hotel, the New York New York--not the swankiest place in Vegas by any stretch, but certainly a respectable hotel for a business traveler.

There was only one problem: by the time I arrived at midnight, the hotel was out of nonsmoking rooms in the room type I had reserved. I'm not a very demanding business traveler: I really only insist on two things, basic cleanliness and a nonsmoking room. In the year 2012 in the United States, these are things every traveler should expect at any hotel.

Normally this is not a problem. I've had this happen to me dozens of times over the years, and the hotel just upgrades me to a nonsmoking room in some other room type. But apparently this is not the policy at the New York New York: they made me pay for an upgrade to a fancy suite as the only way to have me and all my belongings not smell like smoke in the morning. The check-in agent offered no other options (or even much sympathy), so I grudgingly paid for the upgrade.

To step back for a moment, by this point in my customer experience the hotel had earned a black mark in my book. I don't care so much about the dollars, but I do care that they don't consider a nonsmoking room important for guests who want one. Since it is important to me, that's reason enough not to go back (especially with tens of thousands of other hotel rooms within walking distance). But as a customer I'm still recoverable, if the hotel recognizes its mistake and makes amends.

So the next morning I related my story to one of the conference organizers and several other attendees--all of whom were just as surprised as I was. The conference organizer passed this along to the hotel management, and the next day I got a call from one of the guest relations people.

Now we are in the Service Recovery phase. The hotel has been informed that they have an unhappy customer, and they are reaching out to me to try to make it right. Research has shown that taking care of a customer's problem properly will actually produce a more loyal customer, while failing to take care of the problem will produce a disloyal customer likely to spread negative word-of-mouth. So it's important to do this right, and convert that upset customer into a brand advocate.

The representative from the hotel was very polite, listened to my problem, and agreed that I should not have had to pay extra just to get a nonsmoking room. And then she offered to refund one night of the two-night upgrade fee.

Wait, what?

If she agreed that this should not have happened, then why is the hotel only offering to fix half the mistake?

The issue in my mind was never the money so much as the fact that a nonsmoking room is, for me, a basic amenity like clean sheets. From the hotel's response, I can only conclude that my initial conclusion--that they don't think a nonsmoking room is all that important--was correct.

So rather than convince me to return, the hotel actually reinforced my initial negative impression. And like the good Detractor that I am, I am telling everyone--including you, my reader--the same thing: If you care about getting a nonsmoking room, don't stay at the New York New York. Because they don't care, and their actions speak very loudly.

And that is how you do service recovery the wrong way.

A Plethora of Publications

Yesterday we published our latest National Customer Service Survey report, a cross-industry report on the relative strengths and weaknesses of customer service at the eleven companies we follow.

Today we published issue 62 of our newsletter, Quality Times. This issue discusses both the NCSS report and Agile Customer Feedback.

I hope you find these interesting and informative, and please feel free to contact us if you want more information.

Vocalabs Designated Bicycle Friendly Business

I'm pleased to be able to announce that Vocalabs has been designated a Bicycle Friendly Business at the "Bronze" level.

The majority of employees here commute to work by bicycle on a regular basis during the summer months. Winter is a different matter here in Minnesota, but Rob is dedicated (some might say crazy) enough to ride every month of the year.

The League of American Bicyclists sponsors the Bicycle Friendly Business program, and you can download a complete list of Bicycle Friendly Businesses from their web site.
