By Peter DeHaan, PhD
I group surveys into four categories:
Market Research: The first type of survey is market research. Those who complete the survey have a chance to win some appealing prize. Though enticed by the possible reward, most of the time I can’t complete the survey because I fall outside their target demographic or the survey ends prematurely when I give an unacceptable answer. Because of this I’ve stopped taking market research surveys. Besides, I wonder if anyone actually wins the prize.
The Sales Call: This ploy is a sales pitch disguised as a survey, which is illegal. If you answer their questions correctly – that is, identify yourself as a prospect – you receive a sales pitch. These surveys, often presented as research, appeal to one’s sense of civic duty or the opportunity to influence some important decision. They’ve duped me too many times, so I skip these surveys, too.
Skewed by Employees: After completing a transaction, the salesperson or customer service rep implores me to take their survey, usually in a most emphatic manner. Often they imply that they could get in trouble if I don’t. Once I agree they tell me how to answer. “Make sure you give me all fives,” they say. “Anything less – even a four – is a failure.” Their bonus or even their job is at stake. Will I help them?
They state their plea so masterfully that it’s hard not to comply. But their effort to game the system disgusts me. My wish is to respond with all ones. My wife says that’s a terrible thing to do; I say it’s terrible for employees to mislead their employer by steering the results.
Company Centric: The final type of survey is also a customer service evaluation, but its questions seldom truly address the customer; instead they focus on the company.
Many common questions – such as hold time, speed to answer, first call resolution, agent courtesy, and so forth – appear to address customer service but fail to do so. In reality, these feed into some corporate metric assumed to measure customer service. Call centers can achieve statistical goals yet still provide poor service.
If I care about the organization, I sometimes complete these last two types of surveys. I want to help them become better. However, don’t ask if I was placed on hold, had to wait too long, needed to make multiple calls, or am willing to recommend them. Simply ask if they fully addressed the reason for my call. The next item should ask whether I’m happy with how they served me. Don’t assume that certain metrics address this; just ask if I’m pleased. For the final item, provide an option for additional comments. Surveys imply a desire to hear what customers think, so they should provide an opportunity to share.
So here’s the survey I’d like to take but haven’t seen yet:
1) Did we accomplish the reason for your call?
2) Are you pleased with how we did?
3) Do you have anything to add?
Thanks for asking.
Peter DeHaan is the publisher and editor-in-chief of AnswerStat magazine and a passionate wordsmith. Connect with him on his blogs, social media, and newsletter, all accessible at www.authorpeterdehaan.com.
[From AnswerStat – October/November 2015]