3 reasons why your IT customer satisfaction surveys aren’t working

When done well, transactional IT surveys (IT customer satisfaction surveys issued when a support ticket is closed) are a very powerful and cost-effective way to drive continual service improvement and improve IT customer satisfaction.

By the very unscientific method of straw polls at IT conferences, I estimate that about 30% of corporate IT teams don't do transactional IT surveys at all. 65% do, but they get low response rates (poor customer engagement) and don't use the results to drive lasting improvements. Only 5% survey and reap the benefits.

So why is it that so few IT teams are using transactional surveys to actually improve IT customer satisfaction?

[Pie chart: transactional survey adoption among corporate IT teams – 30% don't survey, 65% survey but see little benefit, 5% survey and reap the benefits]

For the 30% who don't survey at all, it's either because their IT Service Management tool doesn't have a transactional survey capability, or because turning on the survey isn't quick and easy (What questions should we ask? What scale should we use? What do we do with the data? What do all these scary statistical terms mean?). And probably because there's too much firefighting going on to step back and upgrade the hose (apologies to the late Stephen Covey).

But what about the 65% who are issuing surveys when tickets are closed?  Why are they getting single digit response rates? Why aren’t they seeing the IT customer satisfaction improvements that we know are possible? What are they doing (or not doing) that makes their survey efforts nothing more than an annoyance to their customers?

If you're one of the 65% who aren't seeing the benefits of your transactional surveys, it's likely because of one or more of these three reasons:

1. Poor survey design

If you’re not asking the right questions – or your questions are poorly worded (ambiguous or unclear) – you’re not going to get any insights from the results.

For example, one survey I saw asked for a rating for “Provided you with the computer support that you needed”. What does that even mean? How would you interpret the results?

Another design decision that affects the effectiveness of the survey is the rating scale used. Although there's a lot of debate about the pros and cons of various scales, there seems to be general agreement that five-, seven- or eleven-point (Net Promoter) scales are the way to go.

Smaller scales can look good (think happy face vs sad face) but they just aren’t granular enough. And they tend to make your IT customer satisfaction look better than it is. A whole lot of customers having fairly mediocre experiences are still going to click that happy face.
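
To make that concrete, here's a minimal sketch (in Python, with made-up ratings) of how an eleven-point Net Promoter scale is typically scored: the percentage of promoters (9–10) minus the percentage of detractors (0–6). Notice that the "fairly mediocre" 7s and 8s – customers who would still click the happy face on a two-point scale – do nothing to lift the score.

```python
def net_promoter_score(ratings):
    """NPS on a 0-10 scale: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100 * (promoters - detractors) / len(ratings)

# Made-up responses: mostly mediocre 7s and 8s (passives), which count for nothing
ratings = [7, 7, 8, 9, 10, 7, 8, 8, 6, 9]
print(f"NPS: {net_promoter_score(ratings):+.1f}")  # +20.0 despite 9 of 10 "happy faces"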

2. Letting response rates decline

When you spam customers with surveys (send too many), they just start ignoring them.

When the survey takes too long to complete, or doesn't work on their mobile device, they don't bother answering it.

And when you aren't communicating with your customers to show that you're listening and acting on their feedback, they stop responding altogether. My bank regularly asks for my feedback, and I've never once heard any acknowledgement of my issues or seen any improvement in their service. I just stopped answering their surveys – what's the point?

3. Not taking action

If the only thing you use transactional surveys for is to calculate a customer satisfaction measure for a management report, your response rates are probably low or declining and your customer satisfaction won’t be increasing.
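
For reference, that headline measure is trivial to produce – which is exactly why it's not enough on its own. A minimal sketch, assuming a five-point scale where 4s and 5s count as satisfied (the common CSAT convention):

```python
def csat_percent(ratings, threshold=4):
    """CSAT %: share of responses at or above the 'satisfied' threshold."""
    return 100 * sum(1 for r in ratings if r >= threshold) / len(ratings)

print(f"CSAT: {csat_percent([5, 4, 3, 5, 2, 4, 4]):.0f}%")  # 5 of 7 satisfied -> 71%
```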

If you're not doing these, you're not going to reap the benefits of your surveys:

  • Dealing with individual customer issues quickly as they arise (and benefiting from the service recovery paradox).
  • Sharing customer feedback, good and bad, with your support teams.
  • Analysing/mining customer feedback for improvement opportunities (see the sketch after this list).
  • Having CSAT targets in place (rather than being satisfied with the status quo).
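
On the mining point, you don't need sophisticated tooling to get started. Here's a minimal sketch (Python, with hypothetical free-text comments – the data, stop-word list and function name are all illustrative) that counts word frequency across survey comments to surface recurring themes worth investigating:

```python
from collections import Counter
import re

# Hypothetical free-text comments from closed-ticket surveys
comments = [
    "Password reset took three days",
    "VPN kept dropping during the call",
    "Quick fix, but the password reset portal is confusing",
    "Laptop replacement was fast, thanks!",
    "Still can't connect to the VPN reliably",
]

STOP_WORDS = {"the", "a", "to", "is", "was", "but", "and", "during", "for", "kept", "still"}

def top_themes(texts, n=5):
    """Count non-trivial words across comments to surface recurring topics."""
    words = []
    for text in texts:
        words += [w for w in re.findall(r"[a-z']+", text.lower()) if w not in STOP_WORDS]
    return Counter(words).most_common(n)

for word, count in top_themes(comments):
    print(f"{word}: {count}")  # 'password', 'reset' and 'vpn' bubble to the top
```

Even a crude count like this points at candidate improvement areas (here, the password reset process and VPN reliability) that a single CSAT number would never reveal.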

Reward your customers with improvement actions. Your response rates will rise and you WILL see IT customer satisfaction increase.

Do you survey your IT customers after you've resolved their issue or fulfilled their request? How are you using the survey results to drive improvements?

