One thing I’ve noticed is that server companies, when emailing to request customer feedback on their response to an issue, have cut the number of questions to…
It didn’t use to be that way. I can remember when a request for feedback on a communicated server issue would open a survey that went on for three pages, asking how well the call was answered in the first place, how quickly the issue was resolved, whether I was satisfied with the resolution, whether I’d tell my friends about it, and so on.
All good questions. Way too many, though. I only filled them out if I was annoyed by the resolution because I wanted to get my 2 cents in.
I’m sure I’m not in the minority in how I respond to customer incident feedback surveys – online or offline. The incentive to complete one when ticked off is much higher than when completely satisfied. If you want an accurate reading from both groups, you can’t ask too many questions.
This lesson applies to just about any organization. The email basically asks the customer:
How would you rate the support you received?
- Good, I’m satisfied
- Bad, I’m unsatisfied
If the answer is Bad, then it’s most likely time for human intervention.
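As a sketch, the triage behind that one-question email can be this simple (the function name and return values here are hypothetical, just to illustrate the flow):

```python
def triage_feedback(answer: str) -> str:
    """Route a one-question survey answer (hypothetical helper).

    "Good" -> close the ticket automatically.
    "Bad"  -> flag the ticket for human follow-up.
    """
    if answer.strip().lower() == "bad":
        return "escalate_to_human"
    return "auto_close"

# A "Bad" reply is the only signal that needs a person in the loop.
print(triage_feedback("Bad"))   # escalate_to_human
print(triage_feedback("Good"))  # auto_close
```

Everything beyond that binary split – why the customer was unhappy, what to do about it – is better handled by the human conversation that follows, not by more survey questions.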
Keeping the inquiry short and simple provides a higher likelihood of a response, and in this particular situation response rate is much more important than depth of response. We can all learn from the server companies – they have plenty of experience with this.