http://www.verintblog.com/index.php/tag/ken-bernhardt
Recently, Oscar Alban hosted a webinar with Ken Bernhardt focused on customer surveys and Net Promoter Scores. There were well over 250 attendees, and the discussion generated a number of good questions. Three of those questions, along with their answers, are below.
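For readers new to the topic, the standard Net Promoter Score calculation (not covered in detail in the webinar excerpt below; the sample ratings here are purely illustrative) takes "how likely are you to recommend us" ratings on a 0–10 scale and subtracts the percentage of detractors from the percentage of promoters:

```python
# Standard NPS formula: % promoters (ratings 9-10) minus % detractors (0-6),
# with passives (7-8) counted in the total but not in either group.
def nps(ratings):
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100 * (promoters - detractors) / len(ratings)

# Illustrative sample: 4 promoters, 2 passives, 2 detractors out of 8.
print(nps([10, 9, 8, 7, 6, 3, 10, 9]))  # → 25.0
```

The result ranges from -100 (all detractors) to +100 (all promoters).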
Oscar: Can you give an example of obtaining a “level of effort” score; i.e., what would the question(s) be and what would the selected responses be?
Ken: The objective behind the Customer Effort Score (CES) is to understand how hard it is to do business with your company. Most of the questions are based on a 5-point scale, where 1 is Very Low and 5 is Very High; there are some yes/no questions as well. Ideally, you would rotate your surveys so that these questions are sprinkled among them. Here are some examples of the questions:
How much effort did you personally have to put forth to handle your request?
How easy or difficult was it to complete the required processes during the interaction?
How easy or difficult was it to do business with the person you spoke with (e-mailed with)?
Did you attempt to handle your request through self-service before contacting us directly?
If you tried to handle your request prior to speaking to a live person or e-mailing the company, how easy or difficult was it to use the web site?
How easy or difficult was it to reach the person able to handle your request?
Did you have to re-explain or repeat yourself when transferred?
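Aggregating the 1–5 responses from questions like these might look like the following sketch (the scoring helper and sample data are assumptions for illustration, not something prescribed in the webinar):

```python
# Hypothetical sketch: averaging 1-5 CES responses, where 1 is Very Low
# effort and 5 is Very High effort -- so a lower average is better.
def average_ces(responses):
    """Return the mean Customer Effort Score from a list of 1-5 ratings."""
    valid = [r for r in responses if 1 <= r <= 5]  # drop out-of-range entries
    if not valid:
        raise ValueError("no valid responses")
    return sum(valid) / len(valid)

# Example: mostly low-effort ratings with one difficult interaction.
scores = [2, 1, 3, 2, 5, 1, 2]
print(round(average_ces(scores), 2))  # → 2.29
```

Tracking this average over time, per channel or per agent, is what makes the effort questions actionable.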
Oscar: What have you found to be the most effective method of surveying–email survey, post call IVR survey, post call live survey, other?
Ken: From a pure response-rate standpoint, IVR surveys are the best. When using IVR-based surveys, there is also the question of how to deliver the survey to the customer. The first way is commonly known as a ‘blind survey’ because the agent is not involved and supposedly can’t influence the customer’s response. In reality, the agent does influence the customer. With this method, you have to train agents to hang up first at the end of the interaction so that the customer can be passed to the survey system. This is contrary to how agents have typically been trained, which is to let the customer hang up first in order to make sure the customer has completed their discussion. What can happen is that the agent mutes their line and waits until the customer, thinking something went wrong, hangs up instead of responding to the survey. This behavior is hard to track. This method typically results in a response rate of 0.2% to 5%.
The other method, which is becoming more popular, is called the warm invitation. Here, the agent takes care of the customer’s issue first and then invites them to provide feedback. By the way, we recommend that you never use the word ‘survey’, because almost everyone has had a bad experience with a poorly designed survey. At the end of the conversation, the agent simply says something to the effect of, “Mr./Ms. Customer, we value your business. Would you mind taking 60 seconds to respond to 5 questions that would help us improve our service to you?” This method typically delivers a response rate of 20% to 30% or more. The good thing about this method is that you can track which agents are cherry-picking only their best calls for feedback, and that allows us to coach to that behavior.
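A quick back-of-the-envelope calculation shows how much the two response rates quoted above differ in practice (the call volume here is an assumed round number, chosen only for illustration):

```python
# Illustrative arithmetic only: expected completed surveys per 10,000 calls
# at the rates quoted above (0.2%-5% blind, 20%-30% warm invitation).
calls = 10_000

blind_low, blind_high = calls * 0.002, calls * 0.05
warm_low, warm_high = calls * 0.20, calls * 0.30

print(f"Blind survey: {blind_low:.0f}-{blind_high:.0f} responses")
print(f"Warm invitation: {warm_low:.0f}-{warm_high:.0f} responses")
```

At those rates, a warm invitation yields on the order of ten to a hundred times more feedback from the same call volume.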
Oscar: What are your thoughts on the Harvard Business Review article “Stop Trying to Delight Your Customers” and its suggestion of using a Customer Effort Score?
Ken: I think that delighting our customers is key to keeping business going. There is a lot of competition out there ready to offer whatever we may stop doing to satisfy our customers. However, I think we have to establish some boundaries, because some customers may feel entitled to extraordinary treatment, and the effort we make to keep a customer beyond our capabilities can instead be used to capture new customers’ attention and interest in working with us. That effort can translate into new strategies for targeting new customers, based on the experience we record with very demanding established customers. The challenge here is: how do we measure ‘delight’? What is a delightful interaction for one customer may be different for the next, so we need to base this on something more stable that allows us to measure it and take corrective action when needed. Questions around customer satisfaction are good to include in the survey; they help us determine a certain level of delight. However, level-of-effort questions are more pragmatic. Plus, if a customer feels that your company is just too cumbersome to do business with, it is a pretty good assumption that they are not delighted and are at risk of leaving. So my recommendation is to have questions that deal with satisfaction as well as effort.
To listen to the full webinar,