How do you measure customer satisfaction?

By Lovisa Lundin, Customer Experience Manager

Measuring customer satisfaction is the best way to find out what your customers think and feel about your service and products. But how do you actually measure it? What questions do you ask, and to whom do you send the survey? Well, it depends on what you would like to measure. At Klarna, we measure both CSI (Customer Satisfaction Index) and CSAT (Customer Satisfaction). What is the difference? CSI relates to the overall satisfaction with Klarna and our products, whilst CSAT relates to the experience the customer had with Klarna’s Customer Service. As CSAT is my area of expertise, I will give you a deeper insight into how we measure it.

We send our survey to all customers who have contacted our Customer Service through phone, e-mail, chat or social media. The survey is sent two hours after the first contact. We want to send the survey as close as possible to the actual experience the customer has had with us so the memory is fresh and they can give us accurate feedback.

Our survey measures the following KPIs (Key Performance Indicators):

  • The customer’s satisfaction (CSAT) – Measures how satisfied customers are with the service provided by Klarna’s Customer Service
  • How the customer experienced the agent’s knowledge and treatment
  • Contact Resolution (CR) – The share of cases that were resolved. This question measures whether the customer feels that the query is resolved
  • First Contact Resolution (FCR) – The share of queries that were resolved within the first contact. We want to solve as many customer queries as possible in the first contact so customers do not have to contact us again for the same query
  • Customer Effort Score (CES) – The amount of effort it took customers to get their case solved. We measure CES because we want to make our service as smoooth as possible for our customers
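As a rough sketch of how the resolution KPIs above are defined, the shares can be computed from per-response survey records. The field names and sample data here are illustrative assumptions, not Klarna’s actual schema:

```python
# Illustrative sketch of the CR and FCR definitions above.
# Each response records whether the query was resolved, and whether it
# was resolved within the first contact.

def kpi_summary(responses):
    """Compute Contact Resolution (CR) and First Contact Resolution (FCR)
    as shares of the total number of survey responses."""
    total = len(responses)
    resolved = sum(r["resolved"] for r in responses)
    first = sum(r["resolved"] and r["first_contact"] for r in responses)
    return {
        "CR": resolved / total,   # share of cases resolved at all
        "FCR": first / total,     # share resolved within the first contact
    }

# Hypothetical sample of four survey responses
responses = [
    {"resolved": True,  "first_contact": True},
    {"resolved": True,  "first_contact": False},
    {"resolved": False, "first_contact": False},
    {"resolved": True,  "first_contact": True},
]
print(kpi_summary(responses))  # {'CR': 0.75, 'FCR': 0.5}
```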

These KPIs are crucial for us to understand the customer experience. We measure each KPI with a dedicated question in the survey:

We follow the Customer Operations Performance Center’s (COPC) standard for measuring customer satisfaction. COPC uses a five-point scale with neutral as the mid-point. Our scale goes from 1 to 5, with 1 being “very dissatisfied” and 5 being “very satisfied”. In the survey itself we have flipped the scale (see picture below), so it runs from 5 to 1, “very satisfied” to “very dissatisfied”.

We introduced the flipped scale a year ago, after finding out that the majority of respondents answered the survey on a phone or tablet. Our phone and tablet version has a vertical scale (see picture below), whilst the desktop version has a horizontal scale. On a vertical scale running from 1 to 5, “very satisfied” would sit at the bottom, and in the worst case customers with older phones would need to scroll in order to rate us “very satisfied”. To make it easier for our customers to give that rating, we flipped the scale. Customers who are dissatisfied will always find their way to rate us accordingly.

Following COPC standards, satisfaction is calculated as the share of 5s and 4s, whilst dissatisfaction is the share of 1s. The benchmark and best-practice target set by COPC is 85% customer satisfaction (CSAT) and 5% customer dissatisfaction (DSAT). This means that at least 85% of the responses to your survey should be either a 4 or a 5, and less than 5% should be a 1. We have set the same 85% target for the following KPIs: Contact Resolution (CR), First Contact Resolution (FCR), Agent Knowledge & Treatment, and Customer Effort Score (CES).
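The COPC-style scoring described above is simple arithmetic: CSAT is the share of 4s and 5s, DSAT the share of 1s. A minimal sketch, with made-up ratings and the targets quoted in the text:

```python
# Sketch of COPC-style scoring: satisfaction = share of 4s and 5s,
# dissatisfaction = share of 1s. The ratings below are invented sample data.

def score_ratings(ratings):
    """Return (CSAT, DSAT) as fractions of all responses on a 1-5 scale."""
    total = len(ratings)
    csat = sum(r in (4, 5) for r in ratings) / total
    dsat = sum(r == 1 for r in ratings) / total
    return csat, dsat

ratings = [5, 4, 5, 3, 4, 5, 1, 4, 5, 4]
csat, dsat = score_ratings(ratings)
# Compare against the COPC benchmarks mentioned above
print(f"CSAT: {csat:.0%} (target >= 85%), DSAT: {dsat:.0%} (target < 5%)")
# CSAT: 80% (target >= 85%), DSAT: 10% (target < 5%)
```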

Our scales either go from “very satisfied” to “very dissatisfied” or from “strongly agree” to “strongly disagree”. For closed questions like Contact Resolution, we offer a “Yes” or “No” option instead of an answer scale.


Our CSAT survey has been live for years, which has given us the opportunity to A/B-test wordings, tonality and questions, to make sure that we are not biasing our customers and that we are actually asking questions we want answers to. We have changed questions, tonality and scales many times until we found what works best for us. For example, we recently removed the answer “I don’t know” (counted as neutral) from the answer scale for the question “Do you think your query is resolved?”. We want to know the truth: the query was either solved or it was not, so a simple “Yes” or “No” question suffices. An “I don’t know” response tells us nothing. Don’t know about what? You either know, or you don’t. In our experience, an “I don’t know” answer is most likely a “No”, as you would surely know whether your case was solved and whether you received the help you needed.
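When A/B-testing wordings like this, one way to check whether a difference between two survey variants is real or just noise is a two-proportion z-test. A minimal stdlib-only sketch, with entirely hypothetical counts (this is a standard statistical check, not a description of Klarna’s tooling):

```python
import math

def two_proportion_z(success_a, n_a, success_b, n_b):
    """z-statistic for the difference between two proportions,
    using the pooled standard error."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical counts: variant A got 840/1000 "Yes" answers,
# variant B got 805/1000.
z = two_proportion_z(840, 1000, 805, 1000)
print(f"z = {z:.2f}")  # |z| > 1.96 -> significant at the 5% level
```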

Furthermore, you should never ask questions that you will not use for analysis, or questions you already have the answer to. “How did you contact us?”, “What did you purchase?” and “Did you order online, in store or by phone?” are typical survey questions whose answers should already be in your data. Extra questions that bring no value to your business only lead to survey fatigue and can increase your drop-out rate. So do yourself and your customers a favor and do not bore anyone with questions you already know the answers to. Honor the time and effort customers put into taking your survey by keeping it short and limited to the most necessary questions.

So to summarize, how do you measure Customer Satisfaction?

  • Make sure you are measuring the most important KPIs for your business by formulating questions you really need answers to
  • Follow a standard for your survey measurements (such as COPC or ACSI) to more easily find targets and best practices that are in line with the industry
  • A/B test, rephrase, and change the scale until you find a setup that works best for you and tells you the truth about customer satisfaction


Until next time…

Lovisa Lundin