How to analyze survey data: best practices for actionable insights from survey analysis
Collected all of your survey data? Great. Confused about what to do next and how to get the most out of your survey analysis? Don’t be.
If you’ve ever stared at an Excel sheet filled with thousands of rows of survey data and not known what to do, you’re not alone. Use this post as a guide to best-practice survey analysis in 2019.
Customer surveys can have a huge impact on your organization. Whether that impact is positive or negative depends on how good your survey is (no pressure). Has your survey been designed soundly? Does your survey analysis deliver clear, actionable insights? And do you present your results to the right decision makers? Only if the answer to all three questions is yes can you uncover new opportunities and create innovative strategies.
What is survey analysis?
Survey analysis refers to the process of analyzing your results from customer (and other) surveys. This can, for example, be Net Promoter Score surveys that you send a few times a year to your customers.
Why do you need best-in-class survey analysis?
Data on its own means nothing without proper analysis. Thus, you need to make sure your survey analysis produces meaningful results that help make decisions that ultimately improve your business.
There are multiple ways of doing this, both manual and through software, which we’ll get to later.
Types of survey data
Survey data exists as numerical data and text data, but for the purposes of this post we will focus on text responses.
Closed-ended questions can be answered by a simple one-word answer, such as “yes” or “no”. They often consist of pre-populated answers for the respondent to choose from; while an open-ended question asks the respondent to provide feedback in their own words.
Closed-ended questions come in many forms such as multiple choice, drop down and ranking questions.
They don’t allow the respondent to provide original or spontaneous answers; the respondent can only choose from a list of pre-selected options. Closed-ended questions are the equivalent of being offered milk or orange juice to drink instead of being asked: “What would you like to drink?”
These types of questions are designed to create data that is easily quantifiable and easy to code, so the answers are fixed in nature. They also allow researchers to categorize respondents into groups based on the options they have selected.
An open-ended question is the opposite of a closed-ended question. It’s designed to produce a meaningful answer and create rich, qualitative data using the subject’s own knowledge and feelings.
Open-ended questions often begin with words such as “Why” and “How”, or sentences such as “Tell me about…”. Open-ended questions also tend to be more objective and less leading than closed-ended questions.
How to analyze survey data
How do you find meaningful answers and insights in survey responses?
To improve your survey analysis, use the following 5 steps:
- Start with the end in mind – what are your top research questions?
- Filter results by cross-tabulating subgroups
- Interrogate the data
- Analyze your results
- Draw conclusions
1. Check off your top research questions
Go back to your main research questions which you outlined before you started your survey. Don’t have any? You should have set some out when you set a goal for your survey. (More on survey planning below).
A top research question for a business conference could be: “How did the attendees rate the conference overall?”.
The percentages in this example show how many respondents answered a particular way, or rather, how many people gave each answer as a proportion of the number of people who answered the question.
Thus, 60% of your respondents (1,098 of those surveyed) are planning to return. That’s the majority, even though almost a third are not planning to come back. Maybe there’s something you can do to convince the 11% who are not sure yet!
2. Filter results by cross-tabulating subgroups
At the start of your survey, you will have set up goals for what you wanted to achieve and exactly which subgroups you wanted to analyze and compare against each other.
This is the time to go back to those and check how each subgroup (for example: enterprises, small businesses, the self-employed) answered the question about attending again next year.
For this, you can cross-tabulate, and show the answers per question for each subgroup.
Here, you can see that most of the enterprise and self-employed attendees must have liked the conference, since they want to come back, but you might have missed the mark with the small businesses.
By looking at other questions and interrogating the data further, you can hopefully figure out why and address this, so you have more of the small businesses coming back next year.
You can also filter your results by a specific type of respondent, or subgroup: that is, look at how one subgroup (say, women or men) answered the question, without comparing.
Then apply the cross-tab to drill down further, for example into female enterprise attendees, female self-employed attendees, and so on. Just remember that your sample size shrinks every time you slice the data this way, so check that it is still large enough to be valid.
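If your results live in a spreadsheet export, a cross-tab takes only a few lines of code. A minimal sketch in Python (the subgroups and answers are made up for illustration):

```python
from collections import Counter

# Hypothetical responses: (subgroup, answer to "attending again next year?")
responses = [
    ("enterprise", "yes"), ("enterprise", "yes"), ("enterprise", "no"),
    ("small business", "no"), ("small business", "no"), ("small business", "yes"),
    ("self-employed", "yes"), ("self-employed", "not sure"), ("self-employed", "yes"),
]

# Cross-tabulate: count each (subgroup, answer) pair
crosstab = Counter(responses)

def share(subgroup, answer):
    """Proportion of a subgroup that gave a particular answer."""
    total = sum(n for (g, _), n in crosstab.items() if g == subgroup)
    return crosstab[(subgroup, answer)] / total

print(f"enterprise 'yes': {share('enterprise', 'yes'):.0%}")  # 67%
```

The same `share` function answers both the comparison question (subgroup vs. subgroup) and the filter question (one subgroup on its own).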
3. Interrogate the data
Look at your survey questions and really interrogate them. The following are some questions we use for this:
- What are the most common responses to question X?
- Which responses are affecting/impacting us the most?
- What’s different about this month/this year?
- What did respondents in group Y say?
- Which group of respondents is most affected by issue Z?
- Have customers noticed our efforts in solving issue Z?
- What do people say about Z?
For example, look at questions 1 and 2. The difference between them is that the first returns volume, whereas the second relates that volume to a particular satisfaction score. If a theme is very common, it may still not affect the score. But if, for example, your Detractors in an NPS survey mention a theme a lot, that theme is pulling the score down. These two questions are important to consider hand in hand.
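Counting theme volume within each NPS group makes questions 1 and 2 concrete. A sketch with invented scores and themes (the 9-10 / 7-8 / 0-6 cut-offs are the standard NPS bands):

```python
from collections import Counter

# Hypothetical NPS responses: (score, themes mentioned in the comment)
responses = [
    (10, ["friendly staff"]), (9, ["friendly staff", "pricing"]),
    (8, ["pricing"]),
    (3, ["flight delays"]), (2, ["flight delays", "pricing"]),
    (5, ["flight delays"]),
]

def nps_group(score):
    # Standard NPS bands: 9-10 Promoter, 7-8 Passive, 0-6 Detractor
    return "promoter" if score >= 9 else "passive" if score >= 7 else "detractor"

# Theme volume per NPS group: a theme Detractors mention often is
# likely dragging the overall score down
theme_counts = Counter((nps_group(score), theme)
                       for score, themes in responses
                       for theme in themes)

print(theme_counts[("detractor", "flight delays")])  # 3
```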
You can also compare different slices of the data, such as two different time periods, or two groups of respondents. Or, look at a particular issue or a theme, and ask questions such as “have customers noticed our efforts in solving a particular issue?”, if you’re conducting a continuous survey over multiple months or years.
Analyzing results and drawing conclusions are each a topic in itself. Our best tips for analysis follow below; for best practice on drawing conclusions, see our post How to get meaningful, actionable insights from customer feedback.
4 best practices for analyzing survey data
Make sure you incorporate these tips in your analysis, to ensure your survey results are successful.
1. Ensure sample size is sufficient
To make sure your sample size is sufficient, consider how many people you need to survey in order to get an accurate result.
Most often you will not be able to, and for practical reasons shouldn’t, collect data from everyone you want to hear from. Instead, you take a sample (or subset) of the people of interest and learn what you can from that sample.
Clearly, if you are working with a larger sample size, your results will be more reliable as they will often be more precise. A larger sample size does often equate to needing a bigger budget though.
The way to get around this issue is to perform a sample size calculation before starting a survey. Then, you can have a large enough sample size to draw meaningful conclusions, without wasting time and money on sampling more than you really need.
Consider first how much margin of error you’re comfortable with, as your sample is always an estimate of how the overall population thinks and behaves.
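The calculation itself is standard: the usual sample-size formula for estimating a proportion, with a finite-population correction. A sketch (the population of 10,000 is made up for illustration):

```python
import math

def sample_size(population, margin_of_error=0.05, z=1.96, p=0.5):
    """How many responses you need to estimate a proportion.
    z = 1.96 corresponds to 95% confidence; p = 0.5 is the most
    conservative assumption about how answers will split."""
    n0 = (z ** 2) * p * (1 - p) / margin_of_error ** 2
    # Finite-population correction: small populations need fewer responses
    return math.ceil(n0 / (1 + (n0 - 1) / population))

# e.g. a customer base of 10,000, 95% confidence, +/-5% margin of error
print(sample_size(10_000))  # 370
```

Note how little the required sample grows with the population: surveying 500 people well still takes 218 responses, while 10,000 takes only 370.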
2. Statistical significance – and why it matters
How do you know you can “trust” your survey analysis, i.e. that you can use the answers with confidence as a basis for your decision making? Here, the “significant” in statistical significance refers to how accurate your data is: your results are not based on pure chance, but are in fact representative of the population you sampled. If your data has statistical significance, it means that to a large extent the survey results are meaningful.
It also shows that your respondents “look like” the total population of people about whom you want to draw conclusions.
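One common way to check whether a gap between two subgroups is real or just chance is a two-proportion z-test. A sketch with invented counts (the attendee numbers are hypothetical):

```python
import math

def two_proportion_z_test(yes1, n1, yes2, n2):
    """Two-sided z-test for the difference between two proportions.
    A small p-value suggests the gap is unlikely to be pure chance."""
    p1, p2 = yes1 / n1, yes2 / n2
    pooled = (yes1 + yes2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Normal CDF computed via the error function
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# e.g. 659 of 1,098 enterprise attendees vs 90 of 200 small-business
# attendees plan to return -- is that gap statistically significant?
z, p = two_proportion_z_test(659, 1098, 90, 200)
print(p < 0.05)  # True
```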
3. Focus on your insights, not the data
When presenting to your stakeholders, it’s imperative to highlight the insights derived from your data, rather than the data itself.
Presenting the raw data alone does you a disservice. Don’t wait for your team to create insights out of the data; you’ll get a better response and better feedback if you are the one who demonstrates the insights to begin with, as it goes beyond just sharing percentages and data breakouts.
4. Complement with other types of data
Don’t stop at the survey data alone. When presenting your insights to your stakeholders or board, it always helps to bring in other data points, which might even include personal experiences. If you have personal experience with the topic, use it! If you have qualitative research that supports the data, use it!
So, if you can overlay qualitative research findings on your quantitative data, do so.
Just be sure to let your audience know when you are showing them findings from statistically significant research and when it comes from a different source.
3 ways to code open-ended responses
When you analyze open-ended responses, you need to code them. There are 3 approaches to coding open-ended questions; here’s a taster:
- Manual coding by someone internally. If you receive 100-200 responses per month, this is absolutely doable. The big disadvantage is the high likelihood that whoever codes your text will apply their own biases and simply miss particular themes, because they subconsciously don’t consider them important to monitor.
- Outsource to an agency. You can email the results and they would simply send back coded responses.
- Automating the coding. You use an algorithm to simulate the work of a professional human coder.
Whichever way you code text, you want to determine which category a comment falls under. In the example below, comments about friends and comments about family both fall into the second category. Then, you can easily visualize the counts as a bar chart.
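In code, a simple code frame is just a mapping from category to keywords, and a comment can pick up several codes. A sketch with a hypothetical code frame and invented comments:

```python
from collections import Counter

# Hypothetical code frame: category -> keywords that map a comment to it
code_frame = {
    "Recommended by others": ["friend", "family", "colleague"],
    "Customer service": ["staff", "support", "service"],
    "Price": ["price", "cost", "expensive"],
}

comments = [
    "A friend told me about it",
    "My family recommended it and the staff were great",
    "Too expensive for what you get",
]

def code(comment):
    """A comment gets every category whose keywords it mentions."""
    text = comment.lower()
    return [category for category, words in code_frame.items()
            if any(word in text for word in words)]

# Tally codes across all comments -- ready to plot as a bar chart
counts = Counter(c for comment in comments for c in code(comment))
print(counts["Recommended by others"])  # 2
```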
Code frames can also be combined with a sentiment.
Below, we insert a positive and a negative layer under the customer service theme.
So, next, you apply this code frame. Below are snippets from a manual coding job commissioned from an agency.
In the first snippet, there’s a code frame: under code 1, “Applied courses”, and under code 2, “Degree in English”. In the second snippet, you can see the actual coded data, where each comment has up to 5 codes from the above code frame. Data presented this way is quite difficult to analyze in Excel, but much easier to handle with software.
The best tools for survey analysis
Traditional survey analysis is highly manual, error-prone, and subject to human bias. You may think of it as the most economical solution, but in the long run it often ends up costing you more (due to the time it takes to set up and analyze, the human resource required, and any errors or bias that result in inaccurate analysis and faulty interpretation of the data). So, the question is:
Do you need software?
When you’re dealing with large amounts of data, it is impossible to manage it all properly by hand: there’s simply too much of it, you’re looking to avoid bias, or it’s a long-term study, for example. Then, there is no other option but to use software.
On a large scale, software is ideal for analyzing survey results as you can automate the process by analyzing large amounts of data simultaneously. Plus, software has the added benefit of additional tools that add value.
Below we give just a few examples of types of software you could use to analyze survey data. Of course, these are just a few examples to illustrate the types of functions you could employ.
1. Thematic software
As an example, Thematic’s software solution identifies trends in sentiment and particular themes. It also avoids bias: the software doesn’t over-emphasize or ignore specific comments to come to unquantified conclusions.
Below is an example we’ve taken from the tool, to visualize some of Thematic’s features.
Our visualization tools show far more detail than the word clouds that are more typically used.
You can see two different slices of data: the blue bars are United Airlines’ 1 and 2-star reviews, and the orange bars are the 4 and 5-star reviews. The biggest issue, mentioned most frequently in the 1-2 star reviews, is flight delays, while the 4 and 5-star reviews frequently praise the friendliness of the airline.
You can find more features, such as Thematic’s Impact tool, Comparison, Dashboard and Themes Editor here.
2. Excel
If you’re a DIY analyzer, there’s quite a bit you can do in Excel. Clearly, you do not get the sophisticated features of an online software tool, but for simple tasks it does the trick. You can count the different types of feedback (responses) in the survey, calculate percentages for each response, and generate a survey report with the calculated results. For a technical overview, see this article.
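The same COUNTIF-and-percentage arithmetic is a few lines outside Excel, too. A sketch with made-up responses:

```python
from collections import Counter

# Hypothetical closed-ended responses, as exported from a survey tool
answers = ["yes", "yes", "no", "not sure", "yes", "no"]

counts = Counter(answers)                      # the COUNTIF step
total = sum(counts.values())
percentages = {answer: n / total for answer, n in counts.items()}

print(percentages["yes"])  # 0.5
```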
You can also build your own text analytics solution, and rather fast.
How to build a Text Analytics solution in 10 minutes
The following is an excerpt from a blog written by Alyona Medelyan, PhD in Natural Language Processing & Machine Learning.
As she mentions, you can type a formula like this one into Excel to categorize comments into “Billing”, “Pricing” and “Ease of use”:
It can take less than 10 minutes to create this, and the result is so encouraging!
Everyone loves simplicity. But in this case, simplicity sucks.
Various issues can easily crop up with this approach, see the image below:
Out of 7 comments, only 3 here were categorized correctly: “Billing” is actually about “Price”, and three other comments missed additional themes. Would you bet your customer insights on something that’s at best 50% accurate?
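The failure mode is easy to reproduce: a first-match keyword rule files a comment under whichever category’s keyword appears first in the rule list, regardless of what the comment is really about. A sketch (the keywords are hypothetical, not the actual Excel formula):

```python
# A sketch of the nested-IF keyword formula: the first keyword hit wins,
# and each comment gets exactly one category (keywords are hypothetical)
rules = [
    ("Billing", ["bill", "invoice"]),
    ("Pricing", ["price", "expensive"]),
    ("Ease of use", ["easy", "simple"]),
]

def categorize(comment):
    text = comment.lower()
    for category, keywords in rules:
        if any(keyword in text for keyword in keywords):
            return category  # stops at the first match
    return "Uncategorized"

# Really a pricing complaint, but "bill" matches first
print(categorize("The price on my bill was wrong"))  # Billing
```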
3. Nvivo
Developed by QSR International, Nvivo is a tool where you can store, organize, categorize and analyze your data, and also create visualizations. Nvivo lets you store and sort data within the platform, automatically code sentiment, themes and attributes, and exchange data with SPSS for further statistical analysis. There’s also a transcription tool for quick transcription of voice data.
It’s a no-frills online tool, great for academics and researchers.
4. Interpris
Interpris is another tool from QSR International, which lets you import free-text data directly from platforms such as SurveyMonkey and store all your data in one place. It has numerous features, for example automatically detecting and categorizing themes.
Favoured by government agencies and communities, it’s good for employee engagement, public opinion and community engagement surveys.
Other tools worth mentioning (for survey analysis but not open-ended questions) are SurveyMonkey, Tableau and DataCracker.
There are numerous tools on the market, all with different features and benefits. Choosing the right one will depend on your needs, the amount of data, the time you have for your project and, of course, your budget. The important part is to choose a tool that is reliable, provides quick and easy analysis, and is flexible enough to adapt to your needs.
One idea is to check the product’s list of existing clients, which is often published on their website. Crucially, you’ll want to test the tool, or at the least get a demo from the sales team, ideally using your own data so that you can use the time to gather new insights.
A few tips on survey design
Good surveys start with smart survey design. Firstly, you need to plan for survey design success. Here are a few tips:
Our 9 top tips for survey design planning
1. Keep it short
Only include questions that you are actually going to use. Lots of questions might seem useful, but unnecessary ones can actually hurt your survey results. Often we also ask redundant questions that don’t contribute to the main problem we want to solve. A survey can be as short as three questions.
2. Use open-ended questions first
To avoid enforcing your own assumptions, use open-ended questions first. Often, we start with a few checkboxes or lists, which can be intimidating for survey respondents. An open-ended question feels warmer and more inviting: it makes people feel like you want to hear what they have to say and are actually starting a conversation. Open-ended questions give you more insightful answers; closed questions are easier to respond to and easier to analyze, but they do not create rich insights.
The best approach is to use a mix of both types of questions, as it’s more compelling for respondents to answer different types of questions.
3. Use surveys as a way to present solutions
Your surveys will reveal what areas in your business need extra support or what creates bottlenecks in your service. Use your surveys as a way of presenting solutions to your audience and getting direct feedback on those solutions in a more consultative way.
4. Consider your timing
It’s important to think about the timing of your survey. Take into account when your audience is most likely to respond to your survey and give them the opportunity to do it at their leisure, at the time that suits them.
5. Challenge your assumptions
It’s crucial to challenge your assumptions, as it’s very tempting to make assumptions about why things are the way they are. There is usually more than meets the eye about a person’s preferences and background which can affect the scenario.
6. Have multiple survey-writers
Having multiple survey writers can be helpful: people reading each other’s work and testing the questions helps address the fact that most questions can be interpreted in more than one way.
7. Choose your survey questions carefully
When you’re choosing your survey questions, make each one count. Only use those that can make a difference to your end outcomes.
8. Be prepared to report back results and take action
Respondents want to know that their responses count, are reviewed, and are making a difference. As an incentive, you can share the results with the participants, in the form of a benchmark or a measurement that you then report back to them.
9. What’s in it for them?
Always think about what customers (or survey respondents) want and what’s in it for them. Many businesses don’t actually think about this when they send out their surveys.
If you can nail the “what’s in it for me”, you automatically solve many of the possible issues for the survey, such as whether the respondents have enough incentive or not, or if the survey is consistent enough.
For a good survey design, always ask:
- What insight am I hoping to get from this question?
- Is it likely to provide useful answers?
For more pointers on how to design your survey for success, check out our blog on 4 Steps to Customer Survey Design – Everything You Need to Know.