You can see whether there's a statistically significant difference between how certain response groups answered the questions in your survey. To use the statistical significance feature in SurveyMonkey:
- Turn on statistical significance while adding a Compare Rule to a question in your survey. Choose the groups you want to compare to break down your survey results by group in a side-by-side comparison.
- Examine the data tables for the questions in your survey to see if there are statistically significant differences in how different groups answered the survey.
Viewing Statistical Significance
The following steps will guide you through creating a survey that can display statistical significance.
To display statistical significance when analyzing results, you'll need to apply a Compare Rule to a question in your survey.
You must use one of the following question types in your survey design to be able to apply a Compare Rule and calculate statistical significance: Multiple Choice, Dropdown, Matrix/Rating Scale, or Ranking.
Make sure your answer options break down into meaningful groups. The answer options you choose to compare when creating the Compare Rule will be used to cross-tabulate your data across the rest of the survey.
After you finish designing your survey, create a collector to distribute it. There are several ways to send your survey.
You need at least 30 responses per answer option you plan to use in your Compare Rule in order to turn on and view statistical significance.
To turn on statistical significance, you need to add a Compare Rule to a question in your survey to divide respondents into groups.
To apply a Compare Rule:
- Go to the Analyze Results section of your survey.
- Click +COMPARE in the Current View section of the left sidebar.
- Click Compare by Question and Answer, unless you used an A/B test in your survey.
- Use the drop-down menu to find the desired question.
- Select the answer options you'd like for your groups. These groups will be used to cross-tabulate your data across the rest of the survey.
- Click the toggle next to Show statistical significance to turn it on. At least two of the answer options need to have 30 or more responses each in order to turn on statistical significance.
- Click Apply.
To edit your groups after you create the rule, click the down arrow to the right of the rule in the left sidebar and click Edit rule.
After you apply the Compare Rule, you'll see how each group answered each question in a side-by-side view so you can easily spot similarities and differences.
We highlight answer options that are statistically significant in the data tables below question charts.
- Statistical significance is calculated for the following question types: Multiple Choice, Dropdown, Matrix/Rating Scale, and Ranking.
- Each row in the data table represents a response group. Each response group is also assigned a letter, which you can reference in the data table.
- Each column in the data table represents an answer choice.
Hover over any answer option to see additional information. You'll see one of the following messages:
| Message | Meaning |
|---|---|
| Significantly lower than | The group is significantly less likely to select the answer option compared to other highlighted groups. |
| Significantly higher than | The group is significantly more likely to select the answer option compared to other highlighted groups. |
| More responses needed | You need at least 30 responses to calculate statistical significance for the group. |
| Combined answer choices | At this time, we can't calculate statistical significance for combined answer choices. Please uncombine them. |
| Hidden answer choices | At this time, we can't calculate statistical significance for hidden answer choices. Please unhide them. |
| No significant difference | The group selected this answer choice about as often as other groups. |
Scroll down the page to the section on "What is a statistically significant difference?" for more information.
Shared Data Pages
If you share your survey results online, anyone who views your shared data page can also see statistical significance.
To create a shared data page that includes statistical significance:
- Under Current View in the left sidebar, click the Compare Rule you created with statistical significance.
- Under Saved Views in the left sidebar, click +Save as...
- Enter a name for the view and click Save.
- Click + Share All in the upper-right corner of the page.
- Finish setting up the shared data page.
Statistical significance is included in Summary Data PDF and PPT exports. The export marks responses that are significantly different with the letter of the response group they are significantly different from.
To export results that include statistical significance, apply your Compare Rule and export the Current View.
You want to see if men are significantly more satisfied with your product than women.
- Add two multiple-choice questions to your survey:
• What is your gender? (Male, Female)
• How satisfied or dissatisfied are you with our product? (Satisfied, Dissatisfied)
- Make sure at least 30 respondents select male as their gender AND at least 30 respondents select female as their gender.
- Add a Compare Rule to the question "What is your gender?" and select both the male and female answer options as your groups.
- Use the data table below the question chart for "How satisfied or dissatisfied are you with our product?" to see if any answer options show a statistically significant difference.
What is a statistically significant difference?
A statistically significant difference tells you whether one group's answers are substantially different from another group's answers by using statistical testing. Statistical significance means that the numbers are reliably different, greatly aiding your data analysis. Still, you should also consider whether the results are important — it's up to you to decide how to interpret or take action on your results.
For example, say you receive more customer complaints from women than you do from men. How do you know this is a real difference you should address? One great way is to run a survey to see if your male customers are a lot more satisfied with your product. Using a statistical formula, our statistical significance feature can help you determine if men have significantly higher levels of satisfaction with your product than women. This allows you to take action on data, not on anecdote.
If your results are highlighted in your data table, that means that the two groups are significantly different from one another. Significance means that the numbers are statistically different — it doesn't mean the finding is important or meaningful.
If your results are not highlighted in your data table, even though the percentages being compared may be different, the two numbers are not statistically different.
A result with no statistically significant difference shows that the two items being compared are not distinguishable at your sample size, but that does not necessarily mean they are unimportant. You may be able to detect a statistically significant difference by increasing your sample size.
If you have a very small sample size, only large differences between two groups will be significant. If you have a very large sample size, both small and large differences will be detected as significant.
However, if two numbers are statistically different, it doesn't mean that the results are meaningfully different. You'll have to decide which differences are meaningful to your survey goal.
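The effect of sample size can be seen directly with a standard pooled two-proportion test, the kind of test this feature is based on. The sketch below (a simplified illustration, not SurveyMonkey's exact implementation) checks the same 5-point gap at two sample sizes:

```python
import math

def two_proportion_test(p1, p2, n1, n2):
    """Pooled two-proportion test: returns True when the difference
    is significant at the 95% confidence level (|t| > 1.96)."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    t = (p1 - p2) / se
    return abs(t) > 1.96

# The same 5-point gap (70% vs. 65%) at two sample sizes:
print(two_proportion_test(0.70, 0.65, 100, 100))    # False: not detected
print(two_proportion_test(0.70, 0.65, 1000, 1000))  # True: detected
```

With 100 respondents per group the difference is lost in sampling noise; with 1,000 per group the identical difference becomes statistically significant.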
Calculating Statistical Significance
We calculate statistical significance using a standard 95% confidence level. When we display an answer option as statistically significant, it means the difference between two groups has less than a 5% probability of occurring by chance or sampling error alone, which is often displayed as p < 0.05.
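The link between the 1.96 threshold and p < 0.05 can be verified numerically. This sketch uses the standard normal approximation (reasonable at large sample sizes) to compute a two-tailed p-value from a test statistic:

```python
import math

def two_tailed_p(t):
    """Two-tailed p-value for a test statistic t under the
    standard normal approximation."""
    phi = 0.5 * (1 + math.erf(abs(t) / math.sqrt(2)))  # P(Z <= |t|)
    return 2 * (1 - phi)

print(round(two_tailed_p(1.96), 3))  # 0.05
```

A test statistic of exactly 1.96 yields a p-value of about 0.05, which is why exceeding that threshold corresponds to the 95% confidence level.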
To calculate the statistical significance between groups, we use the following formulas:
| Term | Definition |
|---|---|
| a1 | The proportion of the first group answering a question a certain way multiplied by the sample size of that group. |
| b1 | The proportion of the second group answering a question a certain way multiplied by the sample size of that group. |
| Pooled Sample Proportion (p) | The combination of the two proportions for both groups. |
| Standard Error (SE) | A measure of how far your sample proportion is from the true proportion. A smaller number means the sample proportion is close to the true proportion; a larger number means it is far away. |
| Test Statistic (t) | A t-statistic: the number of standard deviations a number is away from the mean. |
| Statistical Significance | If the absolute value of the test statistic is greater than 1.96* standard deviations of the mean, it's considered a statistically significant difference. |
*1.96 is a number used for the 95% confidence level since 95% of the area under a student's t-distribution function lies within 1.96 standard deviations of the mean.
Continuing the example from above, let's find out if the percent of men who say they are satisfied with your product is significantly more than the percent of women.
Let's say you surveyed 1000 men and 1000 women and found 70% of men say they are satisfied with your product compared to 65% of women. Is 70% significantly higher than 65%?
Use the following survey data to complete the formulas:
- p1 (% of men who are satisfied with the product) = 0.7
- p2 (% of women who are satisfied with the product) = 0.65
- n1 (# of men you surveyed) = 1000
- n2 (# of women you surveyed) = 1000
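Plugging these values into the formulas above gives the following calculation (a Python sketch of the pooled two-proportion test described in this section, not SurveyMonkey's exact code):

```python
import math

# Survey data from the example above
p1, n1 = 0.70, 1000  # proportion and number of men surveyed
p2, n2 = 0.65, 1000  # proportion and number of women surveyed

a1 = p1 * n1                    # 700 satisfied men
b1 = p2 * n2                    # 650 satisfied women
p = (a1 + b1) / (n1 + n2)       # pooled sample proportion: 0.675
se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))  # standard error
t = (p1 - p2) / se              # test statistic

print(round(t, 2))   # 2.39
print(abs(t) > 1.96) # True: statistically significant
```

The test statistic comes out to about 2.39, which exceeds the 1.96 threshold for the 95% confidence level.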
Since the absolute value of the test statistic is greater than 1.96, it means the difference between men and women is significant. Men are more likely to be satisfied with your product than women.
Hiding Statistical Significance
To hide statistical significance for all questions:
- Click the down arrow to the right of the compare rule in the left sidebar.
- Click Edit rule.
- Click the toggle next to Show statistical significance to turn it off.
- Click Apply.
To hide statistical significance for one question:
- Click Customize above the question chart.
- Click the Display Options tab.
- Uncheck the box next to Statistical significance.
- Click Save.
The Swap rows and columns display option is automatically turned on when you show statistical significance. If you uncheck this display option, statistical significance is turned off as well.