A/B Test Calculator – Statistical Significance Calculator
Find out which variation of your website will help you reach your goal.
How to Run an A/B Test
To run an A/B test, you compare two versions of a page: page A and page B. Page A is the original page, known as the control. Page B should be identical to page A, but with one small modification.
The best elements to modify are those that drive conversions and prompt your visitors to act. Change only ONE element at a time, so that any difference in results can be attributed to that change. If you change multiple elements, you won't know which change worked and which didn't.
Common elements to change include:
- Call to action button
- Ad copy
- Product copy
Within each of these elements, you can also test smaller variations, such as the wording, color, or placement.
Once your pages are ready, divert half of your traffic to the control page (page A) and route the other half to the modified page (page B).
The page that produces more conversions is the winning variant. The more you optimize your page, the higher your conversion rate will be.
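The 50/50 split above can be sketched in a few lines of Python. This is a minimal illustration, not this calculator's implementation: it assumes each visitor has some stable identifier (a hypothetical cookie value, here `visitor_id`) and hashes it, so the same visitor always lands on the same page.

```python
import hashlib

def assign_variant(visitor_id: str) -> str:
    """Deterministically assign a visitor to page A or page B.

    Hashing the visitor ID gives a stable 50/50 split: the same
    visitor always sees the same variant on repeat visits.
    """
    digest = hashlib.sha256(visitor_id.encode()).digest()
    return "A" if digest[0] % 2 == 0 else "B"

# Over many visitors, the two buckets come out roughly even.
counts = {"A": 0, "B": 0}
for i in range(10_000):
    counts[assign_variant(f"visitor-{i}")] += 1
```

Hash-based bucketing (rather than flipping a coin on each page view) matters because a visitor who sees page A on one visit and page B on the next would contaminate both samples.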
What is an A/B Confidence Score?
The confidence score is a way of measuring the reliability of an estimate. If the confidence score is high, it means that the results are statistically significant, and you can be confident that these results are a consequence of the changes you made, and are not just a result of random chance.
| Confidence score | What it means |
|---|---|
| 95–100% | Your A/B test is statistically significant, congratulations! You should implement the winning variant. |
| 90–95% | Your A/B test is unlikely to be statistically significant. You could cautiously implement the winning variant, but it would be safer to try another A/B test first. |
| <90% | Your A/B test is not statistically significant. Do not implement the winning variant. |
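One common way to compute a confidence score like this, and an assumption here rather than this calculator's documented method, is a two-proportion z-test: the confidence is 100% minus the p-value of the observed difference in conversion rates. A self-contained sketch using only the standard library:

```python
from math import erf, sqrt

def confidence_score(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Confidence (as a percentage) that the difference between the
    two conversion rates is not due to random chance.

    Uses a two-sided two-proportion z-test, one common approach for
    A/B calculators; other tools may use different methods.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis (no difference).
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = abs(p_a - p_b) / se
    # Two-sided p-value from the normal CDF, written via math.erf.
    p_value = 2 * (1 - 0.5 * (1 + erf(z / sqrt(2))))
    return 100 * (1 - p_value)

# Illustrative numbers: 20% vs 25% conversion over 1,000 visitors each
# clears the 95% bar; a 10.0% vs 10.2% difference does not.
strong = confidence_score(200, 1000, 250, 1000)
weak = confidence_score(100, 1000, 102, 1000)
```

Note how sample size drives the result: the same percentage-point difference that is significant at 1,000 visitors per variant may fall below 90% confidence with only 100 visitors per variant.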