Have you ever run an A/B test? If not, I’ll explain why you should and how to go about it to increase conversions.
An Easy Way to Understand A/B Testing
Let’s say you go into a grocery store and someone offers you jam. They do the same for thousands of other customers. Half the people are offered strawberry jam. The other half gets marmalade. If more people buy strawberry, the store can say with a reasonable amount of confidence that strawberry is the flavor most people prefer.
A/B testing is exactly the same—just conducted online. You're trying to find out whether a specific page is drawing people to your website and whether it's performing up to expectations. The home page is usually A/B tested most. Is this version of your home page better than another? Will adding a video to your home page generate more leads and more sales?
A/B testing is very powerful because you can test literally anything: copy, images, messaging, videos, and more. You can tweak everything to get as much engagement as possible.
Tools for A/B Testing
First off, there really isn’t a point to A/B testing when you don’t have a lot of traffic. This is a requirement for statistical significance—your sample size needs to be large enough for you to conduct a test with a reasonable degree of confidence in its results.
So let’s assume your website already has quite a few visitors per month—in the thousands. Let's use a baseline of 10,000 visits per month. That's a good point to start doing some A/B testing on different elements of the site.
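To make the traffic requirement concrete, here is a minimal sketch of the standard two-proportion sample-size formula. The baseline conversion rate (3%) and the lift you hope to detect (a 20% relative improvement) are hypothetical numbers chosen for illustration, not figures from this post:

```python
from math import ceil, sqrt

def sample_size_per_variation(p, mde, z_alpha=1.96, z_beta=0.84):
    """Rough number of visitors needed per variation to detect a relative
    lift of `mde` over baseline rate `p` (95% confidence, 80% power)."""
    p2 = p * (1 + mde)           # conversion rate of the improved variation
    pooled = (p + p2) / 2
    numerator = (z_alpha * sqrt(2 * pooled * (1 - pooled))
                 + z_beta * sqrt(p * (1 - p) + p2 * (1 - p2))) ** 2
    return ceil(numerator / (p - p2) ** 2)

# Hypothetical: 3% baseline conversion, trying to detect a 20% relative lift
print(sample_size_per_variation(0.03, 0.20))
```

With those assumptions the formula lands in the low five figures per variation, which is why a site doing around 10,000 visits a month is roughly the point where A/B testing starts to make sense. Smaller expected lifts require dramatically more traffic.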
- What you might want to do initially is use a tool like Qualaroo.
- You might use a tool like HotJar just to see how people are behaving on the site.
- You can use a tool like Crazy Egg that will show heat maps.
- You can run surveys with SurveyMonkey or YesInsights, which is basically a one-click survey you can email to people to get a high response rate (I’ve seen 9x response rates from YesInsights compared to SurveyMonkey).
- There are tools like Optimizely, Visual Website Optimizer, and Google Content Experiments that help you run the A/B tests. Tools like Crazy Egg or VWO have a WYSIWYG editor so you can adjust elements within your site without needing a developer.
Running A/B Tests in “Legs”
The key is that when you're doing these tests, you need volume. If you're getting under a thousand sales or conversions on your website in a given month, you shouldn't have more than two versions.
Typically when you're running tests, you need to run them in “legs.” What I mean by that is, if you start on a Monday, for example, you finish on a Sunday. You don't want to run a test from Monday through Wednesday; you want to make sure you're testing a full week. A lot of times I like running it for two, three, or four weeks straight.
The point is, you want to make sure, within your test, that you have each day of the week in there. Why? Because people interact with a website on Sundays differently than they do on Mondays. And a lot of the people who are on the website on Sundays are a different demographic than the people who are on the site on Mondays.
For example, Sunday is a work day in Israel or Dubai, right? Their version of a Monday is actually Sunday, so their work days are from Sunday through Thursday. Friday and Saturday are off.
If you want to get a full sample, make sure you're running tests for at least a week and don't end the test until the tool tells you it's statistically significant.
Related Content: 13 Conversion Rate Optimization Case Studies
Do You Need to Get Up to 99% Statistical Significance?
The short answer is no. While getting as close as possible to 100% is ideal, it takes way more effort and time than it may be worth. It really depends on how high the stakes are. Neil often stops at 97%. Statisticians will tell you that 95% is already the standard threshold for a reliable result.
I remember running a bunch of tests at the startup I was at previously, and in general we tried to always get to 99% statistical significance. The issue is that it can take a long time to get there, and it might not be worth waiting that long.
But the main thing that we're looking at is, once again, our sample size. When you only have a handful of conversions per variation—say one version has 30 conversions and the other has five—even a 99% significance reading can mean very little. You’re just starting out your A/B test. The numbers could look totally different by the end of your run.
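To see how sample size affects confidence, here is a minimal sketch of a two-proportion z-test—the kind of calculation most A/B testing tools run under the hood. The visitor and conversion counts are hypothetical, chosen to show that the same conversion rates become far more trustworthy at higher volume:

```python
from math import erf, sqrt

def significance(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: confidence (0..1) that B's conversion
    rate genuinely differs from A's, given conversions and visitors."""
    pa, pb = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = abs(pb - pa) / se
    return erf(z / sqrt(2))  # two-sided confidence via the normal CDF

# Hypothetical small sample: 5 vs. 9 conversions out of 100 visitors each
print(round(significance(5, 100, 9, 100), 3))
# Same conversion rates at 10x the volume
print(round(significance(50, 1000, 90, 1000), 3))
```

At the small sample, variation B converts nearly twice as well yet the test stays well below the 95% threshold; at ten times the volume, the identical rates clear it comfortably. That's why early readings can flip by the end of a run.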
Related Content: 5 Important Landing Page Elements You Should Be A/B Testing
How Useful Is Multivariate Testing?
Another way to do testing is called multivariate testing, where you test several elements at once and run A/B/C/D versions (three or more rather than two). In general, a lot of people skip multivariate testing and stick with A/B testing, because the more variables you have, the more false positives and false negatives you introduce into the equation.
Even when you try to play it safe, multivariate testing can lead to odd results.
For example, we once ran a control page (A), tested the headline (B), the color of the text (C), and some images (D). The tool we were using told us that one headline won by 10%, one color won by 30%, and one image won by another 10%. But those are individual changes. When you combine all three together, the percentage may be totally different.
How to Identify Which Pages to A/B Test
Ideally, you want over 1,000 conversions per month before you start running A/B tests. But even if you have a small amount, you can still run tests, and the way we like to run them is by looking in Google Analytics and seeing where people are dropping off the most.
For example, let’s say 1,000 people go to your homepage and only 900 move on to your pricing page. And out of the 900 that made it to your pricing page, only 50 people go to your checkout page. This shows you that there are big issues on your pricing page versus your home page, because the drop off is so much steeper. So you should try to optimize the pricing page first.
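The drop-off comparison in the example above is just step-by-step division, which you can sketch in a few lines. The funnel counts below are the hypothetical numbers from the paragraph, not real analytics data:

```python
# Hypothetical funnel counts pulled from Google Analytics goals
funnel = [("homepage", 1000), ("pricing", 900), ("checkout", 50)]

# Compare each step to the one before it to find the steepest drop-off
for (prev_name, prev_n), (name, n) in zip(funnel, funnel[1:]):
    kept = n / prev_n
    print(f"{prev_name} -> {name}: {kept:.0%} continue, {1 - kept:.0%} drop off")
```

Only 10% of visitors drop off between the homepage and the pricing page, but roughly 94% drop off between pricing and checkout—so the pricing page is the one to optimize first.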
There are a lot of paid tools out there, but Google Analytics is a great place to start because you can see the funnel drop-off if you have goals set up correctly. You can also see the behavior flow: where people are landing, what pages they're going to next, and how people are converting in general.
Some of My Favorite Conversion Blogs
One of my favorite blogs for conversion experiments is Conversion Rate Experts. They publish a lot of great case studies. They were at the forefront of CRO and were the first conversion rate blog that I found.
ConversionXL is another amazing blog. All they do is talk about conversion tests and data, even offline versions that people are running. They have great case studies that you can learn from as well.
This post was adapted from Marketing School, a 10-minute daily podcast in which Neil Patel and Eric Siu teach you real-life marketing strategies and tactics from their own experience to help you find success in any marketing capacity.