Which option is best? When it comes to websites, there’s one way to know for sure—a test.
We’re huge fans of making decisions based on data, which is why we love A/B testing. Google Analytics makes it easy to run tests through its Experiments feature.
We see the need for tests pop up in all sorts of situations:
- There are two viable options and your team isn’t sure which will work better
- Your website isn’t converting like you’d hoped it would
- New usability research comes out
What Are You Testing?
First, your website analytics need to be set up correctly to track your goals. Without that, how will you determine a winner?
Need help making sure your analytics and goals are set up correctly?
You can A/B test all sorts of website elements:
- Call to action
- Page layout
- And more
You can test using a hypothesis—for example, “A form with fewer fields will have a higher conversion rate.” You can test without a hypothesis, but you have to understand why you’re performing the test—for example, “Which homepage banner copy gets a higher engagement rate?”
While it’s called A/B testing, you aren’t limited to two options. You can include anywhere from two to five variations in a single test, though we recommend sticking to two or three.
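Part of the reason to keep the variation count low is simple arithmetic: your experiment traffic gets split among the variations, so every page you add shrinks the per-page sample and stretches the test. A quick sketch (the visitor and sample numbers here are hypothetical, just for illustration):

```python
# With a fixed daily traffic budget, each added variation cuts the
# per-variation sample, so the test takes longer to reach a verdict.
daily_visitors = 1200          # hypothetical traffic to the test
needed_per_variation = 3000    # hypothetical sample target per page

for k in (2, 3, 5):
    per_day = daily_visitors / k
    days = needed_per_variation / per_day
    print(f"{k} variations -> {days:.1f} days to collect {needed_per_variation} visitors each")
```

With five variations, the same traffic takes two and a half times as long to produce the same per-page sample as it would with two.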
How to Set Up the Experiment
Let’s walk through an example of testing how we can increase conversion rates on our resources.
One recommendation to keep in mind: Only set up the minimum of what you need to perform the test. Then invest the time to build out full functionality on the winner. In our example, we’re testing on just one resource, and then we’ll apply the winner to all of our resources.
Step 1: Create all variations of your test.
Our resource variations are:
- The way we originally set up the page, with a 3-field form at the bottom of the page
- A page with an above-the-fold form with 2 fields and a “we won’t spam you” disclaimer
- A page with no visible form, just a download button above the fold that links to a 2-field popup form that has an even more powerful “we won’t spam you” statement
Friendly reminder: Fully test all variations for proper functionality on all supported browsers and mobile devices.
Step 2: Navigate to the experiments area in Google Analytics.
Step 3: Create your experiment in Google Analytics.
Click “Create experiment.”
Choose your settings.
- Name – Only you will see this, so choose something that’s descriptive.
- Objective – Choose which goal (see, we told you you’d need these set up) to track.
- Percentage of Traffic – You can test all your traffic, which is the default, or choose a percentage segment. The warning you see in the screenshot is from the filter we have applied to block our internal traffic from tracking in Google Analytics.
- Email Notifications – This is pretty self-explanatory.
Those are the required basic settings. The advanced settings below are optional.
- Distribute Traffic – When this setting is off, Google Analytics will automatically adjust the percentage of traffic to drive more users to the more successful pages. When this setting is on, each test page will get an equal percentage of the experiment traffic. As usual, the decision whether to use this option depends on the goals of your test.
- Experiment Timeframe – Depending on your traffic volume and how quickly you need results, you may want your experiment to be quick (3 days) or more spread out (2 weeks). This is only a minimum; Google Analytics uses its own algorithms to decide when a clear winner has emerged. They say, “Things like low traffic volume or periodic traffic patterns can require up to three months for Analytics to declare a winner.”
- Confidence Threshold – How sure do you want to be of the winner? The higher the threshold, the longer the test will take, but the more sure you’ll be.
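Google has described the adaptive traffic allocation behind the Distribute Traffic setting as a multi-armed bandit approach. As a rough illustration (this is not Google’s actual algorithm, and the conversion rates and visitor counts are invented), here’s a minimal Thompson-sampling sketch showing how traffic drifts toward the stronger variation as evidence accumulates:

```python
import random

def thompson_pick(successes, failures, rng):
    """Pick the variation whose sampled conversion rate is highest.
    Each variation's rate is drawn from a Beta posterior over its
    observed successes and failures (uniform prior)."""
    samples = [rng.betavariate(s + 1, f + 1) for s, f in zip(successes, failures)]
    return samples.index(max(samples))

# Simulate 10,000 visitors split between two pages whose true
# (unknown) conversion rates are 5% and 8%.
true_rates = [0.05, 0.08]
successes, failures = [0, 0], [0, 0]
assignments = [0, 0]
rng = random.Random(0)

for _ in range(10_000):
    arm = thompson_pick(successes, failures, rng)
    assignments[arm] += 1
    if rng.random() < true_rates[arm]:
        successes[arm] += 1
    else:
        failures[arm] += 1

print(assignments)  # most visitors end up on the 8% page
```

Turning Distribute Traffic on disables this adaptive behavior and holds each variation at an equal share, which gives you cleaner per-variation samples at the cost of sending more visitors to the weaker pages.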
Configure the experiment.
Indicate your original page and any variations you have. The “Original Page” field is somewhat misleadingly named, because you can test pages that didn’t exist previously. Google Analytics uses the original page as the baseline: it reports the probability that each variation will perform better than this page.
For our test, we did have an original page—the page we originally created when we set up our website. Then we created and added our two variations.
Set up the experiment code.
Once you have your experiment set up, Google Analytics will provide a code snippet. If you have a CMS, like Sitefinity, that allows you to insert code snippets yourself, have at it!
If your CMS doesn’t allow you to add scripts, depending on your technical prowess, you may have to send the snippet to a developer (or your web company) to install.
Review and start.
Google will run a quick validation to make sure your code is installed correctly. If everything looks good, go ahead and start the experiment.
Winner, Winner, Chicken Dinner
Let the test run for at least your minimum time frame. You can check on it, but don’t make changes to your pages while the experiment is running.
Google Analytics will show you a variety of stats for the experiment. The screenshot below is from our example experiment, which has been running for a few days.
You can edit your experiment’s settings, but once the experiment starts, only do this if you absolutely have to.
View different analytics for the pages involved in the test. In addition to our primary goal (which we laid out in the experiment settings), we find the “Site Usage” stats useful in seeing the big picture—new sessions, session duration, bounce rate, and pages per session.
The default line chart shows your primary goal; however, you can also view the additional “Site Usage” analytics mentioned above, as well as your other site goals.
For each test page, view the number of sessions, the number of conversions, the conversion rate, how it’s performing compared to your original, and how likely it is that the variation will outperform your original page.
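That “probability to outperform original” figure can be approximated with a quick Bayesian simulation. This is only an illustration of the idea, not Google’s exact computation, and the conversion counts below are hypothetical:

```python
import random

def prob_beats_original(orig_conv, orig_n, var_conv, var_n, draws=20_000, seed=1):
    """Estimate P(variation's true conversion rate > original's) by
    drawing from each page's Beta posterior (uniform prior)."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(draws):
        p_orig = rng.betavariate(orig_conv + 1, orig_n - orig_conv + 1)
        p_var = rng.betavariate(var_conv + 1, var_n - var_conv + 1)
        if p_var > p_orig:
            wins += 1
    return wins / draws

# Original: 30 conversions out of 600 sessions (5%).
# Variation: 48 conversions out of 600 sessions (8%).
print(prob_beats_original(30, 600, 48, 600))
```

With numbers like these the variation is very likely better, but notice the estimate is still short of certainty, which is exactly why Google Analytics keeps the experiment running.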
In our experiment it looks like there’s a clear winner, right? We’ll just have to wait and see. We’re only a week into our minimum 2-week experiment, so we have to be patient.
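Why does patience matter when the winner already looks clear? Because with the same observed rates, confidence grows with sample size. A classic two-proportion z-test makes the point (this is not the math Google Analytics actually uses, and the counts are invented):

```python
from math import sqrt, erf

def z_test_two_proportions(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference in conversion rates.
    Returns the confidence level (1 - p-value) that the rates differ."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = abs(p_a - p_b) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(z / sqrt(2))))
    return 1 - p_value

# Same observed rates (4% vs. 5%), different sample sizes:
print(z_test_two_proportions(40, 1000, 50, 1000))      # early in the test
print(z_test_two_proportions(400, 10000, 500, 10000))  # ten times the traffic
```

The observed gap is identical in both cases, but only the larger sample clears a typical confidence threshold, so an early lead can still evaporate.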
Try for Yourself!
Google Analytics Experiments are a quick, powerful way to get insight into usability on your website. If you need help thinking through an experiment, shoot me an email.