How To: Using Optimizely to Run A/B Split Tests
In our report, “Inside the Cave,” on the Obama campaign’s digital, tech, and analytics teams, we highlighted how they used optimization and A/B split testing to improve their online donation page. By conducting 240 A/B tests, Obama for America was able to increase the conversion rate on their donation page by 49%.
And while most of us don’t have access to a full-time analytics staff, A/B testing is within reach for any organization of any size. In fact, the Obama campaign used Optimizely, a web-based A/B testing platform that you can use for as little as $19 per month, depending on your site traffic.
What is A/B Testing?
In reference to websites, an A/B test compares two variations of an element — its color, text, or placement — to determine whether a change to that element improves a desired outcome.
Here’s a quick guide to get started with A/B testing on your site.
Once you’ve set up your Optimizely account, you’ll need to provide your web developer or vendor with a snippet of code that will allow you to make edits to your site from Optimizely. That’s the last time you’ll need to deal with code for Optimizely.
Determine what you want to optimize
The key to worthwhile optimization is having a measurable outcome (like a donation) and a clearly defined goal (more donations). For organizations that are new to A/B testing, increasing email capture and donations are two of the most attainable goals.
Outline your experiment
Now that you know what you want to improve, identify elements on your site that you can tweak to improve the response. For example, if you’re optimizing for email signups, you could test the submit button text. Does changing “Submit” on the button to “Join” increase your conversion rate? Do more people donate if the button is red or blue? To get actionable insights from your test, make apples-to-apples comparisons.
Good A/B Test
Red Button vs Blue Button.
This test will reveal any difference button color has on conversions.
Good A/B Test
“Join” vs “Submit”.
This test will reveal how changing the text on the submit button will affect signups.
Bad A/B Test
Red “Join” button vs Blue “Submit” button.
After this test, you won’t be able to isolate the source of any change in performance.
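Under the hood, any A/B test starts by splitting visitors between the two variations. Here’s a minimal sketch in Python of how that bucketing can work — this is a hypothetical illustration, not Optimizely’s actual implementation. Hashing a visitor ID, rather than picking randomly on every page load, ensures a returning visitor always sees the same version of the page.

```python
import hashlib

def assign_variant(visitor_id: str, variants=("A", "B")) -> str:
    """Deterministically bucket a visitor into one variant.

    The same visitor ID always hashes to the same bucket, so a
    returning visitor never flips between designs mid-test.
    """
    digest = hashlib.md5(visitor_id.encode()).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]

# Repeat visits by the same visitor land in the same bucket:
assert assign_variant("visitor-123") == assign_variant("visitor-123")
```

Across many visitors, the hash spreads traffic roughly evenly between the two variations, which is what makes the comparison fair.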
Implement and run your experiment
Using Optimizely’s editor, you can easily tweak design and textual elements on your site and get the experiment up and running. The more traffic your site gets, the more data you’ll collect, and the faster you’ll reach a reliable result. You want a statistically significant outcome to ensure that the change you’re seeing is actually the result of the change in design you made, not random chance.
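Optimizely reports significance for you, but it can help to see what the math behind it looks like. The sketch below (illustrative numbers, standard two-proportion z-test, not Optimizely’s exact method) checks whether a difference in conversion rates is bigger than chance would explain.

```python
from math import sqrt, erf

def ab_significance(conv_a, total_a, conv_b, total_b):
    """Two-proportion z-test: is B's conversion rate different
    from A's beyond what random chance would explain?"""
    p_a, p_b = conv_a / total_a, conv_b / total_b
    # Pooled rate under the assumption both variants convert equally
    p_pool = (conv_a + conv_b) / (total_a + total_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / total_a + 1 / total_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical example: 2.0% vs 2.6% conversion on 10,000 visitors each
z, p = ab_significance(200, 10000, 260, 10000)
print(f"z = {z:.2f}, p = {p:.4f}")
# With these numbers p < 0.05, so the lift is statistically significant
```

The key intuition: the same 0.6-point lift on only 100 visitors per variant would not be significant — which is why low-traffic sites need to run tests longer.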
While Optimizely will let you run multiple variations, for organizations that might not see much web traffic, it’s best to conduct one test at a time to zero in on the best design for your page.
Avoid the Frankenstein Effect
You should always make sure you test every combination before settling on a final design change. Photo A may outperform Photo B and a red background may outperform a blue background, but you cannot assume that Photo A on a red background will outperform Photo A on a blue background, or even Photo B on a red background — elements can interact in ways individual tests won’t reveal. In short, test EVERYTHING.
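Enumerating every combination is easy to get wrong by hand, so it helps to generate the list programmatically. A short sketch, using hypothetical elements you would swap for your own:

```python
from itertools import product

# Hypothetical elements under test — replace with your own variations
photos = ["Photo A", "Photo B"]
backgrounds = ["red", "blue"]
button_text = ["Join", "Submit"]

# Enumerate every combination so no pairing goes untested
combinations = list(product(photos, backgrounds, button_text))
for combo in combinations:
    print(combo)

# 2 x 2 x 2 = 8 variations to test before settling on a winner
assert len(combinations) == 8
```

Note how quickly the count grows: adding one more two-option element doubles the number of variations, which is another reason low-traffic sites should test a few elements at a time.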
Testing shouldn’t stop at your web design. Look for other opportunities to conduct focused, measurable A/B tests. This could include testing email subject lines or online ad creative.