I have been doing A/B testing periodically for years and I’m usually surprised by the results. Sometimes the results trump design principles, sometimes the results are counter-intuitive, but most of the time the results lead to valuable insights.
All this taught me that if I’m not doing A/B testing before, during, and after the product lifecycle, I’m usually missing critical insights that could significantly move the needle.
A/B testing can be daunting, but it doesn’t have to be. Here’s what you need to know:
- Get familiar with Google Analytics: At least a basic fluency in web analytics is necessary to understand and analyze online behavior. If you’re not fluent with Google Analytics (GA), Google has some great videos on its GA YouTube channel.
- Establish Business Objectives that Align with User Needs: If your business objective is to increase page views and you set up a test that drives customers to an external website, your results will be out of sync. Don’t laugh; I’ve actually seen this on a major brand’s website. Make sure to define clear and precise business objectives that meet the needs of current or potential customers. If you haven’t done any qualitative research on what your customers’ needs are, you run the risk of invalidating your test!
- Define your Target Audience or Segment: If your product or service has a log-in and user profile section, collecting demographic info should be baked in. But what if you don’t have a user profile section? You can recommend a sign-up page, or you can segment your audience by behavior:
- Unengaged – visitor bounces from the site
- Interested – visitor doesn’t bounce, but views only a few pages
- Engaged – visitor completes a micro conversion
- Converted – visitor completes a macro conversion
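The four behavior segments above can be sketched as a simple classification rule over a visitor’s session. This is a minimal illustration, not a Google Analytics API; the field names (`pages_viewed`, `micro_conversion`, `macro_conversion`) are hypothetical.

```python
def classify_visitor(pages_viewed, micro_conversion=False, macro_conversion=False):
    """Return the behavior segment for a single visitor session.

    Checks the strongest signal first: a macro conversion (e.g., a purchase)
    outranks a micro conversion (e.g., a newsletter sign-up), which outranks
    mere page views.
    """
    if macro_conversion:
        return "Converted"
    if micro_conversion:
        return "Engaged"
    if pages_viewed > 1:
        return "Interested"
    return "Unengaged"  # bounced: viewed only the landing page
```

For example, `classify_visitor(1)` returns `"Unengaged"`, while `classify_visitor(3, micro_conversion=True)` returns `"Engaged"`.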
- Approach: Focusing on the whales (your biggest customers, who have converted the most often) or on underperforming segments that can increase revenue is a business decision. Most of the clients I have worked with usually go after the whales. But when underperforming segments are the focus, more up-front qualitative research is needed to understand the customers’ needs, which will inform a hypothesis to A/B test.
- Tactics: Test One Element at a Time – if you change several elements on a page at once, it will be difficult to determine which change is responsible for the movement in traction. The more granular your test, the sharper your results can become.
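Once a single-element test has run, one common way to check whether the difference in conversion rates is meaningful is a two-proportion z-test. The sketch below uses only the standard library and assumes you have raw visitor and conversion counts per variant; the function name and numbers are illustrative, not from any specific tool.

```python
import math

def ab_z_test(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test for an A/B conversion experiment.

    conv_*: number of conversions; n_*: number of visitors per variant.
    Returns (z, two-sided p-value).
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled conversion rate
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF (via the error function)
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Example: variant B converts 120/1000 visitors vs. variant A's 100/1000
z, p = ab_z_test(100, 1000, 120, 1000)
```

With these illustrative numbers the p-value lands above the conventional 0.05 threshold, a reminder that a visible lift is not automatically a statistically significant one.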
- Beware of Testing Bias: you should be testing to learn which elements – copy, headline, CTA, etc. – produce the optimal conversion. If you stack the deck and set up a test that you know will give you the results you want, there will be no learning. However, you can take advantage of certain biases your customers have, and you should maximize your efforts to capture that advantage. As long as you do so in an ethical manner and provide something customers want, there is nothing wrong with using your customers’ proclivities to drive conversions. This is known as Behavioral Economics, and it is something I am currently taking a deep dive into.
Finally, I have assembled a few of my favorite articles and resources on A/B Testing.
24 A/B Test Results That Will Surprise You:
Though a little dated, this article still stands the test of time with some surprising A/B testing results. If you’re thinking of getting into conversion optimization, this is a good start. Seasoned veterans may already know most of the tips, but a quick scan may just surprise you and offer new inspiration for that project that needs to move the needle a bit further. Read the full article
Check out this short list of A/B testing tools, techniques, and case studies:
Let’s not forget one of my favorites, Optimizely
– Techniques: kissmetrics.com/ab-tests-shocking-discoveries/
– Case Study: blog.optimizely.com/how-obama-raised…
This is by far not the last word on A/B testing, and as technology, customers, and methods continue to evolve, so should our understanding of these guidelines.
As I am always looking to improve my performance with A/B testing, I welcome any insights you can share in the comments.
Written by Marc Niola – The UX Acrobat
Follow Marc on Twitter: @MarcNiola