A/B testing is a bit like being at the optician's, even more so when they actually use the phrase “Do you prefer A or B?”
Believe it or not, websites are A/B testing on us right now. On you – yes, you – and you probably won’t even know it. Go to Facebook, for example, and try to add a video or photo. Do you see the difference in terminology between the screenshots below? Facebook are obviously interested in increasing the number of photos shared via their platform to boost engagement levels. Sometimes the smallest of changes can make the biggest difference.
To clarify what we are talking about when we refer to A/B testing: it is all about showing one proportion of users design or functionality “A” and the other proportion version “B”. It’s not necessarily about what users prefer – you might prefer the design with bigger images – but about which version converts better.
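To make “which version converts better” concrete, here is a minimal sketch of how you might compare two variants once the traffic has been split. The visitor and conversion numbers are entirely hypothetical, and the helper name is my own; it uses a standard two-proportion z-test to check whether the difference is likely real rather than noise.

```python
import math

def ab_significance(conv_a, total_a, conv_b, total_b):
    """Compare two conversion rates with a two-proportion z-test.

    Returns (rate_a, rate_b, z). As a rule of thumb, |z| > 1.96
    suggests the difference is significant at roughly the 95% level.
    """
    rate_a = conv_a / total_a
    rate_b = conv_b / total_b
    # Pooled conversion rate under the null hypothesis (no real difference)
    pooled = (conv_a + conv_b) / (total_a + total_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / total_a + 1 / total_b))
    z = (rate_b - rate_a) / se
    return rate_a, rate_b, z

# Hypothetical numbers: 5,000 visitors per variant
rate_a, rate_b, z = ab_significance(200, 5000, 250, 5000)
```

With these made-up figures, variant B converts at 5% against A's 4%, and the z-score of about 2.4 suggests the lift is unlikely to be chance alone – which is exactly why you split traffic rather than trusting your gut.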
You can spend hours looking through sites like www.abtests.com and whichtestwon.com, seeing whether test A beat B or vice versa. The most important thing to note is why. Why did users convert more on A than on B? Intuition, and sometimes even research, can’t predict action in context. Can you really implement something that works better but that you don’t understand?
Take, for example, Etsy. Would you have believed that their infinite scroll feature would fail? Probably not. Instinctively, you would hypothesise that users would prefer to scroll down to find their products rather than click to go to the next page, and the next, and the next.
I’ve outlined some notable A/B testing examples that are fascinating to read, so I urge you to peruse the list below:
When WashingtonPost.com editors put a story on the site they enter two headlines. Their system automatically A/B tests both headlines for the first few hours the story is live on the homepage, and then settles on the one that performs better.
The Huffington Post go one better and do the same with both headlines and images.
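The mechanics described above – serve both headlines at random for an initial window, then settle on the better performer – can be sketched in a few lines. This is an assumed, simplified model of such a system, not the Post’s actual implementation; the class name, trial length, and click-through-rate tie-break are all my own choices.

```python
import random

class HeadlineTest:
    """Simplified sketch of an automatic headline A/B test:
    serve two headlines at random during a trial window, then
    settle permanently on the one with the higher click-through rate."""

    def __init__(self, headline_a, headline_b, trial_impressions=1000):
        self.headlines = [headline_a, headline_b]
        self.impressions = [0, 0]
        self.clicks = [0, 0]
        self.trial_impressions = trial_impressions
        self.winner = None

    def serve(self):
        if self.winner is not None:
            return self.winner
        if sum(self.impressions) >= self.trial_impressions:
            # Trial window over: settle on the headline with the better CTR
            ctrs = [c / i if i else 0.0
                    for c, i in zip(self.clicks, self.impressions)]
            self.winner = self.headlines[ctrs.index(max(ctrs))]
            return self.winner
        # Still in the trial: pick a variant at random and count the impression
        choice = random.randrange(2)
        self.impressions[choice] += 1
        return self.headlines[choice]

    def record_click(self, headline):
        self.clicks[self.headlines.index(headline)] += 1
```

A real system would also weigh statistical confidence before settling (as in the significance check earlier), but the shape is the same: explore for a few hours, then exploit the winner for the rest of the story’s life on the homepage.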
I’ve probably seen 25 different versions of the MailChimp homepage. You need only look at http://web.archive.org/web/*/http://www.mailchimp.com and go back in time – or search Google Images – to see the different versions; it’s beautiful to watch the homepage evolve to optimise sign-ups.
37signals (Basecamp/Highrise) ran an A/B test on the headline of their pricing page. They found that “30 day Free Trial on All Accounts” produced 30% more signups than the original “Start a Highrise Account”.