In 2008, a Microsoft employee had an idea: instead of loading Hotmail in the same tab, open a new browser tab every time a user clicks a Hotmail link. But Microsoft isn’t a company that makes decisions blindly, even for minor changes like this one. So they ran an A/B test to see how the update would influence user behavior.
The results exceeded all expectations: user engagement went up by 8.9%. The change was rolled out first in Great Britain and, after another test, expanded to the U.S. and later to the rest of the world.
But Microsoft didn’t stop testing. Another hypothesis that needed validation was opening MSN search results in new tabs. Over 12 million users in the United States tried the change as part of an A/B test, and clicks per user increased by 5%.
Opening links in new tabs was adopted by other Internet companies all over the world. Invented by Microsoft, this simple technique based on just a few lines of code resulted in significant improvement to user engagement.
But would the company have dared to experiment if they weren’t able to test the results of the change first? For Microsoft, one of the world’s largest tech companies, even the tiniest decrease in user engagement can cost millions of dollars.
Why is A/B testing so important?
Experiments drive progress. A/B testing is an approach that allows you to experiment wisely and without taking big risks.
Let’s say you have two variants of an element but cannot figure out which one will work better. You can guess, but you can also run an A/B test to find out which one best achieves your aim. In web development, the main aim is usually to improve conversions or to increase the number of users, the average order value, or the time spent on the website.
The classic way to test two variations of the same website is to split all traffic into two equal parts, show each half a different version, track the difference in conversions, and determine which version performs better.
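The traffic split described above is usually done deterministically, so that the same visitor always sees the same version. A minimal sketch in Python, hashing a (hypothetical) user ID into a bucket:

```python
import hashlib

def assign_variant(user_id: str, variants=("A", "B")) -> str:
    """Deterministically assign a user to a test variant.

    Hashing the user ID means the same user always lands in the same
    bucket, so their experience stays consistent across visits.
    The function and variant names here are illustrative.
    """
    digest = hashlib.sha256(user_id.encode()).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]

# Over a large population, the hash spreads users roughly 50/50.
counts = {"A": 0, "B": 0}
for i in range(10_000):
    counts[assign_variant(f"user-{i}")] += 1
```

Because the assignment depends only on the user ID, no per-user state needs to be stored to keep the experience stable.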
Advantages of A/B testing
A/B testing should be a mandatory step before implementing any change to your website. The change may be minor, but consequences are sometimes much greater than expected, and those consequences are not always positive.
Still not sure if A/B testing is worth it? Here are the main advantages of starting any website redesign with A/B testing:
- Check your hypothesis without taking big risks
You might assume that changing the position of the CTA button will work better for your customers. But permanently implementing the change straight away can ruin your conversion rate. A/B testing lets you test your hypothesis before making any permanent changes that could affect website results. What’s more, without A/B testing you won’t be able to determine whether a change in conversion was caused by your change to the UI or by some other factor.
- Choosing from multiple options
Not sure about the background color for your blog posts? Can’t decide between four different shades ranging from light yellow to turquoise? Let your users choose. You can run an A/B/n test that checks several variants of one element at once. In this case, the test will show you your users’ preferences. Be warned: sometimes your users will prefer something completely different from what you personally like, and that is OK. Listen to your users.
- Find weaknesses in your website
Sometimes weaknesses are hard to discover. Your website might constantly suffer from poor UX without you even realizing what the reason is. Building a new website from scratch is overkill that wastes time and money, and the rebuild might not even address the weakness (because you haven’t figured out what it is). A/B testing lets you discover your mistakes at a significantly lower cost. And knowing the true reason for a poor user experience will prevent you from making the same mistakes in the future.
What can be A/B tested?
Literally anything. Any element of your website that your users see, read, watch or click can be replaced by a variation and tested under real conditions.
You can start with a title – the first thing that attracts your user’s attention. If you change the font size or color, will it affect user behavior? What about experimenting with the call-to-action button? Will users press the green button more often than the red one?
Image position or size can also be very important. Similarly, the text on your page, the format of testimonials, the navigation type, the resolution of the video and so much more can be crucial.
Types of testing
There are two main approaches you can use to test website variations and choose the one that works best.
A/B/n testing
This type of testing is exactly what you would expect: you compare the existing version of your website with one (A/B) or several (A/B/n) variations. The distinctive feature of A/B/n testing is that only one element is varied – for example, the color of the call-to-action button. This makes it easy to deduce exactly what caused an increase or decrease in conversion.
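Once the test has run, you need to decide whether the observed difference in conversion is real or just noise. A common way to do this is a two-proportion z-test; below is a minimal sketch using only the standard library, with made-up conversion figures:

```python
from math import sqrt, erf

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-proportion z-test for an A/B comparison.

    conv_a / conv_b are conversion counts, n_a / n_b are visitor counts.
    Returns the z statistic and a two-sided p-value. Illustrative only.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled conversion rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical result: 2.0% vs 2.5% conversion over 10,000 visitors each.
z, p = two_proportion_z(conv_a=200, n_a=10_000, conv_b=250, n_b=10_000)
```

With these numbers the p-value falls below the conventional 0.05 threshold, so the lift would usually be considered statistically significant.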
Multivariate testing
Like A/B/n testing, multivariate testing compares several versions of your website, but with multiple changes in each version. For example, every version can have a different color, position, and text for the call-to-action button, plus different types of testimonials. Such testing makes it harder to tell which element created the most impact. At the same time, it allows you to find the best possible combination of elements.
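One practical consequence of multivariate testing is combinatorial growth: each version combines one option per element, so the number of versions multiplies quickly and each version receives only a small slice of traffic. A sketch with hypothetical options:

```python
from itertools import product

# Hypothetical elements under test; each version combines one option
# from each list.
colors = ["light-blue", "yellow", "green"]
positions = ["top-left", "top-right"]
texts = ["Buy now", "Get started"]

versions = list(product(colors, positions, texts))
# 3 * 2 * 2 = 12 versions, so each one gets only 1/12 of the traffic,
# which is why multivariate tests need far more visitors than A/B tests.
```

This is why multivariate testing is usually reserved for high-traffic pages: the per-version sample shrinks with every element you add.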
Bandit and Adaptive Algorithms
Bandit algorithms are a variation of A/B/n tests. They don’t divide traffic into equal parts for all variations. The amount of traffic is updated in real time based on each variation’s performance. With time, the variation that shows worse results receives less and less traffic until the website completely switches to the winning version. This approach allows you to save potential conversions by reducing the frequency of less efficient variables without excluding them from the testing.
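The re-allocation loop described above can be sketched with an epsilon-greedy strategy, one of the simplest bandit algorithms: most of the time traffic goes to the current best performer, but a small fraction (epsilon) keeps exploring the other variations. The variant rates below are invented for the simulation:

```python
import random

def epsilon_greedy(conversions, impressions, epsilon=0.1):
    """Pick the next variant to show.

    With probability epsilon, explore a random variant; otherwise
    exploit the variant with the best observed conversion rate.
    A minimal sketch, not a production traffic allocator.
    """
    if random.random() < epsilon or 0 in impressions:
        return random.randrange(len(impressions))
    rates = [c / n for c, n in zip(conversions, impressions)]
    return max(range(len(rates)), key=rates.__getitem__)

# Simulation with made-up true rates: variant 1 converts 3x better.
random.seed(42)
true_rates = [0.10, 0.30]
conversions, impressions = [0, 0], [0, 0]
for _ in range(2_000):
    arm = epsilon_greedy(conversions, impressions)
    impressions[arm] += 1
    conversions[arm] += random.random() < true_rates[arm]
# Over time, the winning variant accumulates most of the impressions,
# while the losing one still receives occasional exploratory traffic.
```

Production systems more often use Thompson sampling or UCB, which handle the explore/exploit trade-off more gracefully, but the shifting-traffic behavior is the same.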
A/B/n and multivariate testing are often treated as an either-or choice. And in most cases, testing is limited to A/B/n only: research shows that companies run 10 A/B tests for every 1 multivariate test. But this means they are not maximizing the potential of variation testing.
One A/B test can show that the light-blue button works better than the yellow one, another test can deduce that the top right corner is a better place for that button than the top left. But this doesn’t necessarily mean that the light-blue button in the top right corner will be the best match. Sometimes, multivariate tests show that the top right corner, combined with the yellow button, performs better. To figure out these combined effects, you need to implement both types of testing.
Even the tiniest change to an existing online project can have big consequences. It can significantly improve conversions, but it can also completely ruin them. Never judge a change based on your personal feelings and intuition; sometimes those feelings are completely wrong.
Fortunately, there is an alternative: A/B testing, a powerful tool that can help you predict the results of every improvement you implement on your website. You can test your ideas without big risks, discover the weaknesses of your online project, and fix them in the best possible way.
Are you ready to unveil the true power of A/B testing? Follow our step-by-step guide to start testing your assumptions the right way. This comparison of two A/B testing tools may also come in handy.