What is A/B testing?
One of the main objectives of any business that uses digital marketing is to increase conversion rates (which generally range from 1% to 3% when marketing activities are effective). But what does “increasing conversion” actually mean? Contrary to popular belief, it is not only about sales volume: a conversion happens whenever a user or potential client completes an action that was previously defined as a business objective. This could mean filling out a form on a landing page, subscribing to a newsletter, or buying a certain product or service.
Achieving a satisfactory conversion rate is not easy, since the whole process is influenced by factors as diverse as the quality of traffic, the products or services on offer, competitor activity, and website layout or design. And these are only a few of the conditions that can hurt or improve the conversion rate.
Many marketers focus most of their initiatives on increasing website traffic. However, once you have achieved that and people are visiting your website, it is essential to make the most of it. By that, we mean turning web traffic into meaningful actions that have a real impact on your other marketing efforts and your bottom line in general. This is where A/B testing becomes very important, as it allows you to examine user behavior and make decisions based on the collected data. A/B testing is widely used in conversion rate optimization, also known by its acronym CRO.
In this post, we are going to look at what A/B testing is, what kind of elements you can examine using this method, what benefits it brings, and some common mistakes to avoid. Not only that, but we are also going to give you some tips regarding A/B testing tools.
What is A/B testing?
An A/B test is an experiment that compares two published versions of a landing page differing by only one element. Its purpose is to check which of the two is more effective. The control version (A) and the variation (B) should be shown at random to different website visitors, with each version ideally served to 1 in 2 visits.
It is important to start with a hypothesis that defines why a certain variable will be tested.
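The 1-in-2 split can be sketched in a few lines of Python. Hashing a visitor identifier (here a made-up `visitor_id`, e.g. a cookie value) instead of drawing a random number keeps the assignment sticky, so a returning visitor is not bounced between versions; the function name and IDs are illustrative, not from any particular tool:

```python
import hashlib

def assign_variant(visitor_id: str) -> str:
    """Bucket a visitor into the control (A) or the variation (B).
    Hashing the visitor ID (e.g. a cookie value) gives a stable
    ~50/50 split, and the same visitor always sees the same
    version on repeat visits."""
    digest = hashlib.md5(visitor_id.encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

# Over many visitors the split converges to roughly 1 in 2.
counts = {"A": 0, "B": 0}
for i in range(10_000):
    counts[assign_variant(f"visitor-{i}")] += 1
print(counts)
```

Sticky assignment matters because a visitor who sees version A on one visit and version B on the next can no longer be cleanly attributed to either variant.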
A/B testing marketing is used in many different digital promotional areas. Regardless of whether you want to test ready-made landing pages, their elements, colors, or corporate identity, performing A/B tests depends on your objectives and needs. Only one variable is compared in each A/B testing experiment. Thanks to a comparison of the two different versions, the following marketing assets can be examined and optimized:
- Web design
- Features and functions that have been modified
- Landing pages
- Specific elements on landing pages
Once an A/B testing experiment concludes, its results should be measured and analyzed. However, the test should not be considered finished until the probability of the results changing has been reduced, that is, until the sample reaches a representative size. Otherwise, false positives may cause the experiment to fail and lead to a negative result for the business. But we will get to some common A/B testing mistakes later in this piece.
The time required for A/B testing results to be representative will depend, above all, on traffic to the website. A/B testing can take a few days or several weeks.
What elements can be examined with A/B testing?
We’ve already listed possible marketing assets that can be tested and optimized using A/B testing. Now let’s see which elements’ performance can be improved thanks to A/B testing marketing. In this case, we will look at the elements of a landing page.
Actually, there are a vast number of elements on a landing page that can be tested depending on your needs. Once a landing page has been chosen, variations of some of the following elements can be created to undergo A/B testing:
- Structure – the distribution of elements or sections of a landing page can be modified or even changed to prioritize certain content over others.
- Navigation – sections can be reduced or expanded in order to understand whether users prefer simplified or expanded amounts of content.
- Copy – the content itself can be tested with longer or shorter bodies of text to see which attracts readers' attention more.
- Visuals/images – these elements have a clear impact on the general perception of a landing page and its attractiveness, so it is interesting to discover what attracts your users the most.
- Forms – changing the copy of certain fields, removing optional ones, or using different colors can increase the conversion rate (which, in this case, directly influences the successful submission of forms).
- Call to action – the color, size, or interaction of a button can considerably influence the conversion rate.
- Buttons – just like CTAs, they play an important role in the user journey, which includes form submissions and page views per session, for example.
Check out the best landing page examples to get some inspiration.
Types of A/B testing
Now you know that you can test numerous elements of a landing page via A/B testing, but there are also a number of other test types to choose from. The most important ones are the following.
Classic A/B test
This is the typical A/B or A/B/x test: a randomized experiment that uses two or more variants of the same landing page. Everything occurs on the same URL. The original version (A) is always used, while variants B to x each contain one or more elements that have been modified from the original.
This type of test is the most common. It is often used, for example, to measure how many clicks each variant gets when the color or the copy of a button is changed.
Split test
Split testing is often mentioned alongside A/B testing or even used interchangeably with it, which is why we want to explain the term. In a split test, you test a control version of your website against a completely different version hosted on a different URL in order to determine which of the two your website visitors respond to best. It is often used when a redesign is performed and both the structure and the UI design are completely new with respect to the original version.
Multivariate test
This type of test is performed when several elements on the same web page are modified simultaneously, to check which layout is the most effective and yields the best results.
This type of test measures the impact of each of the elements that have been modified and how they have affected the previously defined objective.
Benefits of A/B testing
A/B testing brings many advantages for online marketers. Generally speaking, it allows for an objective comparison, focused on how the target audience perceives a landing page (or any other marketing asset), independently of your own point of view. Thanks to numerous A/B testing tools, comparisons can be carried out even without technical expertise, providing clear results that can be applied immediately at the end of the process.
However, A/B testing is only meaningful when you want to compare specific elements with each other. If many aspects are changed at the same time, the analysis of a simple A/B test will no longer be so clear-cut. Additionally, there is always a danger of overwhelming or confusing users when changes are repeatedly published and rolled back. It is therefore advisable to target tests at new website visitors. Finally, there is the question of statistical relevance: for websites with low traffic, it is more difficult and tedious to obtain truly meaningful A/B testing results.
Summary of A/B testing benefits:
- Objective comparisons of different marketing assets.
- Checking one hypothesis per test.
- Reflection of the interests of the target group.
- Accessibility of A/B testing tools.
- Possibility to carry out clear and simple analyses.
- Option to immediately implement the results.
These landing page templates will help you make the most of A/B testing.
How to assess the results of A/B tests?
Running a test and getting results is just the beginning of the journey. Even if the new variant gets a better conversion rate, how do you know that it’s actually a better variant?
That’s where statistical significance comes into play. It determines how likely it is that the same results would be repeated instead of just being a one-time occurrence. There are a few factors that influence statistical significance in A/B testing. One of them is the traffic volume. The more traffic you test your change on, the more likely it is that the result is significant.
When only 10 people see each version of your page, it’s hard to ascertain whether the result is a fluke or not. But when your data pool is in the thousands, it’s a lot more likely that the results reflect reality instead of being random.
The other factor is the conversion rate. When one variant performs a lot better than the other, there is a bigger chance that the test is statistically significant. Keep in mind that both of these metrics should be taken into consideration together.
There is a way to accurately measure the significance of your A/B test. You can find lots of calculators online, but the one by SurveyMonkey should do the trick in your case. Just enter your traffic and conversion stats into the calculator, choose your level of confidence, and you will get a result.
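If you'd rather compute significance yourself than rely on an online calculator, the standard tool behind such calculators is a two-proportion z-test. Here is a minimal, stdlib-only Python sketch; the visitor and conversion counts in the example are made up for illustration:

```python
import math

def ab_significance(visitors_a, conv_a, visitors_b, conv_b):
    """Two-proportion z-test: returns the two-sided p-value for the
    difference between the control's and the variation's conversion
    rates. A p-value below 0.05 is the usual bar for statistical
    significance at 95% confidence."""
    p_a = conv_a / visitors_a
    p_b = conv_b / visitors_b
    pooled = (conv_a + conv_b) / (visitors_a + visitors_b)
    se = math.sqrt(pooled * (1 - pooled)
                   * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

# 2.0% vs 2.6% conversion on 5,000 visitors per variant lands
# just under the conventional 0.05 threshold.
print(round(ab_significance(5000, 100, 5000, 130), 3))
```

Note how the two factors discussed above enter the formula together: the traffic volumes shrink the standard error, while the gap between the conversion rates widens the z-score.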
Common A/B testing mistakes
A/B tests, like any form of analytical work, are prone to mistakes. What’s more, even one mistake can influence the outcome of months-long tests and lead to incorrect conclusions. That’s why we want to focus on some of the most common A/B testing mistakes and explain the reasons for them, so that you can avoid making the same ones over and over again and thus save time and your marketing budget.
1. Changing the control version during A/B testing
Making changes to the control version (the original, A variant) is a bad idea that should be avoided at all costs, especially if these changes affect the elements you are testing. The test results would mean practically nothing if the original version was changed during the testing process.
2. Modifying several elements at once
A/B testing is defined as testing a variation of a landing page that is different from the original version by only one element. One A/B test, one variation. When several different elements on the same page are being tested at the same time, we are no longer talking about A/B testing but a more advanced multivariate test. If you perform A/B testing and make several changes to a landing page then you are undermining the experiment, and it will be very difficult to interpret the results once the test is over. Why? Because you will not know which element influenced the results you obtain.
Suppose you want to use A/B testing instead of multivariate testing. In that case, you should start by testing the most radical changes first and then the more subtle ones, always creating one variation at a time and comparing it with the control version.
3. Including your most loyal visitors in A/B testing marketing
If you have a high rate of recurring visitors, it is best not to suddenly surprise them with a radically different version of your website or landing page. Such users are already loyal to you, but that may be negatively affected by a drastic change on the website even if the new version is a winner in the end.
4. Finishing A/B testing too early or allowing it to run for too long
Each test should be given a statistical reliability rate. This is a confidence level calculated from the duration of the test and the amount of traffic assigned to it. At the same time, you should stop your tests when they have achieved their objective. Waiting around to gather marginal data won’t improve the final result of your A/B testing.
5. Choosing the wrong KPIs
Make sure that your KPIs fit the objective you have in mind. If you choose a KPI that is too far from the target, you will obtain a result that has little to do with your objective. Another common mistake is to choose only one KPI when the test may impact many other metrics. It is even possible that the test will improve the KPI being monitored while worsening others without you noticing it immediately.
6. Not being honest when interpreting data
For your tests to be effective, you must accept the results no matter what they are. Of course, this is easier said than done when you see that a variation you've been working on for weeks is worse than the control version. But the beauty of A/B testing is precisely that it tells you straight which version of your website works best for your users, even if it is not the one you prefer personally.
A/B testing with Landingi
We’ve already mentioned that reliable A/B testing tools can help you carry out A/B tests. And that’s where Landingi comes to your rescue. Landingi is a landing page creator that you can use not only to build landing pages, but also to optimize their performance and increase conversion rates.
Here are the features you can use to create well-performing, highly optimized landing pages.
Split your landing page, change one thing only, compare the two versions, and choose the most effective one. A/B testing is one of the most popular conversion optimization solutions for a reason: when carried out properly, it can deliver outstanding results.
As already mentioned, optimizing a landing page for conversion means keeping an eye on its conversion rate. Using the landing page builder by Landingi, you can measure the number of visits to each of the landing pages that take part in your A/B testing, track conversion rates for every variation over a given period of time, and draw educated conclusions about the performance of each marketing asset being tested.
Landingi also provides different kinds of widgets and extras that you can build your landing page with to attract the attention of your visitors. You can choose from dynamic content like popups adjusted to certain conditions or opt for advanced personalization, which enables you to present tailor-made landing page elements to your users.
Give your tests time
To wrap up, there’s one really important tip you can’t forget about. One of the biggest mistakes marketers make is to end A/B testing too early. Not only does ending a test early waste your efforts, but it also undermines the statistics you've collected so far. So give your tests time and be humble when analyzing the results. You owe that to your visitors!