Kontent.ai loves: A/B tests

A/B testing, showing two versions of a page to two audiences and analyzing which performed better over a specific period of time, is an excellent way to figure out how you can increase conversion on your website.

Jozef Falis

Published on Apr 7, 2021

Performing A/B tests

You create a lot of content for your customers, have visually appealing pages, and use CTA buttons, but are you sure that’s what your target group wants? With A/B testing, you can easily check whether the way you create and present content is really the best approach to attracting your target audience.

We at Kontent.ai built our headless CMS to connect to all kinds of devices and channels, so it does not come with its own A/B testing functionality. However, within a microservices architecture, we can easily integrate any of several A/B testing tools. The advantage is that these tools focus purely on A/B testing, so their conclusions are far more accurate than those of the testing features bundled into traditional CMSs.
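As a purely illustrative sketch of how such an integration might look on the front end (this is not Kontent.ai’s API or that of any specific testing tool), the TypeScript below buckets each visitor into one of two variants deterministically, so the site can request and render the matching content variant:

```typescript
type Variant = "A" | "B";

// Simple FNV-1a string hash, used only to bucket visitors deterministically.
function fnv1a(input: string): number {
  let hash = 0x811c9dc5;
  for (let i = 0; i < input.length; i++) {
    hash ^= input.charCodeAt(i);
    hash = Math.imul(hash, 0x01000193);
  }
  return hash >>> 0;
}

// The same visitor ID always lands in the same variant, so a returning
// visitor keeps seeing the version they were originally assigned.
function assignVariant(visitorId: string): Variant {
  return fnv1a(visitorId) % 2 === 0 ? "A" : "B";
}

// Example: decide which content variant to request from the headless CMS.
const variant = assignVariant("visitor-123");
console.log(`Fetch and render pricing page variant ${variant}`);
```

In practice, a dedicated A/B testing tool handles this assignment (and the statistics) for you; the point is only that the front end, not the CMS, decides which variant to render.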

A/B testing goals

First of all, we need to set a goal for what we’re going to test. If we want to see which of two slightly different versions of a form performs better, then, without any doubt, our experiment will focus on increasing the conversion rate. On the other hand, if we test the tone of voice in a blog post, the results won’t be as straightforward. Nonetheless, we can still evaluate them using metrics such as time spent on the page, bounce rate, or exit rate. But where can one track these metrics?
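Before looking at tools, here is a rough illustration of how those engagement metrics can be derived from raw session counts. The field names and simplified definitions are assumptions made for the sake of the example, not the exact definitions used by any particular analytics product:

```typescript
interface PageStats {
  sessions: number;            // sessions that viewed the page
  singlePageSessions: number;  // sessions that viewed only this page
  exits: number;               // sessions whose last page view was this page
  conversions: number;         // sessions that completed the goal (e.g., a form submit)
  totalTimeOnPageSec: number;  // summed time spent on the page
}

function summarize(stats: PageStats) {
  return {
    conversionRate: stats.conversions / stats.sessions,
    bounceRate: stats.singlePageSessions / stats.sessions,
    exitRate: stats.exits / stats.sessions,
    avgTimeOnPageSec: stats.totalTimeOnPageSec / stats.sessions,
  };
}

// Example: 1,200 sessions with 60 form submissions gives a 5% conversion rate.
console.log(summarize({
  sessions: 1200,
  singlePageSessions: 480,
  exits: 300,
  conversions: 60,
  totalTimeOnPageSec: 96_000,
}));
```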

A/B testing tools for a headless CMS

At Kontent.ai, we mainly use Google Optimize for A/B testing, as it suits our needs best. It works perfectly with the headless CMS, and rendering the A/B test does not disturb the user in any way. Setting up an A/B test is very simple, and a marketer who also knows the basics of HTML can move individual elements on the page without the help of a web developer. In a good A/B testing tool, you can trigger an experiment based on a UTM parameter, JavaScript code, or cookies, as well as on behavioral variables, location, or technology.
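The snippet below is a generic, browser-side illustration of that kind of targeting, not Google Optimize’s actual configuration or API: it enters a visitor into an experiment only if they arrive with a given UTM parameter or carry a (hypothetical) eligibility cookie.

```typescript
// Check whether the current URL carries the expected utm_source value.
function hasUtmSource(expected: string): boolean {
  const params = new URLSearchParams(window.location.search);
  return params.get("utm_source") === expected;
}

// Check whether a cookie with the given name is present.
function hasCookie(name: string): boolean {
  return document.cookie
    .split(";")
    .some((c) => c.trim().startsWith(`${name}=`));
}

// Enter the experiment only for visitors from a (hypothetical) UK search
// campaign, or for visitors flagged by a (hypothetical) "ab_eligible" cookie.
const eligible = hasUtmSource("google-uk-search") || hasCookie("ab_eligible");

if (eligible) {
  // Hand the visitor over to the A/B testing tool / variant renderer here.
  console.log("Visitor enters the experiment");
}
```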

The tool, therefore, gives us a number of options. Imagine we’re running a global campaign and want to find out whether people who arrive from a search campaign in the UK like the tone of voice used on our website. Preparing such an A/B test is a piece of cake, and the same goes for changing the layout for those who reach our site via Facebook ads. However, it all depends on the traffic generated by the page we want to optimize. It’s impossible to draw conclusions from an A/B test performed on a page visited by 100 people. To obtain conclusive results from an A/B test, I recommend evaluating at least 1,000 visits, ideally 1,000 visits per variant.
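To make the “conclusive results” part concrete, here is a minimal sketch, with made-up numbers, of the kind of significance check that sits behind such a recommendation: a two-proportion z-test comparing the conversion rates of variants A and B.

```typescript
// Two-proportion z-test on conversion counts and visit counts per variant.
function zScore(convA: number, visitsA: number, convB: number, visitsB: number): number {
  const pA = convA / visitsA;
  const pB = convB / visitsB;
  const pooled = (convA + convB) / (visitsA + visitsB);
  const se = Math.sqrt(pooled * (1 - pooled) * (1 / visitsA + 1 / visitsB));
  return (pB - pA) / se; // |z| > 1.96 roughly corresponds to 95% confidence
}

// Example: 1,000 visits per variant, 50 vs. 68 trial sign-ups.
const z = zScore(50, 1000, 68, 1000);
console.log(`z = ${z.toFixed(2)}`); // about 1.71 here, i.e., not yet conclusive at 95%
```

With only a few hundred visits per variant, even large differences in conversion rate produce z-scores well below that threshold, which is why low-traffic pages rarely yield usable A/B test results.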

A/B test in practice 

We’ve run a lot of experiments on kontent.ai. Some time ago, we wanted to increase the trial conversion rate on our pricing page, so we used Google Optimize to move the CTA button higher up the page.

[Image: Tested variants on our pricing page]

In less than a month, we had the results of the A/B test and were able to implement the change on the website. The findings indicated that moving the CTA up on the pricing page could bring us more conversions than the original version with the CTA at the end of the pricing column.

Conclusion

It’s important to note that A/B testing is not the same as personalizing the site based on the data we collect about users. A/B testing tools do not know whether the user on the other side is our customer or partner. However, they can answer questions about regional preferences, UX, and content, which makes them great for checking our hypotheses about whether our websites are shaped as well as they can be and serve visitors the way we want them to.

