The Do’s and Don’ts of Video A/B Testing

To improve the performance and efficiency of any marketing channel or funnel, it’s routine for marketers to do something called A/B testing.

A/B testing, also known as split testing, is a technique for comparing how two slightly different versions of the same asset perform. Whichever version brings the better results is the one you release to the market. In video marketing, it’s one of the most common types of tests, and, in fact, one of the most fundamental.

Why Should You A/B test your video?

When it comes to any creative process, there are two groups of people to keep in mind – the producers and the consumers. The producing side consists of video creators, directors, and so on. The problem with releasing any subjective piece is that while the producers may think the video is excellent, the consumers, i.e., the audience, may not feel the same way. And even if it does get a decent response, who’s to say another approach wouldn’t have produced even better results?

Small tweaks to your video call to action could be the difference between a video that leads to a conversion and one that doesn’t. Even if you think you made the right decision, conducting a randomized test is the only way to know whether your new plan is actually going to be effective.

Aspects of your videos that should always be A/B tested

Video thumbnail

A video thumbnail is a static image that shows a preview of your video before viewers click on it. This tiny image can have a significant influence on the play rate of your video. It’s the primary way a potential viewer judges whether or not the content interests them.

Generally, the thumbnail is a shot from somewhere within the video. However, there is evidence to suggest that thumbnails featuring a person are clicked on more frequently than those without one. The right choice depends on your video, but success is easy to measure: it’s the share of people who clicked play on the video.
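As a rough illustration, here is a minimal Python sketch of how play rate could be compared across two thumbnail variants. The impression and play counts are hypothetical placeholders, not real data.

```python
# Minimal sketch: comparing the play rate of two thumbnail variants.
# All counts below are hypothetical placeholders.

def play_rate(plays: int, impressions: int) -> float:
    """Share of people who clicked play after seeing the thumbnail."""
    return plays / impressions if impressions else 0.0

variant_a = {"impressions": 4800, "plays": 612}   # thumbnail without a person
variant_b = {"impressions": 4750, "plays": 731}   # thumbnail featuring a person

for name, data in (("A", variant_a), ("B", variant_b)):
    print(f"Thumbnail {name}: play rate = {play_rate(data['plays'], data['impressions']):.1%}")
```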

Video call to action

Once you’re done producing your lovely video, the whole point of putting it on the innerwebs is getting people to do something through it. The most common (and most efficient) way to achieve this is via a video call to action.

When it comes down to CTAs, even small details matter. Keep factors like the colors, the size of the font, and the font itself in mind. The more variables you’re able to test, the better off you will be in the end.

Video title

For a long time, we’ve known there’s a direct correlation between your video title and the amount of traffic your video gets. This explains the prevalence (and, consequently, the popularity) of clickbait and otherwise irrelevant content. One of the primary rules of video marketing is that if you know what your target audience is searching for, you will be found. It’s all part of the SEO process, and the more adept you are at it, the more people are going to find your video.

The A/B Testing Dos and Don’ts

Do: Test a lot of different variables

There is an almost endless number of variables you can test during your video marketing campaign. Size, color, video title, and templates are just a few of them. In fact, the more you can test, the more you can refine your product to the taste of your market. This will lead to better results than a hastily done test involving just one variable.

Don’t: Test all the variables at once

Efficient A/B testing takes a bit of trial and error. Part of this is understanding how different variables affect the attention span of your viewers, or what particular impact each one has. For example, if you’re trying to find the most efficient video call to action, don’t modify the design, the font, the color, and the message all at once; there would be no way of knowing which change affected the results. Stick to one variable at a time so there’s no guesswork involved when you compile the final data.
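To make the one-variable rule concrete, here is a hedged Python sketch of how visitors might be split between two CTA variants that differ only in their button text. The variant names, CTA copy, and visitor ID are illustrative assumptions, not taken from any real campaign.

```python
# Sketch: bucketing each visitor into one of two CTA variants that differ
# in a single element (the button text). Everything else stays identical,
# so any difference in conversions can be attributed to that one change.
import hashlib

VARIANTS = {
    "A": {"cta_text": "Watch the full demo"},    # hypothetical control copy
    "B": {"cta_text": "Start your free trial"},  # hypothetical challenger copy
}

def assign_variant(visitor_id: str) -> str:
    """Deterministically bucket a visitor so they always see the same variant."""
    digest = hashlib.sha256(visitor_id.encode("utf-8")).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

visitor = "visitor-42"                 # hypothetical visitor identifier
variant = assign_variant(visitor)
print(visitor, "sees variant", variant, "->", VARIANTS[variant]["cta_text"])
```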

There are a few exceptions to this rule, i.e., changing two or more sets of variables and correlating the end results, but this is only relevant if you have a large audience to test with. The team doing the tests also has to be experienced enough to collect, compile, and analyze the data.

Do: Start with the most straightforward metrics first

A/B testing is a pretty easy process to explain. In practice, however, it can get complicated fast. When just starting out, test the most straightforward variables first: this could be the size of a button or its background color. There are a lot more complex tests you could conduct on your site, but the best way to approach the process is to start from the ground and work your way to the top. Besides, even the smallest changes can yield the biggest results.

The best example of this is WikiJob. In 2017, they carried out an A/B test to find out whether the presence of testimonials would increase conversions on their page. It seems like a relatively mundane feature to have on your site, but adding testimonials led to a 34% increase in sales compared to the version without them. Starting with fewer variables is also a great way to test how skilled your team is and to learn how to change things up on the spot. It also creates the perfect foundation for conducting more advanced tests.

Don’t: Be afraid to try more advanced testing

Every business is different, more so when it comes to its target audience and the kind of market it ends up serving. Once you have conducted simpler A/B tests and determined what works best, don’t be afraid to perform more complex ones. These could be anything along the lines of an A/B/C test or even a complete overhaul of your webpage design. Pit the different designs against each other and keep whichever one performs best.

If you’ve been paying attention, this may seem to conflict with the earlier advice not to test all your variables at once. However, at larger scales, far more variables have to be tested, and testing them all separately is entirely unrealistic; even if it were possible, it would take too much time. As your team gains experience with a diverse range of clients, you learn new techniques, and certain metrics can always be tested together and correlated.
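As a hedged sketch of what a multi-variant comparison could look like, the snippet below runs a chi-square test of independence across three hypothetical page designs. The conversion counts are made up for illustration, and the 0.05 threshold is just a common convention.

```python
# Sketch of an A/B/C comparison using a chi-square test of independence.
# The conversion counts below are hypothetical placeholders.
from scipy.stats import chi2_contingency

# Rows = designs A, B, C; columns = [converted, did not convert].
observed = [
    [120, 2380],  # design A
    [151, 2349],  # design B
    [98, 2402],   # design C
]

chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, p = {p_value:.4f}")

if p_value < 0.05:
    print("At least one design converts differently; follow up with pairwise tests.")
else:
    print("No statistically significant difference between the designs.")
```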

Don’t: Assume every result is statistically significant

There are going to be a lot of things you need to test. Don’t go in with the expectation that every single test you conduct will produce results that are significant to your cause. A claim that gets repeated all over the internet is that a single line of copy can mean life or death for your website. While there is some truth to it (after all, every single customer counts), the claim is usually inflated. Changing how a single line is displayed is not, on its own, going to transform the behavior of millions of visitors.

Besides, even if you do make a change and you do see an increase in, say, the number of subscribers to your newsletter, it’s not necessarily the change that caused the increase. Statisticians often say that correlation doesn’t equal causation, and the same principle applies here.

Do: Analyze whether results are statistically significant

On that note, never assume whatever change you made had a direct impact on your business. You will need to dig a little deeper into your video analytics to check whether your tests hold any merit. There are several ways of determining this, but the most common rule of thumb is: if the difference between your data before the change and after the change is small, the variable you tested probably doesn’t matter much.
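One common way to go beyond a rule of thumb is a two-proportion z-test on the conversion rates of the control and the variant. The sketch below is self-contained and uses hypothetical visitor and conversion counts; the 5% significance level is an assumption, not a requirement.

```python
# Minimal, self-contained two-proportion z-test for checking whether the
# lift between a control and a variant is statistically significant.
# All counts are hypothetical placeholders.
from math import sqrt, erf

def two_proportion_z_test(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Return (z statistic, two-sided p-value) for two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal distribution.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

z, p = two_proportion_z_test(conv_a=200, n_a=5000, conv_b=248, n_b=5000)
print(f"z = {z:.2f}, p = {p:.4f}")
print("Significant at the 5% level" if p < 0.05 else "Not significant; treat the lift as noise")
```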

However, don’t be discouraged if your results don’t seem relevant. The beauty of A/B testing is that it reveals both what your viewers care about and what they don’t. And, being testing, it involves a lot of trial and error. If there are more variables you feel could be tested, which is the most likely case for most people, go ahead and redo the whole thing, this time incorporating the lessons you’ve picked up along the way.

Do: Act on the end results

Once everything is done comes the time for action. If your results proved statistically significant, implement whichever version of your content your viewers and users prefer. If they weren’t significant, you’re free to do as you please with the change, since it doesn’t greatly affect viewers either way; it’s time to try something new instead.

Originally published on August 2nd, 2018, updated on April 24th, 2019