Paid social: To A/B or not to A/B?
How well do you know your target audience on social?
Let’s take a test.
- Do you know if they are more likely to engage with a product image, lifestyle image or video?
- Do you know if they are more likely to watch a video if the attached copy includes emojis?
- Do you know if they are more likely to click on your product link if you’ve included a hashtag?
- Do you know if they are more likely to click on a ‘Shop Now’ or ‘Learn More’ button?
- Do you know if they are more likely to watch a vertical, horizontal, or square video for a longer period of time?
If you know the answers to all of these questions, you don’t need this blog. But if you only think you know the answers, based on gut feeling rather than real proof, then you definitely need this blog.
We hear generalised social ad stats and facts all the time. Whether it’s advice to avoid product imagery, or that square videos perform better than horizontal on Facebook, or that younger audiences engage more with emoji-laden copy, we’ve all heard the tips and tricks. But how do you know that these general rules apply to your audience? The fact is, you don’t.
What is A/B testing and why should I do it?
A/B testing is a pretty old marketing tactic, used back in the days of letter mailers. It’s a way to find out your audience’s preferences by making small changes to your comms – in this case, social ads.
Essentially, when A/B testing, your audience is split in two, and each group receives a different version of the ad, with one small change between them.
How small? It could be something like a different CTA, a new image, or tweaked tone of voice in the copy. Your options of what to change vary across platforms: On Twitter, it could be a new poll answer option. On LinkedIn, it might be a different subject line for your InMail.
Then, you monitor your ads and see which performs better. The difference in results could be tiny, meaning your audience has responded much the same (well or poorly) to both. Or, there could be huge differences, where one ad outperforms the other by miles.
This is why you make small changes. Ads that are completely dissimilar could perform differently for any number of reasons. By A/B testing with one tweak, you know that it’s your video shape, hashtag or CTA causing the different results. You can then try testing this with your audience a couple more times with other ads – until you confirm what works best.
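When deciding whether one variant genuinely outperformed the other, rather than the gap being random noise, a common approach is a two-proportion z-test on the click (or view) rates. As a rough illustration of that idea, here is a minimal, stdlib-only Python sketch; the function name and the click/impression figures are made up for the example, not taken from any platform’s API.

```python
import math

def two_proportion_z_test(clicks_a, views_a, clicks_b, views_b):
    """Compare the click-through rates of two ad variants.

    Returns (z, p), where p is the two-sided p-value; a small p
    (conventionally below 0.05) suggests the difference is real
    rather than random noise.
    """
    rate_a = clicks_a / views_a
    rate_b = clicks_b / views_b
    # Pooled click rate under the "no real difference" assumption
    pooled = (clicks_a + clicks_b) / (views_a + views_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / views_a + 1 / views_b))
    z = (rate_a - rate_b) / se
    # Two-sided p-value from the standard normal distribution
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p

# Hypothetical results: variant A got 120 clicks from 5,000 impressions,
# variant B got 90 clicks from 5,000 impressions.
z, p = two_proportion_z_test(120, 5000, 90, 5000)
print(f"z = {z:.2f}, p = {p:.3f}")
```

If p comes out below your chosen threshold, the winning variant is likely a real preference; if not, run the test again with more impressions before drawing conclusions.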
A/B testing can be a long process, but luckily social platforms make it easy to do. ‘Quick duplicate’ options make adding a second ad far faster, and Facebook even offers an A/B testing function that does the hard work for you.
What should I A/B test?
What you should A/B test is different for each brand, but there are a few obvious starting points.
Firstly, are you spending a lot of time or money on content production? Then it’s worth making sure that this is the right content for your audience. Test different imagery and formats with your audience to build basic content creation guidelines, and ensure that you’re not wasting your efforts on poorly performing content.
Next, if you’re attempting to drive link clicks, definitely check your CTAs. Some audiences prefer a more direct form of marketing: ‘Buy Now’ or ‘Shop Here’. Some prefer more subtle approaches: ‘Learn More’. The difference in link clicks between the two can be remarkable, so it’s definitely worth testing.
Finally, tone of voice. Lots of brands have really strict tone of voice guidance that hasn’t changed in years, which can be a huge mistake. After all, some of our ideal buyers are now Gen Z – and they don’t want to be marketed to in the same way as millennials. Try experimenting with your tone of voice, and you can even go a step further and start splitting your audiences. Does the 16-25 audience need a different tone to the 26-35 one? Is it time to start marketing to them separately? A/B testing will get you the answers.
A/B testing: The aftermath
There’s no strict ending to A/B testing. After all, you can continue to make tiny changes throughout an entire campaign. But you will get to a point where the time and effort invested may not be reflected by improved results. Then, it’s time to take your learnings on board and incorporate them into new ad creation – and organic post creation, too. Considering you now know how your audience prefers to be communicated with, it would be a real waste not to apply this when creating organic copy and content.
But always remember that A/B testing isn’t a one-time thing. As mentioned, audiences change over time and we need to keep on top of their preferences. That could mean that you heavily invest time in A/B testing once a year, quarter, or month, depending on your budget. New campaigns could see different messaging, audiences, or content themes, which again you would need to test.
A/B testing may sound like a lot of work, but after your first campaign and the improved results you see, you’ll realise the time, money and energy were worth it.