
If you haven’t been asked to create multiple versions of an ad for an A/B test, you certainly will be soon. A/B testing is a widespread process used to create more effective website designs, product features and ads. We’ve written this guest article with the readers of the Copywriter Collective in mind, focusing on understanding the process of A/B testing advertising copy.

What is A/B testing?

Let’s start with the basics. A/B testing is a method of comparing two or more versions of a piece of content (such as an email, landing page, or ad) to see which performs better. In advertising, these tests are used to compare different versions of copy, headlines, calls to action and images.

How these tests are set up varies greatly depending on the platform being used. Most digital platforms, such as Facebook Ads and Google Ads, have their own integrated tools that allow marketers to run these tests automatically.

Writing copy for more traditional media like newspapers and magazines, on the other hand, involves more manual methods. For example, an advertiser might produce two different versions of a print ad and rotate them with each issue, allowing them to compare which of the two works best.

Why is A/B Testing Important for Copywriting?

Good copywriting combined with smart testing helps you understand how your copy connects with the audience, which in turn helps you write more effective copy. While you will rarely be asked to set up these tests as a copywriter, you will certainly be writing the different versions of ads that get tested.

Running tests is an opportunity to try out new copy elements and answer some of your own questions about what to write. Understanding A/B testing allows copywriters to have more of a say in what gets tested and to make a bigger contribution to marketing teams.

3 Important A/B Testing Frameworks

It all starts with the objective of the test. Why are we running an A/B test in the first place, and what do we want to learn? Here are the three main types of tests that you’ll see used over and over again.

Concept Testing

The objective of this type of A/B test is to compare completely different ideas. For instance, imagine a seasonal ad promoting a holiday special. To measure the effectiveness of this holiday concept, a company could run a concept test: by running the holiday ad at the same time as the regular, non-seasonal version of the same promotion, the test shows which of the two concepts performs better.

Copy Pre-testing

This type of A/B test measures how an audience responds to a specific concept before additional design and marketing resources are invested in it, hence the name ‘pre-test’. Essentially, a low-cost version of an ad, which might only include the new copy, is tested. If the ad does well in the pre-testing phase, more resources can be invested in developing the concept fully. These tests can be run with focus groups or on digital ad platforms like Facebook.

Iterative Testing

The purpose of this A/B test is to make small tweaks to a top-performing ad to try to improve its performance. Because the ad is already doing well, only one or two specific elements of the copy are changed at a time; the change can be as small as a single word. Most companies dedicate a portion of their budget to improving current ads through iterative tests and another portion to testing new ads through concept tests.


All of these ad testing frameworks involve collecting data. With that data in hand, it’s much easier to decide which ad copy to keep and which to discard. Subjectivity and the personal preferences of decision-makers always find their way into the final call, but the whole point of designing tests is to find out what actually works best with the audience.

Creating the Right A/B Test For Your Copywriting Task

Once you’ve picked an objective and know which of the three types of test you are running, you can start thinking about what the test will look like. We’ve broken test design down into three important factors.

  1. Platform Limitations: As we mentioned earlier, A/B tests are channel-specific, so you should be aware of how tests are run on each platform. Some digital ad networks, for instance, let you easily test multiple ads at once. If you’ve written ad copy for Google responsive search ads, you might already know that up to 15 headlines and 4 descriptions can be tested at the same time.
  2. The Copy Element Tested: The second aspect of a test is deciding exactly what to test. A concept test is pretty simple: we’re testing completely new ad copy. Iterative testing is trickier. Think about what you want to learn from your test and what you would need to change in your copy to get that information. Would your audience be more captivated by more persuasive copy? Test it out by changing a few words in your existing ad copy.
  3. The Measurement Metric: You’ve decided what to test; now, how will you know whether the change improved your performance? For a billboard or newspaper ad, you could measure which version generated the most calls. For a Twitter ad, you could measure which version generated the most website visits. Be aware that tests will sometimes produce conflicting metrics: one version of an ad might get more likes, while the other gets more clicks. Establishing the measurement metric ahead of time makes deciding on a winner much easier.

When choosing a metric, prefer one that occurs frequently. For instance, using Facebook likes as a measurement metric is easier than using a rarer event like a sale. The challenge is picking a metric that is both easy to measure and meaningful for the company.
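If it helps to picture how a chosen metric turns into a test result, here is a minimal sketch in Python. The variant names and numbers are hypothetical, purely for illustration; the point is simply that each variant’s raw counts are reduced to the one metric you agreed on beforehand, and the variant with the better value is the provisional winner.

```python
# Minimal sketch: comparing two ad variants on a single, pre-chosen metric (CTR).
# Variant names and figures are hypothetical, for illustration only.

variants = {
    "A (non-seasonal copy)": {"impressions": 10_000, "clicks": 230},
    "B (holiday copy)":      {"impressions": 10_000, "clicks": 280},
}

def ctr(stats: dict) -> float:
    """Click-through rate: clicks divided by impressions."""
    return stats["clicks"] / stats["impressions"]

for name, stats in variants.items():
    print(f"{name}: CTR = {ctr(stats):.2%}")

winner = max(variants, key=lambda name: ctr(variants[name]))
print(f"Better on the chosen metric: {winner}")
```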

Interpreting Your Test Results

Once you’ve run your A/B test, you need to interpret the results. This means comparing the performance of each version and determining which performed better. If you’ve designed your test properly, this part shouldn’t be too difficult.

Naturally, learning which ad worked better also tells you something about your audience’s preferences. Taking our earlier example, if the holiday concept did not perform better than the equivalent non-seasonal concept in your test, you would learn that your audience is not particularly drawn to seasonal promotions for your product or service. Use this information to improve your ad copy in the future.
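One practical caveat when reading results: a small gap between variants can be due to chance rather than a real preference. A common way to check this is a two-proportion z-test on the two click-through rates. Below is a minimal sketch in Python with hypothetical numbers; it illustrates a general statistical approach, not a feature of any particular ad platform.

```python
# Minimal sketch: two-proportion z-test to check whether the gap between two
# ad variants' click-through rates is large enough to trust.
# The figures are hypothetical, for illustration only.
from math import erf, sqrt

def two_proportion_z_test(clicks_a, impressions_a, clicks_b, impressions_b):
    """Return (z statistic, two-sided p-value) for the difference in CTR."""
    p_a = clicks_a / impressions_a
    p_b = clicks_b / impressions_b
    pooled = (clicks_a + clicks_b) / (impressions_a + impressions_b)
    se = sqrt(pooled * (1 - pooled) * (1 / impressions_a + 1 / impressions_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided p-value
    return z, p_value

z, p = two_proportion_z_test(280, 10_000, 230, 10_000)
print(f"z = {z:.2f}, p-value = {p:.3f}")
# A small p-value (commonly below 0.05) suggests the difference is unlikely to
# be chance alone; otherwise, keep the test running or treat it as a tie.
```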

Before ending our guest post, we’d like to mention a tool our company is developing called Flowin. We’re a team of copywriters and advertisers based in Montreal who found reading performance reports time-consuming and unintuitive. To solve this, we developed a web platform that automatically runs your A/B tests and even provides suggestions and new ad copy ideas. If trying out new tools is something you’re interested in, don’t hesitate to reach out!

Continue reading: Should we A/B test politics?

About the Author

William Grigat is the founder of Flowin.so, an AI company developing software for advertising agencies. Being both a programmer and marketer has given him an opportunity to try new technologies and develop solutions to bring them to an industry he loves.