Stop A/B testing creative now? AI is killing traditional testing …

By John Koetsier May 30, 2024

Should you stop A/B testing creative right now? Maybe not, but in a few years you will probably look back on A/B testing as the stone age of creative optimization and wonder, like a kid looking at an old-fashioned rotary phone, did we actually do all of that manual labor?

Hit play, keep reading …

Mom knows best?

It’s kind of the ultimate parental slap-down. 

At a family dinner a few years ago, Ellad Kushnir Matarasso, now director of growth at Alison, told his parents about the A/B testing he was doing to optimize the $100 million annual ad spend he was in charge of at his agency. 

His parents — both computer scientists — looked at him a little funny and asked, sort of like you asking your old-school uncle why he’s still using a point-and-shoot camera, why he was still doing A/B testing when there were much better options.

So much for the $100 million flex.

Fast-forward to today, and Ellad doesn’t have to hang his head anymore at family events.

AI knows best (no more A/B testing)

“When I joined Alison I was really excited to join a company that really introduces a new approach to creative testing,” he told me in a recent Growth Masterminds podcast.

“So rather than having to constantly produce new concepts, new iterations to feed the beast, and then having to test them against each other, which means a lot of time and money spent on both production as well as media budgets, Alison takes a different approach.”

“So essentially once we connect to the advertiser’s ad account, our algorithms go frame by frame to automatically identify each and every element that appears within those creatives — it can be anything from colors, text, sound, characters, facial expressions, literally anything that you can think of that might appear inside those creatives …  and once we’ve mapped out all those elements, we then cross-reference them with the performance data from the campaign, which means that for whichever KPI or metric that you’re optimizing for, we can inform you which creative elements are making the most impact on performance.”

It sounds good. 

It sounds super high-tech.

And it sounds like AI, which it is. 

In fact, the company uses over 15 different AI engines to identify everything inside your creative elements and, instead of using one-off A/B tests, essentially runs a massively multivariate test on pretty much every minute component of your creative to see what correlates with success.
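Alison’s actual pipeline isn’t public, but the core idea — tag every element in every creative, then see which elements correlate with the KPI you care about — can be sketched in a few lines. Everything below is hypothetical and illustrative: the element names, the KPI values, and the simple “lift” metric (mean KPI with an element minus mean KPI without it) are stand-ins for a far more sophisticated multivariate analysis.

```python
# Hypothetical sketch of element-level creative analysis: which tagged
# elements correlate with a KPI across many creatives. Element names and
# data are invented for illustration; real systems use many AI engines
# and proper multivariate models, not a simple difference of means.

def element_lift(creatives):
    """For each element, mean KPI with the element minus mean KPI without it."""
    elements = {e for c in creatives for e in c["elements"]}
    lifts = {}
    for e in elements:
        with_e = [c["kpi"] for c in creatives if e in c["elements"]]
        without = [c["kpi"] for c in creatives if e not in c["elements"]]
        if with_e and without:  # skip elements present in all or no creatives
            lifts[e] = sum(with_e) / len(with_e) - sum(without) / len(without)
    # Highest-lift elements first
    return dict(sorted(lifts.items(), key=lambda kv: -kv[1]))

# Toy data: creatives tagged frame by frame, with conversion rate as the KPI.
creatives = [
    {"elements": {"red_cta", "voiceover"}, "kpi": 0.042},
    {"elements": {"red_cta", "upbeat_music"}, "kpi": 0.051},
    {"elements": {"blue_cta", "voiceover"}, "kpi": 0.019},
    {"elements": {"blue_cta", "upbeat_music"}, "kpi": 0.027},
]

print(element_lift(creatives))
```

In this toy data the red call-to-action ranks highest, which is exactly the kind of element-level insight the approach surfaces; with only a handful of A/B tests, by contrast, you’d be comparing whole creatives and never learn which component drove the difference.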

If it sounds a lot like something Google and the other major social and ad platforms might be doing in the background as they use AI to slice up your creative elements in a blender and then Dr. Frankenstein them back together in a million combinations to see what works best, well, there’s a reason for that.

Both of the biggest digital ad platforms on the planet actually use it themselves.

“I’ll tell you a secret: all these platforms that you mentioned are using Alison,” Kushnir Matarasso says. “So essentially they themselves understand that sure, their algorithm helps you kind of throw in certain stuff into the mix and come up with something new. But in most cases there is a big gap between what they can achieve in terms of performance.”

So where’s the human now?

Using AI doesn’t mean the human goes away, Ellad says.

“We’re big believers in human creativity. Our objective is to amplify that using our technology.”

Part of the reason for that is even when AI knows something, it doesn’t know that it knows it, or know why it knows it. So humans can identify when something just doesn’t quite make sense, or a particular combination of creative elements would reflect poorly on the brand. 

Plus, creating a brand vision and building an expression of that which can really connect to a target audience of potential users/players/customers is still really on the carbon units in the marketing department, as opposed to their digital partners.

Creative is critically important, whether you’re using A/B testing or not

Creative is responsible for up to 89% of the success of your campaigns, according to a 2017 Nielsen study that Ellad cited. 

That’s huge.

Get creative wrong, and you basically just tossed your ad dollars right into a dumpster and set them on fire. And in an era of increasing signal loss, what creative can bring to your campaign success is even more important.

Should you look for best practices?

Interestingly, “we don’t really believe in best practices,” Kushnir Matarasso says.

That makes sense from a lot of perspectives: what works for one brand, on one ad network, at one point in time could be completely different from what works for you. But the other way it makes sense is that best practices are a way of averaging out things that don’t fail spectacularly. 

Modern creative optimization is about finding unicorns that succeed spectacularly.

And while best practices can help you not lose your job … they probably won’t find your next superstar ad either.

That said, one of the biggest mistakes Matarasso sees is reusing creative across platforms: taking what worked on Google to TikTok, or what worked on Snap to AppLovin. Rather, you need a specific approach for each platform, he says.

Does this mean you should never do A/B testing?

Probably not.

But it does mean that the days of A/B testing’s full utility are probably more behind us than in front of us. Marketers will have to do what they have to do, based on the technology they have available to them and the data they can access.

But ultimately, if you can get more insight into all the minutiae of what’s working and what’s not working — and if you’re able to actually use that data intelligently — you probably want it.

Much more in the full podcast

Check it out on our podcast home page, where you can also find links to subscribe to Growth Masterminds on our YouTube page or on your favorite podcast platform.
