Post-truth ad measurement with Hannah Parvaz, Sara el Bachri, David Vargas, Eran Friedman
Can you trust your ad measurement data?
Yeah. Sure. Absolutely.
And no. Totally not. LOL. Are you even serious right now?
Post-truth ad measurement megapanel
I just convened a megapanel on ad measurement. The question burning in the depths of my soul: can you trust your marketing data? When Meta says you got 20,000 installs, is that true? Can you take it to the bank? When TikTok says your campaign was awesome, is that accurate?
On the megapanel we had some of the best of the best in adtech insight:
- Hannah Parvaz
  - App marketer of the year
  - Former Curio, Uptime, Drinki marketer/head of growth
  - Leads Aperture agency
- Sara el Bachri
  - Gaming growth consultant
  - Former Gameloft UA manager
  - Leads SHAMSCO, her agency
- David Vargas
  - UA consultant
  - UA manager for Splitmetrics
  - Partnership manager for Acorns
- Eran Friedman
  - Cofounder and CTO at Singular
  - Serial entrepreneur
Hit play and keep scrolling …
The question of whether you can trust your ad measurement data is a tough 1 right now because almost all measurement is now modeled.
We’re talking AEM on Meta. ADC on TikTok. The new Integrated Conversion Measurement (ICM) on Google. All kinds of probabilistic measurement in pretty much all the networks. And, for mobile marketers, data in App Store Connect and the Google Play Console. Plus, of course, all your first-party data: what you 100% know actually happened in reality because you’re measuring it yourself on your own owned apps, sites, and platforms.
Singular CTO Eran Friedman and I chatted about it a few weeks ago: all the data points that are now available are a bit of a game-changer. They’re actually really, really good.
But there is a problem.
They’re not exactly an easy button. In fact, quite the opposite.
Because, sadly, none of them agree.
Here’s how I summed up what I heard from the panelists:
“All measurements are invalid. All measurements are valid. There is no truth. Everything is true. You can’t trust anything. You have to trust everything.”
What does that mean?
Digging in: the problem with post-truth ad measurement
David Vargas kicked off the ad measurement megapanel without even realizing it by posting this to LinkedIn a few weeks ago:
Here’s just 1 campaign. It’s measured 4 ways. And each measurement is different.
- SKAN: 463 installs
- CPPs in App Store Connect data: 552 installs
- App Referrer: 2,054 installs
- MMP: 1,329 installs
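To put an actual number on that spread, here’s a quick back-of-the-envelope sketch in Python using the figures above. Treating the MMP number as the comparison baseline is just for illustration, not a claim that it’s the ground truth:

```python
# Install counts for one campaign, reported four different ways
# (figures from David's LinkedIn post above)
installs = {
    "SKAN": 463,
    "CPP / App Store Connect": 552,
    "App Referrer": 2054,
    "MMP": 1329,
}

baseline = installs["MMP"]  # arbitrary reference point, not "the truth"

for source, count in installs.items():
    delta_pct = (count - baseline) / baseline * 100
    print(f"{source:<25} {count:>5} installs ({delta_pct:+.0f}% vs. MMP)")

spread = max(installs.values()) / min(installs.values())
print(f"\nHighest source vs. lowest source: {spread:.1f}x apart")
```

Run it and the highest and lowest sources land more than 4x apart.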
I mean if you got 4 different measuring tapes from the local hardware store for a home reno project and got this kind of variance, you’d write a strongly-worded letter to the manufacturer, wouldn’t you?
This is the current state of ad measurement.
And there’s not really anyone to write a strongly-worded letter to.
But it can and does make sense, if you trust everything and trust nothing. (So post-modern, I know.) Keep reading …
Make it make sense (please, pretty please)
The solution is found in 1 single quote from the marketing measurement megapanel, thanks to Hannah Parvaz:
“There is no truth anymore. It’s not about truth now, it’s about triangulation.”
Not enough? Need more?
OK, here’s another quote, this 1 from David Vargas:
“You have to stitch all the breadcrumbs together to see what’s really going on.”
And yet another, from Eran Friedman:
“The discrepancies themselves can give you insight into user behavior.”
Let’s let Sara el Bachri have her turn too.
“Basically we must corroborate and check different sources for data, and cross-check different sources of data.”
That’s the easy answer, she says. (Easy, right?) The more difficult answer, she adds, is knowing which of the many data sources to use to cross-check the others, and how to use all the different sources together. To add to the degree of difficulty, the right mix can vary by app, game, vertical, scale, and monetization methodology.
In other words, you need a solution for the solution.
How can you find that?
Scroll down a bit …
First, there is some good news about ad measurement
There is some good news I’ll get to first, before the solution to the solution.
Here it is: platforms under-report.
Like massively.
“I’ve seen in every single case across many, many different accounts, and we’ve checked this significantly,” says Parvaz. “The platforms are underreporting and it’s a worst-case scenario.”
The worst-case scenario for platforms is pretty good for you, though. In her testing, she’s seen actual campaign results come in at up to 4.4X the platform-reported numbers. This is massive. This is getting 4,400 installs when Meta only says it sent you 1,000. This is literally huge.
You can test this for yourself, Parvaz says.
Set up a Custom Product Page so you can track the results for just that page via App Store Connect. App Store Connect should know exactly, precisely, completely, and totally accurately how many app installs happen via this CPP. Now funnel traffic to it from Meta, Google, TikTok, Snap, and other ad networks. (Maybe try 1 network at a time per CPP so your data is super clean.)
Finally, compare the numbers from the ad network side versus the App Store Connect side.
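Here’s that comparison as a minimal sketch, using the hypothetical 4,400-vs-1,000 example from above; swap in your own App Store Connect CPP installs and the network’s claimed installs for the same campaign and date range:

```python
# Compare installs App Store Connect credits to a Custom Product Page
# against what the ad network claims it delivered, for the same campaign
# and the same date range. Placeholder numbers; use your own.
asc_cpp_installs = 4400          # installs attributed to the CPP in App Store Connect
network_claimed_installs = 1000  # installs the ad network says it drove

uplift = asc_cpp_installs / network_claimed_installs
print(f"Actual installs are {uplift:.1f}x what the network reported")

if uplift > 1:
    print("The network is under-reporting: you got more than it claims.")
else:
    print("The network is over-reporting relative to App Store Connect.")
```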
“I’ve never seen it not have the custom product page data be higher than it is inside the platform,” Parvaz says.
But that’s not all. In a lot of cases, that’s only capturing direct click-through attribution, she adds. There’s view-through that adds to the numbers … even if those installs aren’t captured directly by a very specific Custom Product Page.
So the good news about ad-network-based ad measurement is that the networks are under-reporting: they’re giving you more than they claim. Which, I think we can all agree, is nicer than the alternative.
There’s some even better news in our post-truth marketing measurement world
But there’s even better news to come.
You can do this kind of work for each ad network yourself: first, verify that you’re getting what the network says you’re getting; second, check for any organic or view-through uplift on those claimed numbers. Or you can get Singular to do it for you.
Because … Unified Measurement.
Singular does it for you. It’s not red, but it’s still kind of a big easy button.
Because Singular tosses marketing spend data, ad delivery data, ad network claim data, SKAN & IDFA data on iOS, GAID data, IDFV data, Google referrer data, your own first-party in-app activations and engagement data, your own first-party revenue data, and quite a bit more into a big fat blender and comes up with Unified Measurement.
OK, it’s not a blender.
It’s a little more sophisticated than that.
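For intuition only: this is emphatically not Singular’s actual model, just a toy sketch of the general idea. You can think of reconciling several sources into one estimate as something like a trust-weighted blend; the sources, counts, and weights below are all hypothetical placeholders.

```python
# Toy illustration of reconciling several install counts into one estimate.
# NOT Singular's actual methodology: sources, numbers, and trust weights
# here are hypothetical placeholders.
sources = {
    "SKAN":                    (463, 0.2),
    "App Store Connect (CPP)": (552, 0.3),
    "Ad network claimed":      (1000, 0.1),
    "MMP / first-party data":  (1329, 0.4),
}

total_weight = sum(weight for _, weight in sources.values())
unified = sum(count * weight for count, weight in sources.values()) / total_weight
print(f"Unified install estimate: {unified:.0f}")
```

The real thing layers in all the data sources listed above plus deduplication and modeling, which is why it’s more sophisticated than a blender.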
And it’s a fact: I’ve seen multiple occasions recently where Singular customers see exactly the kinds of discrepancies we’re talking about. That sends them digging into each of the datasets. And they invariably discover that the Singular modeled result in Unified Measurement is actually the most accurate number available.
“If you’re looking for the dataset that is the most real-time, the most granular, the most actionable on the day-to-day operational side, that’s typically that user-level modeled kind of information coming from your MMP’s dashboard,” says Friedman.
Add AMM from the Meta side and the new ICM from the Google side, and there’s actually a good outlook for the future of mobile measurement.
Which is why, in spite of the broken tape measures and the resulting massive confusion, it’s kinda almost a measurement golden age.
As long as you remember David Vargas’ words:
“I always say: trust nothing and everything at the same time.”
And … use the differences to your advantage.
Much more in the full megapanel
There’s so much more if you watch or listen to the whole thing.
Subscribe to our podcast, and follow us on YouTube.
In the full megapanel you’ll hear about which segments and demographics are more “clicky” than others, and which you need to measure via view-through because they just don’t tend to click. (Which also impacts platforms and ad networks, by the way.) You’ll also hear about MMM, incrementality, probabilistic measurement, and a ton of other smart, cool stuff that will make your job easier and make you better at it.
Here’s some more of what you’ll find:
- 00:00 Introduction to Growth Masterminds
- 00:30 Meet the megapanel
- 03:11 David’s LinkedIn Post: The Catalyst
- 04:54 Panelists Share Personal Insights
- 08:06 Diving into Measurement Challenges
- 12:12 Hannah on Underreporting Issues
- 15:08 Eran’s Perspective on Measurement
- 18:29 Platform Differences in Accuracy
- 19:53 Trust in Data: A Paradoxical Approach
- 20:32 Navigating Paid Campaigns with Probabilistic Methodologies
- 21:49 Real-Time Data Optimization
- 24:06 Inconsistencies in Attribution and Their Impact
- 27:40 The Return of Meta’s AMM
- 33:18 The Competitive Landscape of iOS Advertising
- 35:14 Final Thoughts and Best Practices