THE LEAD

Nielsen's 2024 Annual Marketing Report found that only 54% of marketers are "confident" in their ability to measure ROI across channels. That number has been flat for 3 years.

The reason it hasn't moved is structural. Attribution models measure exposure, not causation. Every channel that touched a customer before a purchase gets credit for that purchase. Google says Google did it. Facebook says Facebook did it. The retargeting vendor says retargeting was the hero.

They're all right, technically. And the sum of their claims adds up to roughly 3 times your actual revenue.
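The arithmetic is easy to see with hypothetical numbers. Assume a fixed month of actual revenue and let each platform's dashboard claim every conversion it touched (the channel names and figures below are illustrative, not from the story that follows):

```python
# Hypothetical numbers illustrating attribution double-counting.
# Actual revenue is fixed; each vendor claims credit for every
# conversion it touched, so the claims overlap.
actual_revenue = 100_000

# Per-channel "attributed revenue" as each vendor reports it.
claimed = {
    "google_ads": 85_000,
    "facebook": 70_000,
    "retargeting": 65_000,
    "email": 55_000,
}

total_claimed = sum(claimed.values())
inflation = total_claimed / actual_revenue

print(f"Sum of channel claims: ${total_claimed:,}")
print(f"Inflation vs. actual revenue: {inflation:.2f}x")
```

Every individual dashboard is internally consistent; it's only when you sum them against actual revenue that the inflation shows.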

This is what incrementality testing exists to fix. It answers the one question attribution can't: would this conversion have happened anyway?

The method is simple. Pick a channel. Turn it off for 2 weeks. Watch what happens to revenue. If revenue drops, the channel was doing real work. If revenue stays flat, you were paying for conversions that were going to happen regardless.

A marketing team ran this test on their display retargeting. They'd been spending $40,000 a month. The attribution dashboard showed retargeting participating in 35% of their conversions. It looked essential.

They turned it off. Revenue didn't move. $480,000 a year on attribution theater.

The test took 14 days. It cost nothing. And it answered a question their $200K analytics stack couldn't.

THE FRAMEWORK: The 2-Week Holdout Test

You can run this without a data scientist, a testing platform, or a budget.

Week 0 (baseline). Pick your second-highest spend channel. Log total revenue and conversions for the previous 7 days. Use whatever happened last week. Don't cherry-pick a good week.

Weeks 1-2 (holdout). Turn the channel off completely. No reduced budgets. Kill it entirely for 14 days.

Measure. Compare revenue and conversions to baseline.

Interpret the results.

Revenue dropped more than 10%? The channel was driving real incremental revenue. Turn it back on and test a different one next.

Revenue stayed within 5%? The channel was probably taking credit for organic demand. You just found budget to reallocate.
Revenue dropped between 5% and 10%? Inconclusive. Extend the holdout another week before deciding.

Revenue went up? More common than you'd think. Usually means the channel was generating low-quality traffic that cluttered your funnel.
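The decision logic above fits in a few lines. Here's a minimal sketch, assuming you compare weekly averages (the baseline is 7 days, the holdout 14); the function name is hypothetical and the thresholds are the rules of thumb from this framework:

```python
def interpret_holdout(baseline_weekly: float, holdout_weekly: float) -> str:
    """Classify a holdout result using the thresholds above.

    baseline_weekly: revenue for the 7-day baseline week.
    holdout_weekly: average weekly revenue across the 14-day holdout.
    """
    change = (holdout_weekly - baseline_weekly) / baseline_weekly

    if change <= -0.10:
        return "incremental: channel drives real revenue, turn it back on"
    if abs(change) <= 0.05:
        return "non-incremental: channel likely claims organic demand"
    if change > 0.05:
        return "negative: channel may have added low-quality funnel noise"
    return "inconclusive: drop between 5% and 10%, extend the holdout"

# Example: $120,000 baseline week vs. $118,500/week during the holdout.
print(interpret_holdout(120_000, 118_500))
```

Normalizing to weekly revenue matters: comparing a 7-day baseline total against a 14-day holdout total would make every channel look twice as good as it is.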

What to expect by channel. Display retargeting tends to show low incrementality (10-15% actual vs. 30-40% attributed). Branded paid search captures demand that organic would have handled anyway (5-15% real impact). Cold paid social holds up well, especially in B2B. Email nurture sequences need a 4-week holdout because their impact takes longer to surface.

Your results will vary. That's the point.

THIS WEEK ON THE BLOG

The full breakdown of incrementality testing for marketing teams who don't have a PhD, a data science team, or a $50K testing platform. The post walks through the holdout test step by step, benchmarks by channel type, and the one conversation with your CFO that changes everything about how marketing and finance work together.

THIS WEEK ON PROFESSOR LEADS

Five shorts, one theme: prove your channels work (or don't).

"The Pause Test" introduces the holdout concept in 35 seconds. "The Channel That Didn't Survive" tells the $480K/year retargeting story. "The Double-Count Problem" shows why attribution dashboards inflate revenue by 2-3x.

"The Template" walks through the holdout setup with zero tools required. "The CFO Conversation" reframes the holdout as the meeting that changes the marketing-finance dynamic.

Free incrementality testing template: https://professorleads.com/tools/incrementality-guide

WORTH YOUR TIME

Nielsen's 2024 Annual Marketing Report. The study behind the 54% confidence number. Despite increased spending on measurement tools, the confidence gap has barely moved since 2021. Worth reading the executive summary for the benchmarks alone. Read it: https://www.nielsen.com/insights/2024/annual-marketing-report/

Rand Fishkin on attribution's biggest lie. SparkToro's piece on why attribution models are getting less accurate as privacy regulations tighten, cookie deprecation progresses, and dark social grows. His proposed alternative (triangulation between attribution, incrementality, and brand tracking) aligns with the holdout methodology. Read it: https://sparktoro.com/blog/attribution-is-dying/

Harvard Business Review on marketing measurement. A 2023 HBR piece on why randomized controlled experiments (holdout tests) are the gold standard. The authors argue that most A/B testing in marketing is flawed because it doesn't account for selection bias. Read it: https://hbr.org/2023/01/a-refresher-on-randomized-controlled-experiments

ONE THING TO TRY THIS WEEK

Pick your second-highest spend channel. Run one query: what percentage of conversions attributed to that channel also had touchpoints from 2+ other channels? If the answer is above 50%, that's your double-count problem in one number. The query measures the size of the overlap; a holdout test is the only way to find out how much of that credit is real.
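If your analytics export gives you one record per conversion with the channels that touched it, the query is a few lines of Python. The data shape and channel names below are illustrative; substitute your own export:

```python
# Hypothetical conversion log: each record lists the channels that
# touched the customer before purchase.
conversions = [
    {"id": 1, "channels": {"retargeting", "paid_search", "email"}},
    {"id": 2, "channels": {"retargeting"}},
    {"id": 3, "channels": {"retargeting", "organic", "paid_social"}},
    {"id": 4, "channels": {"paid_search", "email"}},
    {"id": 5, "channels": {"retargeting", "email", "paid_search"}},
]

target = "retargeting"  # your second-highest spend channel

# Conversions the target channel claims credit for...
touched = [c for c in conversions if target in c["channels"]]
# ...that also had touchpoints from 2+ OTHER channels.
shared = [c for c in touched if len(c["channels"] - {target}) >= 2]

overlap_pct = 100 * len(shared) / len(touched)
print(f"{overlap_pct:.0f}% of {target}-attributed conversions "
      f"had touchpoints from 2+ other channels")
```

In this toy data, 3 of the 4 retargeting-attributed conversions were also touched by at least two other channels, which is exactly the above-50% signal the query is looking for.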

William DeCourcy

Professor Leads

Forbes Business Development Council contributor

#ProfessorLeads #LeadGeneration #B2BMarketing #B2CMarketing #PerformanceMarketing #MarketingMetrics
