The most underrated metric in media planning? Knowing when to stop

Sep 12

MMM is too slow for today’s fast-paced digital media cycles. Brands need to know in real time when a channel’s effectiveness is fading and when to stop spending.


I didn’t start a media measurement company 12 years ago because I love dashboards.

I just hate bad measurement.

Back then, media teams had nothing but neat-looking charts with no substance, dodgy CPAs and attribution models that rewarded the wrong channel.

Now, they are still relying on tools that look smart but tell the wrong story. Some are — let’s be frank — complete bullshit.

We want to stop guessing in a fragmented digital world where quicker decisions are increasingly important.

And, crucially, we want to know when to stop spending on media.


The truth about MMM

Marketing mix modelling (MMM) is too slow for how media works today. It was built for long-term budget setting: annual reviews, print, TV, radio, outdoor.

Digital media now absorbs over two-thirds of global adspend. Marketers can’t afford to wait six months for effectiveness reports; they have faster spend cycles, more pressure to justify channel choices and less room for waste.

Planners are increasingly stuck when MMM takes too long to run — it’s too expensive to refresh often and isn’t granular enough to help with real optimisation.

Worse, MMM fails to answer today’s most important question: where should the next pound go and when should we stop?


We need a GPS, not just a compass

Most media metrics — CPAs, CTRs, impressions — are just snapshots of performance and can mislead.

Response curves can show the shape of performance at different activity levels, measuring how outcomes (sales, visits, conversions) change with spend. They help answer two essential questions:

• What return are we getting at different spend levels?
• At what point does another channel deliver more marginal value?

It’s subtle but important: dashboards tell you what happened last week, while response curves show what’s still worth doing this week.
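The idea is easy to sketch in code. The saturating curve below is purely illustrative (the function shape and parameters are hypothetical, not fitted to any real channel), but it shows the property that matters: the return on the *next* pound falls as spend rises, which is exactly what a snapshot metric like CPA hides.

```python
import math

def response(spend, cap=2_000_000, k=500_000):
    """Illustrative saturating response curve: outcomes approach `cap`
    as spend grows; `k` controls how quickly returns diminish.
    (Hypothetical parameters, not fitted to real data.)"""
    return cap * (1 - math.exp(-spend / k))

def marginal_return(spend, step=1_000, **params):
    """Approximate outcome gained per pound of the next £`step` of spend."""
    return (response(spend + step, **params) - response(spend, **params)) / step

# Marginal value falls as spend rises: the curve answers
# "is the next pound still worth it?", not "what happened last week?"
for s in (0, 250_000, 500_000, 1_000_000):
    print(f"£{s:>9,}: marginal return ≈ {marginal_return(s):.2f}")
```

The exact functional form doesn't matter here; any curve with diminishing returns makes the same point.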

This matters whether you’re an independent agency or global network: MMM still has value, especially for annual forecasting and explaining long-term brand effects, but it’s not an optimisation tool.

It doesn’t tell you how to respond to a £150k top-up mid-campaign and it can’t give you audience-level nuance without months of data wrangling.

MMM is a compass, not a GPS. Yes, we still need a map of long-term effectiveness, but avoiding costly short-term errors is vital too.

Meanwhile, most attribution models rely on stitched-together data from different platforms, each with its own world view. You need response curves that are built on actual consumer behavioural data, where each user’s full media exposure and outcome journey is tracked.

This means you can isolate real effects, compare platforms like for like and model spend scenarios with confidence.

Then you know when to change course, with evidence you can show clients or finance chiefs.


A lesson in diminishing returns

Let’s say you’re planning for a major UK food-delivery brand.

We ran response curve analysis for this category across three channels — TV, YouTube and TikTok — using single-source behavioural data (no first-party input required).

Where does the next pound deliver the most return? We found the best marginal return on investment mix for a £1m budget was:

• TV (£600k): Broad reach, still strong marginal returns
• YouTube (£260k): Effective, but expensive at higher levels
• TikTok (£140k): Super-efficient at low spend, but saturates quickly

But what if the budget was just £50k? Very different answer: spend all of it on TikTok.

At low spend, TikTok delivered outsized impact. Beyond that? Diminishing returns.

If this brand had relied on platform averages or last-click attribution, they’d likely have over-invested in the wrong place. With response curves, they could allocate by effectiveness at every tier of spend.
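This kind of tiered allocation can be sketched as a greedy search: hand each increment of budget to whichever channel currently offers the highest marginal return. The channel curves below are invented for illustration (shapes chosen so TikTok is hyper-efficient at low spend but saturates quickly, while TV sustains returns at scale), not the single-source data behind the numbers above.

```python
import math

# Hypothetical saturating curves per channel, outcome = cap * (1 - exp(-spend/k)).
# Shapes are assumptions for illustration: TikTok efficient at low spend but
# quick to saturate, TV slower to saturate at scale.
CURVES = {
    "TV":      (3_000_000, 900_000),
    "YouTube": (1_200_000, 350_000),
    "TikTok":  (  500_000,  60_000),
}

def outcome(channel, spend):
    cap, k = CURVES[channel]
    return cap * (1 - math.exp(-spend / k))

def allocate(budget, step=10_000):
    """Greedy allocation: give each £`step` to the channel with the
    highest marginal return at its current spend level."""
    spend = {c: 0 for c in CURVES}
    for _ in range(budget // step):
        best = max(CURVES, key=lambda c: outcome(c, spend[c] + step) - outcome(c, spend[c]))
        spend[best] += step
    return spend

print(allocate(1_000_000))  # larger budgets tilt heavily toward TV
print(allocate(50_000))     # all £50k goes to TikTok under these curves
```

Even with made-up curves, the qualitative answer flips with budget size, which is the point: the right mix at £1m is the wrong mix at £50k.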

We see similar patterns for many other brands: TV's returns diminish slowest while TikTok's saturate fastest. This applies to brands with smaller media budgets, too, because efficiency matters most when money is tight.

Effectiveness is not always about “doing better measurement”. It’s about making faster, smarter decisions that minimise the cost of delay or overconfidence.


Our media map needs better stop signs

Media is much more fragmented than it was 12 years ago, meaning the best plans are no longer just built on “what works”.

They’re built on knowing when something stops working and having the data to act. That something can stop working much sooner than you might think.

Our industry needs a better map of the limits of effectiveness. What worked for your last campaign might be wasting money in the next.


