Here's a question most performance marketers are afraid to ask: how much of your ad spend is actually working?
Not the spend that generates impressions or clicks or a respectable-looking CTR in a dashboard screenshot. The spend that produces revenue. The rest — and in most accounts, it's substantial — is money handed to ad platforms while your media buyer stays comfortable with metrics that look good but don't tell the truth.
These are five media buyer performance metrics that separate accounts run with discipline from accounts run on autopilot. Pull your numbers. The gaps will be obvious.
Impression Share Lost to Budget
Most buyers watch spend pacing. Almost none watch impression share lost to budget: the share of eligible auctions your ads never entered because the budget ran out at the wrong time of day.
If this number is above 20%, your budget allocation is off. You're winning the cheap morning auctions and missing the high-intent evening slots when your audience is ready to convert. That's not pacing — it's leaving ROAS on the table by accident.
A competent buyer catches this within a day. Most don't audit it at all until a client asks why CPAs spiked.
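A back-of-envelope sketch makes the stakes concrete. Every number below is an illustrative assumption, not a benchmark: plug in your own impression count, lost-IS figure, CTR, conversion rate, and average order value.

```python
# Sketch: revenue sitting in the auctions you were priced out of.
# All inputs are hypothetical example values, not industry benchmarks.

def missed_revenue(impressions_won: int,
                   lost_is_budget: float,  # fraction of eligible impressions lost to budget
                   ctr: float,
                   cvr: float,
                   aov: float) -> float:
    """Estimate revenue left in impressions missed due to budget."""
    eligible = impressions_won / (1 - lost_is_budget)  # total eligible impressions
    missed = eligible * lost_is_budget                 # impressions you never served
    return missed * ctr * cvr * aov

# 100k impressions won, 25% lost to budget, 1.5% CTR, 3% CVR, $80 AOV
print(round(missed_revenue(100_000, 0.25, 0.015, 0.03, 80.0), 2))  # ~1200.0
```

Under these toy assumptions, a 25% lost-IS figure is roughly $1,200 of revenue per 100k impressions won, every period, that never had a chance to happen.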
Creative Decay Rate
Every ad creative has a lifespan. Audiences see it, frequency climbs, CTR drops, CPA rises. This is not a surprise — it's physics. But most media buyers don't measure when it happens or how fast.
Creative decay rate tracks how quickly your top performers degrade: CTR week-over-week, frequency thresholds that predict cost spikes, and the lag between "this creative is dying" and "we finally paused it." For most accounts, that lag is 7-14 days of overspend on already-dead creative.
Multiply that by every campaign you run, every month. The number gets uncomfortable fast.
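The multiplication is simple enough to sketch. This assumes a "dead" creative's marginal return is roughly zero, and the thresholds and dollar figures are made-up examples, not recommendations:

```python
# Hypothetical example: flag decay, then price the pause lag.

def is_decaying(weekly_ctr: list[float], drop_threshold: float = 0.2) -> bool:
    """Flag a creative whose CTR fell more than the threshold week over week."""
    return any(curr < prev * (1 - drop_threshold)
               for prev, curr in zip(weekly_ctr, weekly_ctr[1:]))

def pause_lag_overspend(daily_spend: float, lag_days: int, campaigns: int) -> float:
    """Spend poured into already-dead creative across the account, per cycle."""
    return daily_spend * lag_days * campaigns

print(is_decaying([0.021, 0.020, 0.014]))  # week 3 dropped ~30% -> True
print(pause_lag_overspend(200.0, 10, 8))   # $200/day, 10-day lag, 8 campaigns
```

At a hypothetical $200/day per campaign, a 10-day pause lag across 8 campaigns is $16,000 of spend on creative that had already stopped working.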
Audience Overlap Rate
Running five ad sets targeting "lookalike 1-3%", "lookalike 3-5%", "interest: fitness", "broad", and "retargeting"? Congratulations — you're probably competing against yourself in 30-40% of auctions, driving up your own CPMs and reporting it as "competitive market conditions."
Audience overlap is one of the most common sources of wasted ad spend in mature accounts, and it's almost never proactively audited. Buyers keep adding new ad sets but rarely check how those sets cannibalize existing ones, even though platform tools for measuring overlap exist and sit unused.
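The audit itself is trivial once you have (or can approximate) audience membership. A minimal sketch, using toy user-ID sets rather than real platform exports:

```python
# Sketch of an overlap audit between two ad sets.
# The ID sets are toy data; in practice you'd use the platform's
# own overlap tooling or exported audience estimates.

def overlap_rate(a: set, b: set) -> float:
    """Share of the smaller audience that also sits in the other ad set."""
    if not a or not b:
        return 0.0
    return len(a & b) / min(len(a), len(b))

lookalike = {1, 2, 3, 4, 5, 6, 7, 8}
interest = {5, 6, 7, 8, 9, 10}
print(round(overlap_rate(lookalike, interest), 2))  # 4 shared of 6 -> 0.67
```

A 67% overlap like the toy case above means two "different" ad sets are mostly bidding on the same people, and against each other.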
Time-to-Optimization After Anomaly
Your CPA doubles on a Tuesday afternoon. When does someone notice? When do they act? Most accounts: 18-36 hours later, during the next scheduled check-in or when a client emails asking what happened.
Time-to-optimization after anomaly is the single clearest measure of how actively managed your campaigns actually are. In a manual operation, this number is always measured in hours — because humans sleep, have meetings, manage multiple clients, and check dashboards on a schedule.
Every hour of delay on a spending anomaly is direct budget loss. A $500/day campaign running at 3x target CPA for 24 hours isn't an inconvenience — it's a significant and measurable failure.
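The arithmetic on that example is worth doing explicitly. One way to frame the waste: spend that ran at 3x target CPA only bought a third of the conversions it should have, so two-thirds of it performed below target. A hedged sketch of that framing:

```python
# Sketch: budget burned during an unhandled CPA anomaly.
# Framing assumption: spend at N-times target CPA delivers 1/N of
# the conversions the same dollars buy at target efficiency.

def anomaly_waste(daily_spend: float, cpa_multiple: float, hours: float) -> float:
    """Spend that bought nothing at target efficiency during the anomaly."""
    spend = daily_spend * hours / 24
    effective = spend / cpa_multiple  # portion that performed at target CPA
    return spend - effective

# The example from the text: $500/day at 3x target CPA for 24 hours
print(round(anomaly_waste(500.0, 3.0, 24.0), 2))  # ~333.33
```

Roughly $333 of that day's $500 bought nothing at target efficiency, and that's one campaign, one anomaly, one day.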
Cross-Channel ROAS Attribution Accuracy
Ask your media buyer what your true blended ROAS is across all channels. Then ask them to show you the last-touch vs. first-touch vs. data-driven attribution comparison, and how they adjust bids based on the difference.
Most can't. Not because they're incompetent — because running accurate cross-channel attribution is genuinely complex and time-intensive to maintain. It requires correlating data across platforms that all claim credit for the same conversions, removing overlap, and making bidding decisions based on the actual incremental value of each channel.
What actually happens in most accounts: last-click attribution drives budget decisions, channels that assist conversions are systematically underfunded, and the "winning" channels on paper are winning because of how you're counting — not because of how they're performing.
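The counting effect is easy to demonstrate on toy data. The conversion paths below are hypothetical journeys (oldest touch first), invented purely to show how the credit split flips between models:

```python
from collections import Counter

# Hypothetical conversion paths, oldest touchpoint first.
paths = [
    ["meta", "google_search"],
    ["meta", "email", "google_search"],
    ["google_search"],
    ["meta", "google_search"],
]

last_touch = Counter(p[-1] for p in paths)   # credit to the final touch
first_touch = Counter(p[0] for p in paths)   # credit to the first touch

print(last_touch)   # google_search takes all 4 conversions
print(first_touch)  # meta takes 3, google_search takes 1
```

Same four conversions, two completely different budget narratives: last-touch says fund search and cut everything else, first-touch says the prospecting channel is carrying the account.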
The Pattern Behind All Five
Look at these metrics together and a pattern emerges: they're all things a diligent buyer could track, but in practice rarely does at the frequency required to act on them before damage is done.
That's not a people problem. It's a capacity problem. A human managing 10+ accounts across multiple clients and platforms simply cannot monitor every metric at the granularity required to prevent waste. They triage. They prioritize. They miss things.
An autonomous AI agent doesn't triage. It monitors everything, all the time, and acts the moment a threshold is crossed. That's not a marginal improvement in ROAS optimization — it's a structural one.
If you want to understand the broader shift happening in performance marketing — not just how AI catches waste, but how it's replacing entire teams — read: How AI Agents Are Replacing Media Buying Teams.
Stop Losing Money to Metrics Your Buyer Ignores
Get early access to an AI agent that monitors all five of these — and acts on them — around the clock.