You can run ads in AI Overviews. You can run ads in Microsoft Copilot. Soon you'll be able to run ads in ChatGPT.
But can you actually measure any of it?
Not really. And that's a problem nobody's talking about enough.
The Measurement Gap
Google's official position on AI Overview ad reporting: "Google Ads currently doesn't offer segmented reporting when ads show within Search AI Overviews."
Let that sink in. Ads in AI Overviews have been live for over a year. They now appear on roughly 40% of search results pages that show an AI Overview. And Google provides zero dedicated metrics for this placement.
You can't see:
- How many impressions came from AI Overview placements
- Click-through rate specifically for AI Overview ads
- Conversion rate for AI Overview traffic
- Cost per acquisition from AI Overview placements
- Whether your ads appeared above, below, or within the AI Overview
All AI Overview ad performance gets lumped into "Top Ads" in your reporting. That's it.
AI Max: The Attribution Shell Game
If you thought AI Overview reporting was bad, AI Max has an even bigger problem.
Google claims AI Max delivers 14-27% more conversions. But advertisers are finding that many of these "conversions" aren't incremental — they're just being claimed from existing keywords.
Here's what's happening: AI Max uses "inferred intent" matching. When a user types a partial query and clicks an autocomplete suggestion, AI Max can take attribution credit for that conversion even if your exact-match keyword would have captured it anyway.
The result: AI Max gets credit for conversions your existing campaigns were already generating.
Without manual de-duplication analysis (pulling search term reports and cross-referencing against your existing keyword coverage), you can't tell which conversions AI Max genuinely created vs. which it simply claimed from keywords you already had.
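That cross-referencing step can be automated. A minimal sketch, assuming you've exported your AI Max search terms report and your existing exact-match keyword list (the two-column row format and the normalization rule here are my assumptions, not Google's export schema):

```python
def normalize(term: str) -> str:
    """Lowercase and collapse whitespace so comparisons are consistent."""
    return " ".join(term.lower().split())

def dedupe_ai_max(search_terms, existing_keywords):
    """search_terms: list of (query, conversions) rows from an AI Max
    search terms report. existing_keywords: exact-match keywords you
    already run. Returns (claimed, incremental) conversion totals."""
    covered = {normalize(k) for k in existing_keywords}
    claimed = incremental = 0.0
    for query, conversions in search_terms:
        if normalize(query) in covered:
            claimed += conversions      # a keyword you already had matches this query
        else:
            incremental += conversions  # genuinely new query coverage
    return claimed, incremental

claimed, incremental = dedupe_ai_max(
    [("running shoes", 12), ("Running  Shoes", 3), ("trail shoe repair", 5)],
    ["running shoes", "buy sneakers"],
)
```

Real de-duplication also has to account for close variants and phrase match, so treat an exact-string pass like this as a floor on how many conversions AI Max is merely claiming, not a ceiling.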
What You Actually Can Measure
Not everything is unmeasurable. Here's what actually works:
Overall performance trends
Compare campaign performance before and after AI Overview rollout. If CTRs dropped 30% in July 2025 when AI Overviews peaked, that's signal.
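The before/after comparison is a one-liner worth standardizing so every campaign is measured the same way. A sketch (period boundaries and metric names are yours to choose):

```python
def ctr_change(before_clicks, before_impr, after_clicks, after_impr):
    """Relative CTR change across two periods (e.g. pre/post AI Overview
    rollout). Returns a fraction: -0.30 means a 30% drop."""
    before = before_clicks / before_impr
    after = after_clicks / after_impr
    return (after - before) / before

# 5,000 clicks on 100k impressions before; 3,500 on 100k after: a 30% drop
change = ctr_change(5000, 100_000, 3500, 100_000)
```

Run the same function over a control campaign that wasn't exposed to AI Overviews; if both dropped, the signal is seasonal, not AI-driven.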
Branded search correlation
Track whether branded search volume increases when your AI visibility improves. This is indirect but measurable.
AI referral traffic
Check Google Analytics referrers for chatgpt.com, perplexity.ai, bing.com/chat. This traffic is growing and trackable.
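If you're pulling raw referrer URLs (from GA exports or server logs), a small classifier keeps the labeling consistent. A sketch; the domain-to-label mapping is illustrative and you should extend it with whatever AI surfaces show up in your own data:

```python
from urllib.parse import urlparse

# Illustrative referrer patterns; extend as new AI surfaces appear
AI_SOURCES = {
    "chatgpt.com": "ChatGPT",
    "perplexity.ai": "Perplexity",
    "copilot.microsoft.com": "Copilot",
}

def classify_referrer(referrer_url: str) -> str:
    """Map a referrer URL to an AI source label, or 'other'."""
    parsed = urlparse(referrer_url)
    host = parsed.netloc.lower().removeprefix("www.")
    # bing.com/chat needs a path check, not just a domain match
    if host == "bing.com" and parsed.path.startswith("/chat"):
        return "Copilot"
    return AI_SOURCES.get(host, "other")
```

Note that some AI platforms strip or rewrite referrers, so this traffic is a lower bound on actual AI-driven visits.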
Incrementality testing
Google lowered the thresholds for conversion lift studies in 2025. You can now run incrementality tests at lower spend levels.
Third-party SERP tracking
Tools like GrowByData, Semrush, and others track AI Overview presence and ad placement. You can't get this from Google directly.
How to Build Your Measurement Stack
Since Google won't give you the data, you need to assemble it yourself.
Layer 1: Platform reporting (limited but necessary)
Use Google Ads reporting as your baseline, but don't trust it as a single source of truth. Look for:
- Top Ads performance (this is where AI Overview ads hide)
- Search impression share (to understand competitive pressure)
- Search terms reports (to audit AI Max keyword expansion)
- Conversion lag analysis (AI-influenced journeys may be longer)
Layer 2: Third-party attribution
Platforms like Northbeam, HockeyStack, or Wicked Reports provide cross-channel attribution that doesn't rely solely on Google's reporting.
Layer 3: Incrementality testing
This is the gold standard. Run conversion lift studies to measure whether your campaigns actually drive incremental conversions.
Layer 4: AI visibility tracking
Monitor your presence across AI platforms: citation rate, share of voice vs. competitors, sentiment in AI-generated answers.
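Citation rate and share of voice reduce to simple counting once you've sampled AI answers and extracted which brands each one cites. A sketch of the arithmetic, assuming the sampling and brand extraction happen upstream (both are nontrivial and tool-dependent):

```python
def citation_metrics(responses, brand, competitors):
    """responses: one set of cited brands per sampled AI answer.
    Returns (citation_rate, share_of_voice) for `brand`.
    citation_rate: fraction of answers citing the brand.
    share_of_voice: brand citations over all tracked-brand citations."""
    tracked = {brand} | set(competitors)
    cited = sum(1 for r in responses if brand in r)
    total_mentions = sum(len(r & tracked) for r in responses)
    citation_rate = cited / len(responses)
    share_of_voice = cited / total_mentions if total_mentions else 0.0
    return citation_rate, share_of_voice

cr, sov = citation_metrics(
    [{"acme", "rival"}, {"rival"}, {"acme"}],  # 3 sampled answers
    brand="acme",
    competitors=["rival"],
)
```

Because AI answers vary run to run, sample each query multiple times and track these numbers as a trend, not a point estimate.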
Layer 5: Marketing Mix Modeling
For larger budgets, MMM provides a statistical approach to measuring channel contribution without relying on click-based attribution.
Practical Recommendations
1. Stop expecting granular AI data (for now)
Google has stated reporting improvements are coming, but there's no timeline. Plan around current limitations rather than waiting.
2. Run incrementality tests quarterly
This is the only way to validate whether your campaigns actually drive incremental conversions.
3. Track correlations, not causation
You can't prove that AI visibility caused a conversion. But you can track whether AI visibility correlates with business outcomes.
4. Budget for uncertainty
When you can't precisely measure ROI, you need to manage risk differently. Set aside 10-15% of budget for experimental placements.
The bottom line: you can advertise in AI search; you just can't fully measure it yet. The platforms aren't going to fix this for you, at least not soon, so build the best measurement stack you can with the tools available.