⚡ TL;DR -- 6 Things We Found
1. 73% of AI citations come from pages updated in the last 90 days. Stale content gets dropped.
2. Pages with FAQ schema are 3.2x more likely to be cited than pages without structured Q&A.
3. Perplexity cites 4x more sources per response than ChatGPT -- making it the easiest platform to win.
4. 82% of brand mentions in AI come from third-party sources, not the brand's own website.
5. AI visibility scores fluctuate 15-25% week-over-week -- one-time measurement is meaningless.
6. The top 10% of brands capture 68% of all AI mentions in their category.
Fresh Content Wins. Everything Else Decays.
The single biggest predictor of whether AI will cite a page is how recently it was updated. Pages that haven't been touched in 90+ days see a steep decline in AI citations -- regardless of their domain authority or backlink profile.
[Chart: Citation rate by content freshness]
This doesn't mean you need to rewrite every page monthly. Our data shows that even small updates -- adding a new statistic, updating a date, refreshing a paragraph -- are enough to signal freshness. AI models appear to weight the last-modified signal heavily.
What to do: Audit your top 20 pages. If any haven't been updated in 90+ days, refresh them -- even minor updates count. Add a "Last updated" date visible on the page.
Structure Beats Authority -- FAQ Schema Is the Single Biggest Lever
We expected domain authority to be the top signal. It wasn't. Content structure -- specifically FAQ schema, clear heading hierarchies, and quotable definitions -- had a stronger correlation with AI citations than backlink count.
[Chart: Citation likelihood multiplier by page feature]
Why? AI models need to extract a clean, quotable answer. FAQ pages hand them exactly that -- a question and a self-contained answer. It's the lowest-friction content for an AI to cite with confidence.
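Concretely, "FAQ schema" means schema.org `FAQPage` markup embedded as JSON-LD in the page. A small helper like the one below can generate it from a page's Q&A pairs -- the `@context`/`@type`/`mainEntity` structure is standard schema.org vocabulary; the function name and generating it in Python are just this sketch's choices:

```python
import json

def faq_jsonld(qa_pairs: list[tuple[str, str]]) -> str:
    """Build schema.org FAQPage JSON-LD for a <script type="application/ld+json"> tag."""
    markup = {
        "@context": "https://schema.org",
        "@type": "FAQPage",
        "mainEntity": [
            {
                "@type": "Question",
                "name": question,
                "acceptedAnswer": {"@type": "Answer", "text": answer},
            }
            for question, answer in qa_pairs
        ],
    }
    return json.dumps(markup, indent=2)
```

Each `Question`/`acceptedAnswer` pair is exactly the self-contained unit described above: a question and a quotable answer.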
Want to see where your brand stands?
Track your AI visibility across ChatGPT, Perplexity, Claude, and Google AI.
Not All AI Engines Are Equal -- Platform Differences Are Massive
One of the most surprising findings: the same page can be cited by Perplexity and completely ignored by ChatGPT. Each platform has different retrieval patterns, citation rates, and source preferences.
[Chart: Average citations per AI response]
- Perplexity: Cites the most sources. Prefers editorial content, Reddit, and review sites. Easiest platform to win citations on. Refreshes results frequently.
- ChatGPT: Fewest citations but largest audience. Prefers authoritative corporate pages and well-known domains. Hardest to break into. GPT-5.3/5.4 reduced citations further.
- Google AI Overviews: Heavily weighted toward pages already ranking in top 10 organic results. If you rank on Google, you'll likely appear in AI Overviews. If you don't, you won't.
- Claude: Moderate citation rate. Shows a strong preference for technical documentation, academic sources, and long-form content with clear expertise signals.
82% of Brand Mentions Come from Third-Party Sources
Most brands assume that optimizing their own website is enough. Our data shows the opposite: the vast majority of AI brand mentions come from pages you don't control.
[Chart: Top source types for brand mentions]
This has major implications. Brands need a dual strategy: optimize their own pages for citability and build presence on the third-party sources AI engines trust. A strong G2 profile, active Reddit presence, and PR coverage in editorial sites directly impact AI visibility.
AI Visibility Is Volatile -- Weekly Swings of 15-25% Are Normal
Unlike Google rankings, which shift gradually, AI visibility can swing dramatically from week to week. A brand might be cited in 80% of relevant queries one week and 55% the next -- without any change to its content.
Why this happens: AI models don't always retrieve the same sources. Query interpretation varies slightly between runs, model updates change retrieval behavior, and the competitive landscape shifts as new content gets published. The result is inherent volatility that makes point-in-time measurement unreliable.
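The practical response is to smooth your tracking: a trailing average over a few weeks of visibility scores is far more stable than any single reading. A minimal sketch (the 4-week window is an arbitrary choice for illustration, not something the data prescribes):

```python
from collections import deque

def rolling_mean(scores: list[float], window: int = 4) -> list[float]:
    """Trailing mean over the last `window` weekly visibility scores."""
    recent: deque[float] = deque(maxlen=window)  # drops the oldest score automatically
    smoothed = []
    for score in scores:
        recent.append(score)
        smoothed.append(sum(recent) / len(recent))
    return smoothed
```

For the swing described above -- say weekly scores of 80, 55, 70, 62 -- the trailing mean settles near 67, showing the underlying position while damping the week-to-week noise.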
Winner Takes Most -- The Top 10% Capture 68% of All Mentions
AI search has a severe concentration problem. In every industry we analyzed, a small number of brands dominate the AI conversation while the rest are essentially invisible.
[Chart: Share of AI mentions by brand tier]
This mirrors what happened with SEO in its early days -- the brands that invested first captured dominant positions that became increasingly difficult to challenge. The same dynamic is playing out in AI search, but faster. With AI models updating weekly, the window to establish a strong position is narrowing.
Methodology
This report is based on analysis of 50,000+ AI-generated responses collected between January and March 2026 across five platforms: ChatGPT (GPT-5), Perplexity, Claude, Google AI Overviews, and Gemini. Responses were collected from 1,200+ brands across 18 industries using standardized prompts designed to trigger product recommendations, comparisons, and informational answers.
Brand mentions were extracted using Geonimo's NLP pipeline with fuzzy matching. Citation sources were classified by domain type (Corporate, Editorial, UGC, Review). Content freshness was determined by crawling cited pages and checking last-modified headers and visible date signals.
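Geonimo's extraction pipeline is proprietary, but the general shape of fuzzy brand matching can be illustrated with Python's standard `difflib`. The 0.85 similarity threshold and the naive whitespace tokenization are simplifications of this sketch, not the report's actual method:

```python
from difflib import SequenceMatcher

def mentions_brand(text: str, brand: str, threshold: float = 0.85) -> bool:
    """Scan token windows of the text for near-matches of the brand name.

    Catches misspellings like "Geonimmo" for "Geonimo". A real pipeline
    would also strip punctuation and handle casing/diacritics more carefully.
    """
    brand = brand.lower()
    tokens = text.lower().split()
    n = max(1, len(brand.split()))  # window size in tokens
    for i in range(len(tokens) - n + 1):
        window = " ".join(tokens[i : i + n])
        if SequenceMatcher(None, window, brand).ratio() >= threshold:
            return True
    return False
```

`SequenceMatcher.ratio()` returns a 0-1 similarity score, so near-misses in user-generated content still count as mentions while unrelated words do not.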
Visibility scores, platform breakdowns, and citation rates referenced in this report are aggregated across all tracked brands. Individual brand results will vary. All data was collected and processed using the Geonimo platform.

