Google's AI Mode appears to rely on a separate retrieval layer rather than the live web in the way traditional Search does, according to experiments by Dan Petrovic of DEJAN that have drawn fresh attention in the SEO community. Petrovic said his tests suggest AI Mode is drawing on a proprietary content store, which helps explain why a page can be visible in Google Search yet still be missing, stale, or inaccessible in AI Mode. The issue matters because AI Mode is no longer a side experiment: it has become a major search surface, and marketers have little visibility into how its underlying content source is refreshed.
Petrovic's first test involved deleting a page and checking whether AI Mode would still retrieve it. It returned a 404. He then restored the page, but AI Mode continued to behave as if the page did not exist, even though classic Search still showed it as indexed and ranking. A second test was even more telling: Petrovic created a page containing a hidden instruction that would prompt any AI visitor to return a specific phrase. Gemini, the standalone product, responded as expected, but AI Mode did not appear to access the page at all. On that basis, Petrovic argued that AI Mode is not reading content from the live web at answer time.
That interpretation has been given added weight by court filings in Google's antitrust case, which surfaced details about a proprietary system called FastSearch. Reporting by Search Engine Land said the filings describe FastSearch as a faster but lower-quality way of grounding Gemini and AI Overviews, using RankEmbed signals to generate abbreviated, ranked results from a smaller document set. Unlike classic Search, which uses a broader mix of signals, FastSearch is designed for speed and semantic relevance. The filings also indicate that Google limits what third parties can see of the system, making independent testing difficult.
The distinction between a live fetch and a served layer is not merely technical. In practical terms, it means a publisher can update or remove a page on the open web while AI Mode continues to surface old or incomplete information. That creates obvious problems for product pages, compliance-sensitive content, and time-critical announcements. It also means that being indexed in Google Search may no longer be enough to ensure visibility in AI responses, even when the page is live and eligible for ranking.
The debate widened after Chris Long of Nectiv reposted Petrovic's findings on LinkedIn, prompting comments from SEO practitioners and LLM specialists. Some argued the evidence points to a distinct serving layer with its own freshness and selection rules rather than a completely separate index. Others said the practical result is the same: AI Mode can lag behind the public web for days or weeks. One commenter warned that removed content could still be served in contexts where it should no longer appear, while another said the gap between Search and AI Mode shows that visibility in one does not guarantee visibility in the other.
For publishers and SEO teams, the lesson is that AI search requires a different playbook. Traditional indexing remains necessary, but it may not be sufficient for AI Mode inclusion. Google has already added AI Mode to its robots meta tag documentation, giving site owners some control through nosnippet directives, yet that affects serving rather than the underlying content pipeline. As AI-driven search becomes more central to discovery, the industry may need to treat freshness, passage selection, and content durability as separate optimisation problems, not just an extension of classic ranking.
Source Reference Map
Inspired by headline at: [1]
Sources by paragraph:
- Paragraph 1: [2], [4]
- Paragraph 2: [2], [4]
- Paragraph 3: [3], [6], [7]
- Paragraph 4: [2], [4]
- Paragraph 5: [1], [2], [5]
- Paragraph 6: [1], [3], [5]
Source: Noah Wire Services