What Actually Changed in Google (And What Stayed the Same)
Google has not replaced traditional search. It has added a layer on top of it. The blue links still exist. Organic rankings still matter. The technical and authority foundations that determine organic performance are still the prerequisites for everything above them. The businesses panicking about SEO being “dead” are misreading what happened. The businesses ignoring what happened are becoming invisible in a growing proportion of queries.
What changed is this: Google now intercepts a growing share of search queries before the user sees any links. AI Overviews appear above organic results for informational queries. Google AI Mode generates synthesised answers that draw from multiple sources without the user needing to click anywhere. If your content is not selected for those answers, you are not present in that moment — regardless of your ranking position. You have not lost a ranking. You have been bypassed.
When Elizabeth Reid, Google’s Head of Search, introduced AI Mode at Google I/O 2025, she described the mechanism directly: “AI Mode isn’t just giving you information — it’s bringing a whole new level of intelligence to search.” The specific technique she named — query fan-out — is the mechanism that determines which content gets selected. One search becomes 5 to 11 simultaneous sub-queries. Results from all of them are scored and merged. Content that appears consistently across multiple sub-query results gets cited; content relevant to only one sub-query, however well it ranks for it, earns fewer citations.
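The fan-out-and-merge logic can be illustrated with a toy sketch. This is not Google’s actual scoring algorithm; the sub-queries and site names below are hypothetical. The idea is simply that a page’s retrieval score is the number of sub-queries whose results it appears in:

```python
from collections import defaultdict

def fan_out_score(subquery_results: dict[str, list[str]]) -> dict[str, int]:
    """Count how many sub-queries each page appears in."""
    scores: dict[str, int] = defaultdict(int)
    for pages in subquery_results.values():
        for page in set(pages):  # count each page once per sub-query
            scores[page] += 1
    return dict(scores)

# Hypothetical sub-queries fanned out from one user search
results = {
    "crm pricing comparison":      ["siteA", "siteB"],
    "crm features small business": ["siteA", "siteC"],
    "crm reviews 2026":            ["siteA", "siteB"],
}

scores = fan_out_score(results)
# siteA appears in all three sub-queries; siteC in only one.
best = max(scores, key=scores.get)
```

Under this toy model, siteA is cited because it surfaces for every sub-query, while siteC, despite ranking for one of them, barely registers in the merged answer.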
The Three Things That Still Work
Before addressing what is new, it is worth being precise about what has not changed — because much of the noise around AI search implies that traditional SEO is obsolete. It is not. The following still work exactly as they always have and remain the prerequisites for any AI visibility at all.
Technical SEO. If your pages cannot be crawled and indexed, they cannot be retrieved. Crawlability, indexation, Core Web Vitals, mobile performance, structured data — these are entry requirements for every form of visibility, traditional and AI. A page that fails to index will not appear in organic results and will not be retrieved for AI answers. Technical health is the ground floor. The AI floors above it are inaccessible without it.
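Of the entry requirements listed above, structured data is the most concrete: it usually means schema.org JSON-LD embedded in the page. A minimal sketch, with placeholder values throughout, of the kind of Organization markup crawlers parse:

```python
import json

# Minimal schema.org JSON-LD of the kind served inside a
# <script type="application/ld+json"> tag. Every value is a placeholder.
org = {
    "@context": "https://schema.org",
    "@type": "Organization",
    "name": "Example Ltd",            # placeholder entity name
    "url": "https://example.com",     # placeholder canonical URL
    "sameAs": [                       # corroborating external profiles
        "https://www.linkedin.com/company/example",
    ],
}

snippet = json.dumps(org, indent=2)
```

The `sameAs` links matter beyond traditional SEO: they are one of the corroboration signals entity-verification relies on at the upper floors.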
Authority signals. Backlinks, domain authority, brand signals, and entity corroboration still determine whether AI systems trust your content enough to cite it. A technically perfect page on a low-authority domain is less likely to be selected by AI systems than a well-structured page on an established, corroborated entity. The trust layer has not been replaced by AI search — it has become more consequential, because AI systems performing selection are more risk-averse than traditional ranking algorithms.
Relevant, comprehensive content. AI systems still reward content that fully covers a topic. The principle that your content should be the most helpful resource on its subject has not changed. What has changed is how “helpful” is evaluated — by machines extracting fragments, not humans reading pages.
The Building Your Buyers Are Exploring
Think of search visibility as a building under construction. The ground floor has been standing for twenty-five years: traditional Google, established rules, a known inspection regime. Most businesses that invest in SEO have some presence here — some solidly, others precariously.
The first floor opened recently. Google AI Overviews and AI Mode are built directly on the ground floor’s index — the same foundations, different logic applied above them. If your site is well-indexed with solid authority, you may already appear on this floor as a byproduct of your existing SEO work. But appearance here is not automatic, and as AI Mode intercepts more queries, the cost of being absent rises.
The second floor is where standalone LLMs — ChatGPT, Perplexity, Claude — operate. Access to this floor requires additional work: entity corroboration across external sources, structured content that passes the extractability tests AI systems apply, and a citation footprint across the third-party sources those platforms retrieve from. The second floor has a temporary ladder — the rules are still forming — but buyers are already up there asking questions and receiving recommendations.
The question “how do I rank in Google?” is a ground-floor question. The right question in 2026 is: “how do I ensure my content is accessible, trusted, and structured across all three floors?” The floors are built in order. You cannot skip the ground floor to reach the second. But you cannot ignore the upper floors and call the building complete.
The Third Requirement: Being Selected
This is the requirement that has emerged since 2024 and that almost no website has systematically addressed. Selection is different from ranking. Ranking determines your position in a list. Selection determines whether AI systems extract your content and include it in a generated answer.
Content can rank without being selected — if it is written for human readers scanning linearly, with long introductions, buried answers, and sections that depend on surrounding context to make sense. AI systems do not read linearly. They retrieve individual passages and evaluate them in isolation: does this passage answer the sub-query? Does it contain a specific, verifiable data point? Does it name the entity making the claim? Can it be cited as a standalone answer without the surrounding page?
A page that passes these tests across every section gets extracted across multiple fan-out sub-queries and accumulates a higher retrieval score. A page that fails these tests — regardless of how well it ranks — contributes little to AI-generated answers. Google’s guidance since February 2023 has stated clearly: “We reward high-quality content, however it is produced.” CITATE defines what high quality looks like when a machine evaluates it — the passage-level criteria that determine whether content gets extracted and cited.
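The passage-level tests can be made concrete with a naive heuristic sketch. These are assumed, simplified checks, not a real retrieval scorer: does the passage contain a specific figure, name the claiming entity, and stand alone rather than opening on a dangling pronoun?

```python
import re

def passage_checks(passage: str, entity: str) -> dict[str, bool]:
    """Crude stand-ins for the three extractability tests above."""
    return {
        # contains at least one specific, verifiable number
        "has_data_point": bool(re.search(r"\d", passage)),
        # names the entity making the claim
        "names_entity": entity.lower() in passage.lower(),
        # does not open with a pronoun that points outside the passage
        "standalone": not re.match(r"(?i)\s*(this|it|they)\b", passage),
    }

good = "Acme reduced checkout latency by 38% in 2025."
bad = "This made things much faster."

good_result = passage_checks(good, "Acme")  # passes every check
bad_result = passage_checks(bad, "Acme")    # fails every check
```

The second passage may read fine in context on the page, which is exactly the point: retrieval evaluates it in isolation, where it carries no entity, no figure, and no self-contained meaning.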
Where to Start
The starting point is a diagnosis, not a prescription. The right actions depend entirely on which floor your site is currently on and which requirements are unmet. A site with weak technical foundations needs ground floor work first — AI optimisation applied to an unindexed page accomplishes nothing. A site with strong organic performance but zero AI citation presence needs a different intervention entirely.
For most businesses in 2026, the honest diagnostic is: strong ground floor (they have invested in SEO), partial first floor (AI Overviews sometimes include them by accident), and no second floor presence (standalone LLMs cannot verify their entity and do not recommend them). The gap is in the middle layer — content extractability and entity corroboration — not in the technical or authority foundations that most SEO investment has targeted.
The AI Visibility Action Plan sequences the full diagnostic by business type and maps the specific actions at each layer. The AI Discovery Stack shows where content selection sits relative to the other layers. The AI Visibility Audit applies the diagnostic to your specific site.