As you’re reading this, the traditional way of doing SEO is slowly going away, being replaced by an entirely new form of SEO.
This shift is inevitable; it’s a matter of time.
In this article, I’ll attempt to explore what that shift is and what the future of SEO might look like.
Buckle up, it’s about to be a long read.

Here’s the main thesis:
– Search engine results pages (SERPs) are losing their efficacy as reliable information sources for many reasons, but one not discussed enough (especially in the SEO community) is the growing contamination by AI-generated content that Google is not equipped to handle. Simultaneously, AI systems are growing in sophistication and are emerging as a new and better way to access digital information thanks to a supreme user experience. All of this points to a fundamental shift in what SEO is and how we'll perform it in the future.
SEO 2.0 will represent a new approach to content and website optimization that focuses on making information accessible to AI systems rather than traditional search engines and optimizing for human readers.
TL;DR
In this article, I explore the impending shift in SEO from optimizing for human readers to focusing on making information accessible to AI systems. As search engine results pages (SERPs) become increasingly contaminated with AI-generated content, users are turning to AI systems for a superior user experience. SEO 2.0 represents a new approach to content and website optimization that prioritizes structured data, machine parseability, and data accessibility for AI systems.
This shift is driven by user frustration with the declining quality of SERPs, the exponential growth of AI-generated content, and Google’s struggle to filter low-quality content. The article discusses the challenges in detecting AI-generated content, Google’s lagging algorithmic countermeasures, and the cat-and-mouse game between AI content and Google.
The future of SEO is agent-centric information access, where specialized “knowledge agents” delivering tailored responses will become the primary information interface. Websites are evolving from destination pages designed for human visitors to structured data repositories for AI systems.
This article explores the practical implications of this shift and outlines the framework of SEO 2.0. By understanding this new approach, readers can prepare their websites and content strategies for the future of SEO.
SERPs Are Losing Efficacy and Becoming Contaminated with Garbage Content
User Frustration with SERP Quality
Users are increasingly frustrated with the declining quality of SERPs. Searching for high-quality information online nowadays feels like going through a labyrinth of advertisements, thin affiliate content, and AI-generated material.
Click-through rates for informational queries have plummeted by 71% in just the last year as users find themselves unable to trust the results they're seeing.
One telling indicator is the habit of appending terms like “Reddit” to search queries, an attempt to find authentic, human-written content rather than accepting standard search results that consistently disappoint.
This behavior reflects a lack of trust in traditional search, showcasing how users are creating their own filters to work around a system that no longer provides them with what they need.
AI-Generated Content Flooding the Web
As the frustration with SERP quality grows, another shift is silently happening.
We’re already seeing more and more content being produced with AI, and that is only going to accelerate.
LLMs have made content creation so efficient and affordable that websites can now produce thousands of articles at minimal cost. Over half of all marketers now regularly use AI tools to generate long-form content, prioritizing production volume over depth and originality.
This exponential growth will alter the content ecosystem and, most importantly, the SERPs. The web was already struggling with poor-quality content saturation before AI; now that problem is only going to get worse.
Google’s Struggle to Filter Low-Quality Content
Despite implementing increasingly sophisticated algorithms, Google is fighting a losing battle against the flood of AI-generated and SEO-optimized content.

Google’s recent algorithm updates have specifically targeted websites reliant on AI-generated content, particularly penalizing those failing to demonstrate the famous E-E-A-T (experience, expertise, authoritativeness, and trustworthiness).
They’ve addressed “scaled content abuse” (mass-producing low-effort content), “site reputation abuse” (leveraging trusted domains to host low-quality content), and the repurposing of expired domains as content farms.
Yet, despite these efforts, low-quality content continues to dominate many search results, as even Google's own AI tools struggle to consistently differentiate between high-quality and low-quality sources, sometimes citing content farms alongside more credible references.
The Cat-and-Mouse Game Between AI Content and Google
AI Content Detection Challenges
Detecting AI-generated content presents a technical challenge that grows with each advancement in language models.
Current detection tools produce high rates of false positives (incorrectly flagging human-written content as AI-generated) and false negatives (missing sophisticated AI outputs).
Even essays written by humans before the advent of LLMs are sometimes misclassified as AI-generated content.
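To see why this is structurally hard, here is a minimal, synthetic sketch (the scores below are invented for illustration, not real detector output): because human-written and AI-generated texts overlap on any single "AI-likelihood" score, every detection threshold trades false positives against false negatives.

```python
# Synthetic illustration of the detection trade-off.
# These scores are made up; real detectors output something analogous.
human_scores = [0.12, 0.31, 0.45, 0.58, 0.72]  # detector score for human-written texts
ai_scores = [0.38, 0.55, 0.69, 0.83, 0.91]     # detector score for AI-generated texts

def error_rates(threshold: float) -> tuple[float, float]:
    """Return (false positive rate, false negative rate) at a given threshold."""
    fp = sum(s >= threshold for s in human_scores) / len(human_scores)
    fn = sum(s < threshold for s in ai_scores) / len(ai_scores)
    return fp, fn

for t in (0.4, 0.6, 0.8):
    fp, fn = error_rates(t)
    print(f"threshold={t:.1f}  false positives={fp:.0%}  false negatives={fn:.0%}")
```

Lowering the threshold catches more AI text but flags more humans; raising it does the opposite. No threshold eliminates both error types as long as the two score distributions overlap, and they increasingly do.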
Algorithm Updates Lag Behind
Google’s algorithmic countermeasures have also been lagging behind the latest SEO tactics. AI-generated content brings this perpetual game of catch-up to another level.
The pattern has been consistent so far: Google identifies a specific manipulation tactic, develops countermeasures, and implements them through an algorithm update.
However, as LLMs’ capabilities to create high-quality content continue to progress, the line between what constitutes “human” or “AI-generated” content will blur until we reach a point where there’s no real difference.
LLMs are already capable of producing content that is almost indistinguishable from human content.
AI Agents and LLMs Becoming the Primary Information Access Point
User Behavior Shift
As traditional search quality declines, users are pivoting to alternative information sources.
Traditional search volume is projected to decline by 25% over the next year as users increasingly turn to AI systems like ChatGPT, Claude, or Google Gemini. AI-powered features are already gaining prominence within traditional search interfaces. AI Overviews now appear in nearly 40% of Google searches, up from just 25% less than a year ago.
Sites that manage to be cited by AI systems receive substantial traffic, while those that fail to be included in AI-generated summaries face existential threats to their business models.
This trend is accelerating as users discover the convenience and efficiency of AI-mediated information access.
Supreme User Experience
Traditional search engines were designed around keyword and topic matching. They excel at finding pages that match the searcher's keywords but often fail to understand the nuanced intent behind complex queries.
AI agents, by contrast, are built to comprehend natural language and contextual nuance, making them significantly more effective for conversational information seeking.
Users can now ask complex questions that include multiple parameters, contextual elements, and nuanced considerations.
For example, an AI agent can effectively handle a query like "What's better for a family with kids under three and a dog, Bryce Canyon or Arches National Park?", a question that would typically produce disappointing results in a traditional search.
This conversational approach aligns more naturally with how humans think and communicate.
If there's one thing we've learned from the last few decades of technological advancement, it's that products providing a supreme user experience always win.
In our case, the difference in user experience between asking an AI system and running a Google search, then digging through the SERPs, couldn't be starker.
SERPs are going away and are bound to be replaced by AI systems; it’s only a matter of time.
LLMs as Content Curators
There’s an additional layer to this discussion.
Unlike SERPs, which simply provide links to potentially relevant pages or AI overviews that answer informational queries, AI agents can synthesize information from multiple sources, offering direct answers and comprehensive summaries in a personalized way.
This capability directly addresses two major pain points of traditional search: the need to manually navigate numerous pages and piece information together, and the lack of personalization.
AI systems excel at processing and distilling complex information. They can take a lengthy research paper and generate a concise summary of its key findings, or synthesize information from multiple articles into a holistic answer to a complex question, and they can do all of this in a personalized way.
It's no longer about what everyone is searching for, but what YOU are searching for and how that information serves YOU.
This ability to synthesize information is already transforming how users interact with online content, offering a level of convenience that traditional search can't match.
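To make the curation step concrete, here is a minimal sketch of the loop such an agent runs: fetch a handful of sources, then ask a model to synthesize one grounded answer. The URLs are placeholders and `call_llm` is a hypothetical stand-in for whatever model API you use; this illustrates the pattern, not a production implementation.

```python
# Sketch of the "content curator" loop: gather sources, then synthesize one answer.
# `call_llm` is a placeholder; wire it to your model provider of choice.
import requests

def call_llm(prompt: str) -> str:
    """Hypothetical stand-in for a real LLM API call."""
    raise NotImplementedError("Connect this to an actual model provider.")

def synthesize_answer(question: str, urls: list[str]) -> str:
    # 1. Pull the raw content the agent will reason over.
    documents = [requests.get(url, timeout=10).text for url in urls]
    # 2. Ask the model for a single answer grounded in those sources.
    prompt = (
        f"Question: {question}\n\n"
        + "\n\n".join(f"Source {i + 1}:\n{doc[:4000]}" for i, doc in enumerate(documents))
        + "\n\nAnswer using only the sources above and note which source supports each claim."
    )
    return call_llm(prompt)

# Example (placeholder URLs):
# synthesize_answer(
#     "What's better for a family with kids under three and a dog, Bryce Canyon or Arches?",
#     ["https://example.com/bryce-guide", "https://example.com/arches-guide"],
# )
```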

The future points toward an “agent-centric information access” model where specialized “knowledge agents” delivering tailored responses will become the primary information interface.
This vision of millions of personalized AI agents evolving with user needs represents a departure from static SERP rankings and a reinvention of how we access information online.
SEO 2.0: Optimizing for AI Systems
Websites as Data Repositories
As AI agents increasingly bypass SERPs to extract information directly from websites, the fundamental role of websites starts to evolve.
Rather than destination pages designed for human visitors, websites are slowly transforming into structured data repositories for AI systems to extract information and serve users on different platforms.
The paper "Agent-centric information access," linked in the notes below, explores the same idea: websites may remain the place where information is stored, but it's not far-fetched to assume that the habit of actually visiting them will eventually disappear completely.
After all, 45 years of UX/UI practice have not solved the fundamental problem of "Can I find the information I need in under 30 seconds?" AI systems are already solving it.
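One concrete way a website already acts as a structured data repository is schema.org JSON-LD embedded in the page. Here is a minimal sketch (the article details are invented placeholders):

```python
# Minimal sketch: expose page facts as schema.org JSON-LD so machines can
# extract them without parsing the human-facing layout.
# All values below are invented placeholders.
import json

article_jsonld = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Choosing a national park for a family trip",
    "author": {"@type": "Person", "name": "Jane Doe"},
    "datePublished": "2025-01-15",
    "about": ["family travel", "national parks"],
}

# This string would be embedded in the page inside a
# <script type="application/ld+json"> tag.
print(json.dumps(article_jsonld, indent=2))
```

The same markup that powers today's rich results is exactly the kind of machine-parseable layer an AI system can extract without scraping prose.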
SEO 1.0 vs. SEO 2.0
This is how SEO fundamentals are evolving:
| Dimension | SEO 1.0 (Traditional) | SEO 2.0 (AI-Centric) |
|---|---|---|
| Primary Goal | Rank highly in SERPs | Be cited by AI systems |
| Key Metrics | Rankings, CTR, traffic | Extraction rate, citation frequency |
| Content Focus | Keywords, human readability | Structured data, machine parseability |
| Technical Priority | Crawlability, indexability | Data accessibility, fast processing |
| Authority Signals | Backlinks, domain age | Expertise signals, cross-platform presence |
| User Journey | Click → Read → Return to SERP | Ask → Receive direct answer |
| Platform Focus | Google-centric | Multi-platform, agent ecosystem |
| Content Volume | More content = more rankings | Focused expertise > quantity |
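The "machine parseability" and "data accessibility" rows can already be audited today. Here is a minimal sketch (the URL is a placeholder, and it assumes the `requests` and `beautifulsoup4` packages) that checks whether a page exposes JSON-LD blocks an AI system could extract directly:

```python
# Minimal audit sketch: does a page expose JSON-LD blocks a machine can extract?
# The URL is a placeholder; requires the requests and beautifulsoup4 packages.
import json

import requests
from bs4 import BeautifulSoup

def extract_jsonld(url: str) -> list:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    blocks = []
    for tag in soup.find_all("script", type="application/ld+json"):
        try:
            blocks.append(json.loads(tag.string or ""))
        except json.JSONDecodeError:
            pass  # skip malformed blocks instead of failing the whole audit
    return blocks

if __name__ == "__main__":
    found = extract_jsonld("https://example.com/")
    print(f"JSON-LD blocks found: {len(found)}")
```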
The Path Forward: Adaptation
The evidence is clear: we’re witnessing the decline of traditional search as the primary gateway to online information.
The combination of deteriorating SERP quality and the rise of more capable AI systems marks a fundamental shift in how information is discovered, accessed, and consumed.
The shift is already happening, but we likely won't reach the point of full SEO 2.0 for another 2-3 years, maybe even 10.
As you're reading this, we're at a convergence point we could call SEO 1.5: we have to prioritize traditional SEO, since that is still where most searches happen, while also preparing ourselves for the world of tomorrow, one in which our primary users will be AI systems and not humans.
Notes:
https://www.researchgate.net/publication/379208332_Is_Google_Getting_Worse_A_Longitudinal_Investigation_of_SEO_Spam_in_Search_Engines
https://www.alixpartners.com/insights/102jze5/the-future-of-search-ai-driven-disruption-and-diversification/
https://dl.acm.org/doi/10.1007/978-3-031-56063-7_4
https://lawlibguides.sandiego.edu/c.php?g=1443311&p=10721367
https://seo.ai/blog/are-google-search-results-getting-worse-study
https://arxiv.org/html/2502.19298v1
https://theacsi.org/wp-content/uploads/2024/07/24jul_search-social-media_study.pdf