As you’re reading this, the traditional way of doing SEO is slowly dying, replaced by an entirely new form of SEO. The shift is inevitable; it’s only a matter of time.
In this article, I’ll attempt to explore what is happening and what the future of SEO might look like.
Buckle up, it’s about to be a long read.

Here’s the main thesis:
– Search engine results pages (SERPs) are losing their efficacy as reliable information sources for a variety of reasons. One that isn’t discussed enough (especially in SEO circles) is the incoming contamination by AI-generated content, which Google is not equipped to handle. Simultaneously, AI systems are growing in sophistication and emerging as a new, better way to access digital information thanks to a superior user experience. Together, these trends point to a fundamental shift in what SEO is and how we’ll practice it in the future.
SEO 2.0 will represent a new approach to content and website optimization, one that focuses on making information accessible to AI systems rather than on ranking in traditional search engines and optimizing for human readers.
TL;DR
I’m exploring the shift from traditional SEO to SEO 2.0, where the focus is on optimizing for AI systems rather than humans. The decline in the quality of search engine results pages (SERPs) and the increasing use of AI-generated content are driving this change.
In the future, AI systems will become the primary information access point, offering a supreme user experience and personalized responses. Websites are evolving into structured data repositories for AI systems to extract information, and the traditional SEO practices of ranking highly in SERPs and focusing on human readability will become less important.
SEO 2.0 prioritizes data accessibility, fast processing, and structured data over rankings, CTR, and traffic. As we move towards this new approach, it’s crucial to adapt and prepare for the AI-centric information access model of the future.
SERPs Are Losing Efficacy
User Frustration with SERP Quality
Users are increasingly frustrated with the declining quality of SERPs. Searching for high-quality information online nowadays feels like going through a labyrinth of advertisements, thin affiliate content, and AI-generated material.
Click-through rates for informational queries have reportedly plummeted by 71% in just the last year, as users find themselves unable to trust the results they’re seeing.
One telling indicator is the habit of appending terms like “Reddit” to search queries, an attempt to find authentic, human-written content rather than accepting standard search results that disappoint.
This behavior reflects a lack of trust in traditional search, showcasing how users are creating their own filters to work around a system that no longer provides them with what they need.
AI-Generated Content Flooding the Web
As the frustration with SERP quality grows, there’s another shift. AI has lowered the barriers to content production. LLMs have made content creation so easy and affordable that websites can now produce thousands of articles at minimal cost.
This exponential growth will alter the content ecosystem and, most importantly, the SERPs. The web was already struggling with poor-quality content saturation before AI; now that problem is only going to get worse.
Google’s Struggle to Filter Low-Quality Content
Despite implementing increasingly sophisticated algorithms, Google has been slowly losing the battle against low-quality SEO-optimized content flooding the SERPs.

Low-quality content continues to dominate many search results as even Google’s own tools struggle to consistently differentiate between high-quality and low-quality sources, sometimes citing content farms alongside more credible references.
The Cat-and-Mouse Game Between AI Content and Google
AI Content Detection Challenges
Detecting AI-generated content presents a technical challenge that grows with each advancement in language models.
Current detection tools produce high rates of false positives (incorrectly flagging human-written content as AI-generated) and false negatives (missing sophisticated AI outputs).
Even essays written by humans before the advent of LLMs are sometimes misclassified as AI-generated content.
Algorithm Updates Lag Behind
Google’s algorithmic countermeasures have long lagged behind the latest SEO tactics, and AI-generated content takes this perpetual game of catch-up to another level.
The pattern has been consistent to date: Google identifies a specific manipulation tactic, develops countermeasures, and rolls them out in an algorithm update.
However, as LLMs’ capabilities to create high-quality content continue to progress, the line between what constitutes “human” or “AI-generated” content will blur until we reach a point where there’s no real difference.
LLMs are already capable of producing content that is almost indistinguishable from human content.
AI Agents and LLMs Becoming the Primary Information Access Point
User Behavior Shift
As traditional search quality declines, users are pivoting to alternative information sources.
Traditional search volume is projected to decline by 25% over the next year as users increasingly turn to AI systems like ChatGPT, Claude, or Google Gemini. AI-powered features are already gaining prominence within traditional search interfaces. AI Overviews now appear in nearly 40% of Google searches, up from just 25% less than a year ago.
Sites that manage to be cited by AI systems receive substantial traffic, while those that fail to be included in AI-generated summaries face existential threats to their business models.
This trend is accelerating as users discover the convenience and efficiency of AI-mediated information access.
Supreme User Experience
Traditional search engines were designed around keywords and topical matching. They excel at finding pages that match the query of the searcher, but often fail to understand the nuanced intent behind more complex questions and user needs.
AI agents, by contrast, are built to comprehend natural language and contextual nuance, making them significantly more effective sources for information.
If there’s one thing we’ve learned from the past few decades of technological advancement, it’s that products offering a superior user experience tend to win.
In our case, the gap between the experience of asking an AI system and that of running a Google search and digging through the SERPs couldn’t be bigger.
LLMs as Content Curators
There’s an additional layer to this discussion.
Unlike SERPs, which simply provide links to potentially relevant pages or AI overviews that answer informational queries, AI agents can synthesize information from multiple sources, offering direct answers and comprehensive summaries in a personalized way.
This capability directly addresses a major pain point associated with traditional search: the need to manually navigate through numerous pages and piece together information.
AI systems excel at processing and distilling complex information. They can take a lengthy research paper and generate a concise summary of its key findings, synthesize information from multiple articles to provide an answer to a complex question, and achieve all of this in a personalized way.
It’s no longer about what everyone is searching for, but what YOU are searching for and how that information serves YOU.
This ability to synthesize information is already transforming how users interact with online content, offering a level of convenience that traditional search can’t match.

The future points toward an “agent-centric information access” model where specialized “knowledge agents” will deliver responses tailored to our needs.
SEO 2.0: Optimizing for AI Systems
Websites as Data Repositories
As AI agents increasingly bypass SERPs to extract information directly from websites, the fundamental role of websites is starting to evolve.
Rather than destination pages designed for human visitors, websites will gradually transform into structured data repositories from which AI systems extract information and serve it to users on a different platform.
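As a concrete illustration, schema.org JSON-LD embedded in a page (inside a `<script type="application/ld+json">` tag) is one existing convention for exposing content as machine-readable structured data; the field values below are hypothetical:

```json
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "What Is SEO 2.0?",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "datePublished": "2025-01-15",
  "about": "Optimizing content for AI systems"
}
```

Markup like this lets a machine recover the page’s key facts without guessing at its visual layout.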
The paper “Agent-Centric Information Access,” linked in the notes below, explores the same idea: “websites in the future might be a place where we store information, but it is probably not far-fetched to assume that eventually the habit of visiting websites might disappear completely.”
After all, 45 years of UX/UI development have not solved the fundamental problem of “Can I find the information I need in under 30 seconds?”
AI systems are solving this problem right now.
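A minimal sketch of what machine extraction can look like: using only the Python standard library, an agent can pull schema.org JSON-LD metadata out of a page without any layout-aware scraping. The sample HTML and field values below are hypothetical.

```python
import json
from html.parser import HTMLParser

class JSONLDExtractor(HTMLParser):
    """Collect the contents of <script type="application/ld+json"> blocks."""
    def __init__(self):
        super().__init__()
        self._in_jsonld = False
        self.items = []

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs with lowercased names
        if tag == "script" and ("type", "application/ld+json") in attrs:
            self._in_jsonld = True

    def handle_endtag(self, tag):
        if tag == "script":
            self._in_jsonld = False

    def handle_data(self, data):
        if self._in_jsonld:
            self.items.append(json.loads(data))

# Hypothetical page with embedded structured data
html = """<html><head>
<script type="application/ld+json">
{"@context": "https://schema.org", "@type": "Article",
 "headline": "SEO 2.0", "author": {"@type": "Person", "name": "Jane Doe"}}
</script>
</head><body>...</body></html>"""

parser = JSONLDExtractor()
parser.feed(html)
print(parser.items[0]["headline"])  # prints "SEO 2.0"
```

The point of the sketch: once the data is structured, "finding the information" is a dictionary lookup, not a 30-second scan of a rendered page.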
SEO 1.0 vs. SEO 2.0
This is how SEO fundamentals are evolving:
| Dimension | SEO 1.0 (Traditional) | SEO 2.0 (AI-Centric) |
|---|---|---|
| Primary Goal | Rank highly in SERPs | Be cited by AI systems |
| Key Metrics | Rankings, CTR, traffic | Extraction rate, citation frequency |
| Content Focus | Keywords, human readability | Structured data, machine parseability |
| Technical Priority | Crawlability, indexability | Data accessibility, fast processing |
| Authority Signals | Backlinks, domain age | Expertise signals, cross-platform presence |
| User Journey | Click → Read → Return to SERP | Ask → Receive direct answer |
| Platform Focus | Google-centric | Multi-platform, agent ecosystem |
| Content Volume | More content = more rankings | Focused expertise > quantity |
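On the "data accessibility" row: one small, practical step is deciding which AI crawlers may read your site. The sketch below uses user-agent tokens the vendors have published (GPTBot for OpenAI, ClaudeBot for Anthropic, Google-Extended for Google’s AI features); tokens change, so verify each vendor’s current documentation before relying on them.

```text
# robots.txt — illustrative only; check each vendor's docs for current tokens
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: Google-Extended
Allow: /

# Default rules for all other crawlers
User-agent: *
Allow: /
Disallow: /admin/
```

Under SEO 1.0 thinking, blocking crawlers protected content; under SEO 2.0 thinking, blocking AI crawlers can mean disappearing from the answers users actually see.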
The Path Forward
The evidence is clear: we’re witnessing the decline of traditional search as the gateway to information.
The combination of deteriorating SERP quality and the rise of more capable AI systems marks a shift in how information is discovered, accessed, and consumed.
The shift is already underway, but it won’t reach the full SEO 2.0 stage for another 2-3 years, maybe even 5 or 10.
As you’re reading this, we’re at a mid-point where SEO is in its 1.5 stage.
We’ll have to continue prioritizing traditional SEO, as that’s still where most searches occur, while preparing for the world of tomorrow, one in which our primary users will be AI systems rather than humans.
Notes:
https://www.researchgate.net/publication/379208332_Is_Google_Getting_Worse_A_Longitudinal_Investigation_of_SEO_Spam_in_Search_Engines
https://www.alixpartners.com/insights/102jze5/the-future-of-search-ai-driven-disruption-and-diversification/
https://dl.acm.org/doi/10.1007/978-3-031-56063-7_4
https://lawlibguides.sandiego.edu/c.php?g=1443311&p=10721367
https://seo.ai/blog/are-google-search-results-getting-worse-study
https://arxiv.org/html/2502.19298v1
https://theacsi.org/wp-content/uploads/2024/07/24jul_search-social-media_study.pdf