SEO 2.0 – The future is optimizing for AI systems and not humans

SEO is undergoing a fundamental shift. As you’re reading this, the traditional way of doing SEO is slowly going away, being replaced by a new form of SEO. This shift is inevitable; there’s nothing we can do to stop it. It’s only a matter of time.

In this article, I’ll attempt to explore all that is currently happening and where that might lead us.

Buckle up, it’s about to be a long read.

Here’s the main thesis:
– Search engine results pages (SERPs) are losing their efficacy as reliable information sources due to many factors, but something not discussed enough (especially in the SEO community) is the incoming contamination by AI-generated content that Google is not equipped to handle. Simultaneously, AI systems are growing in sophistication and are emerging as a new and better way to access digital information due to a supreme user experience. All of this points to a fundamental shift in what SEO is and how we’ll perform it in the future.

SEO 2.0 will represent a new approach to content and website optimization, one that focuses on making information accessible to AI systems rather than optimizing for traditional search engines and human readers.

1. SERPs Are Losing Efficacy and Becoming Contaminated with Garbage Content

User Frustration with SERP Quality

Users are increasingly frustrated with the declining quality of SERPs. Searching for high-quality information online nowadays feels like going through a labyrinth of advertisements, thin affiliate content, and AI-generated material.

Click-through rates for informational queries have plummeted by 71% in just the last year as users find themselves unable to trust the results they’re seeing.

One telling indicator is the habit of appending terms like “Reddit” to search queries, an attempt to find authentic, human-written content rather than accepting standard search results that consistently disappoint. This behavior reflects a lack of trust in traditional search, showcasing how users are creating their own filters to work around a system that no longer provides them with what they need.

AI-Generated Content Flooding the Web

As the frustration with SERP quality grows, there’s another shift that is silently happening. We’re already seeing more and more content being produced with AI, and that is only going to accelerate.

Large language models have made content creation so efficient and affordable that websites can now produce thousands of articles at minimal cost. Over half of all marketers now regularly use AI tools to generate long-form content, prioritizing production volume over depth and originality.

This exponential growth will alter the content ecosystem and, most importantly, the SERPs. The web was already struggling with poor-quality content saturation before AI arrived; now that problem is only going to get worse.

Google’s Struggle to Filter Low-Quality Content

Despite implementing increasingly sophisticated algorithms, Google is fighting a losing battle against the flood of AI-generated and SEO-optimized content.

Google’s recent algorithm updates have specifically targeted websites reliant on AI-generated content, particularly penalizing those failing to demonstrate the famous E-E-A-T (experience, expertise, authoritativeness, and trustworthiness). They’ve addressed “scaled content abuse” (mass-producing low-effort content), “site reputation abuse” (leveraging trusted domains to host low-quality content), and the repurposing of expired domains as content farms.

Yet, despite the efforts, low-quality content continues to dominate many search results as even Google’s own AI tools struggle to consistently differentiate between high-quality and low-quality sources, sometimes citing content farms alongside more credible references.

2. The Cat-and-Mouse Game Between AI Content and Google

AI Content Detection Challenges

Detecting AI-generated content presents a formidable technical challenge that grows with each advancement in language models. Current detection tools produce high rates of both false positives (incorrectly flagging human-written content as AI-generated) and false negatives (missing sophisticated AI outputs). Even essays written by humans before the advent of LLMs are sometimes misclassified as AI-generated content.
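To make the false-positive/false-negative trade-off concrete, here is a minimal sketch that scores a hypothetical detector’s predictions against human-labelled ground truth. The detector itself and the sample labels are invented for illustration; only the error-rate definitions are standard.

```python
def detector_error_rates(labels, predictions):
    """labels/predictions: lists of "ai" or "human", one per document.

    Returns (false_positive_rate, false_negative_rate), where a false
    positive is human-written text flagged as AI, and a false negative
    is AI text that slipped through as human.
    """
    fp = sum(1 for l, p in zip(labels, predictions) if l == "human" and p == "ai")
    fn = sum(1 for l, p in zip(labels, predictions) if l == "ai" and p == "human")
    humans = labels.count("human") or 1  # guard against empty classes
    ais = labels.count("ai") or 1
    return fp / humans, fn / ais

# Four human essays, four AI articles; a noisy detector misfires both ways.
truth = ["human"] * 4 + ["ai"] * 4
preds = ["ai", "human", "human", "human", "ai", "ai", "human", "human"]
fpr, fnr = detector_error_rates(truth, preds)
print(fpr, fnr)  # 0.25 0.5
```

Even this toy detector flags one real essay in four as AI while missing half the AI articles, which is the dynamic the paragraph above describes.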

Algorithm Updates Lag Behind

Google’s algorithmic countermeasures have also been lagging behind the latest SEO tactics, hence the current state of the SERPs. AI-generated content brings this perpetual game of catch-up to another level.

The pattern has been consistent so far: Google identifies a specific manipulation tactic, develops countermeasures, and implements them through an algorithm update. However, as LLMs’ capabilities to create high-quality content continue to progress, the line between “human” and “AI-generated” content will blur until we reach a point where there’s no real difference. LLMs are already capable of producing content that is almost indistinguishable from human writing, though it takes very good prompting to achieve it.

My point is that Google’s ability to identify and filter AI-generated content hinges on retaining technical supremacy over every other LLM provider, which is an impossible task. Hence, the future of SERPs is in serious question.

This also explains why Google is not against AI content per se; they advise against it only when it doesn’t serve the user, and that’s understandable – they don’t have any other options.

3. AI Agents and LLMs Becoming the Primary Information Access Point

User Behavior Shift

As traditional search quality declines, users are pivoting to alternative information sources.

Traditional search volume is projected to decline by 25% over the next year as users increasingly turn to AI systems like ChatGPT, Claude, or Google Gemini. AI-powered features are already gaining prominence within traditional search interfaces. AI Overviews now appear in nearly 40% of Google searches, up from just 25% less than a year ago.

Sites that manage to be cited by AI systems receive substantial traffic, while those that fail to be included in AI-generated summaries face existential threats to their business models. This trend is accelerating as users discover the convenience and efficiency of AI-mediated information access.

Supreme User Experience

Traditional search engines were designed around keyword/topic matching. They excel at finding pages that match the topic/keyword of the searcher but often fail to understand the nuanced intent behind complex queries.

AI agents, by contrast, are built to comprehend natural language and contextual nuance, making them significantly more effective for conversational information seeking.

Users can now ask complex questions that include multiple parameters, contextual elements, and nuanced considerations.

For example, an AI agent can effectively handle a query like “What’s better for a family with kids under three and a dog, Bryce Canyon or Arches National Park?”, a question that would typically produce disappointing results in a traditional search.

This conversational approach aligns more naturally with how humans think and communicate, removing the artificial constraints of keyword searching.

If there’s one important thing we’ve learned from the last few decades of technological advancement, it is that products providing a supreme user experience almost always win. In our case, the difference between the experience of using AI systems and that of doing a Google search and digging through the SERPs couldn’t be bigger.

SERPs are going away and are bound to be replaced by AI systems; it’s only a matter of time.

LLMs as Content Curators

There’s an additional layer that we have to discuss. Unlike SERPs, which simply provide links to potentially relevant pages or AI overviews that answer informational queries, AI agents can synthesize information from multiple sources, offering direct answers and comprehensive summaries in a personalized way.

This capability directly addresses a major pain point associated with traditional search: the need to manually navigate through numerous pages and piece together information and the lack of personalization.

AI systems excel at processing and distilling complex information. They can take a lengthy research paper and generate a concise summary of its key findings or synthesize information from multiple articles to provide a holistic answer to a complex question and achieve all of this in a personalized way.
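The synthesis step described above can be sketched in a few lines. This is a toy model, not a real agent: it merges snippets from several mock “sources” (the source names and facts are invented), drops duplicates, and attaches a citation to each surviving fact, which is precisely the manual work SERPs leave to the user.

```python
def synthesize(sources):
    """sources: dict of {source_name: [facts]} -> deduplicated, cited facts."""
    seen = set()
    answer = []
    for name, facts in sources.items():
        for fact in facts:
            key = fact.lower().strip()
            if key not in seen:  # keep only the first occurrence of each fact
                seen.add(key)
                answer.append(f"{fact} [{name}]")
    return answer

# Hypothetical sources for the national-park question from earlier.
pages = {
    "park-service.gov": ["Bryce Canyon has paved rim trails",
                         "Arches bans dogs on most trails"],
    "travel-blog.com":  ["Arches bans dogs on most trails",
                         "Bryce Canyon shuttles are stroller-friendly"],
}
for line in synthesize(pages):
    print(line)
```

A real system adds ranking, conflict resolution, and personalization on top, but the core move is the same: one answer assembled from many pages.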

It’s no longer about what everyone is searching for but what YOU are searching for and how that information serves YOU.

This ability to synthesize information is already transforming how users interact with online content, offering a level of convenience that traditional search can’t match.

The future points toward an “agent-centric information access” model where specialized “knowledge agents” delivering tailored responses will become the primary information interface.

This vision of millions of personalized AI agents evolving with user needs represents a departure from static SERP rankings and a reinvention of how we access information online.

4. SEO 2.0: Optimizing for AI Systems

Websites as Data Repositories

As AI agents increasingly bypass SERPs to extract information directly from websites, the fundamental role of websites starts to evolve.

Rather than destination pages designed for human visitors, websites are slowly transforming into structured data repositories for AI systems to extract information and serve to users on different platforms.
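One plausible way a site becomes a “structured data repository” is by publishing its facts as schema.org JSON-LD alongside the human-readable page. The sketch below generates such a block; the field values and the choice of properties are illustrative assumptions, not a prescription.

```python
import json

def article_jsonld(headline, author, published, key_facts):
    """Build a schema.org Article block as an embeddable JSON-LD script tag.

    key_facts is a hypothetical list of machine-extractable claims that we
    fold into the `abstract` property for this illustration.
    """
    data = {
        "@context": "https://schema.org",
        "@type": "Article",
        "headline": headline,
        "author": {"@type": "Person", "name": author},
        "datePublished": published,
        "abstract": " ".join(key_facts),
    }
    return ('<script type="application/ld+json">\n'
            + json.dumps(data, indent=2)
            + "\n</script>")

snippet = article_jsonld(
    "SEO 2.0", "Jane Doe", "2025-01-01",
    ["AI systems are becoming the primary information access point."],
)
print(snippet)
```

The point is not this particular markup but the shift it represents: the canonical copy of the information is the structured one, and the rendered page is a view of it.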

The paper “Agent-centric information access,” linked in the notes below, explores this same idea: websites in the future might remain a place where we store information, but it is probably not far-fetched to assume that the habit of visiting websites will eventually fade. After all, following the logic of supreme user experience, why would you waste time visiting a website and navigating its complex architecture when you can simply ask your personal AI system for the answer?

Forty-five years of UX/UI practice have not solved the fundamental problem of “Can I find the information I need in less than 30 seconds?”, yet AI systems are already solving it.

LLMs will become the main point of access, with websites retaining importance mainly in cases like shopping, where visual input is genuinely needed.

SEO 1.0 vs. SEO 2.0

This is how SEO fundamentals are evolving:

Dimension | SEO 1.0 (Traditional) | SEO 2.0 (AI-Centric)
Primary Goal | Rank highly in SERPs | Be cited by AI systems
Key Metrics | Rankings, CTR, traffic | Extraction rate, citation frequency
Content Focus | Keywords, human readability | Structured data, machine parseability
Technical Priority | Crawlability, indexability | Data accessibility, fast processing
Authority Signals | Backlinks, domain age | Expertise signals, cross-platform presence
User Journey | Click → Read → Return to SERP | Ask → Receive direct answer
Platform Focus | Google-centric | Multi-platform, agent ecosystem
Content Volume | More content = more rankings | Focused expertise > quantity
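“Extraction rate” in the table is a speculative metric, but the mechanics behind it are easy to sketch: an agent pulling JSON-LD blocks out of raw HTML. The example below uses only the Python standard library; the sample page is invented.

```python
import json
from html.parser import HTMLParser

class JSONLDExtractor(HTMLParser):
    """Collect every application/ld+json script block from an HTML page."""

    def __init__(self):
        super().__init__()
        self.in_jsonld = False
        self.blocks = []

    def handle_starttag(self, tag, attrs):
        if tag == "script" and ("type", "application/ld+json") in attrs:
            self.in_jsonld = True

    def handle_endtag(self, tag):
        if tag == "script":
            self.in_jsonld = False

    def handle_data(self, data):
        if self.in_jsonld and data.strip():
            self.blocks.append(json.loads(data))

page = ('<html><head><script type="application/ld+json">'
        '{"@type": "Article", "headline": "SEO 2.0"}'
        '</script></head><body>...</body></html>')
parser = JSONLDExtractor()
parser.feed(page)
print(parser.blocks[0]["headline"])  # SEO 2.0
```

A page whose facts survive this kind of pass intact is “extractable”; one that buries them in layout markup is not, which is the practical difference between the two columns above.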

The Path Forward: Adaptation

The convergence of evidence is clear: we’re witnessing the decline of traditional search as the primary gateway to online information. The combination of deteriorating SERP quality and the rise of more capable AI systems marks a fundamental shift in how information is discovered, accessed, and consumed.

The shift is already happening, but it likely won’t reach the full SEO 2.0 stage within the next 2-3 years; more time might be needed.

As you’re reading this, we’re at a convergence point we could call SEO 1.5, in which we have to prioritize traditional SEO, as this is still where most searches happen, while also preparing for the world of tomorrow, one in which our primary user will be AI systems and not humans.

Notes:
https://www.researchgate.net/publication/379208332_Is_Google_Getting_Worse_A_Longitudinal_Investigation_of_SEO_Spam_in_Search_Engines
https://www.alixpartners.com/insights/102jze5/the-future-of-search-ai-driven-disruption-and-diversification/
https://dl.acm.org/doi/10.1007/978-3-031-56063-7_4
https://lawlibguides.sandiego.edu/c.php?g=1443311&p=10721367
https://seo.ai/blog/are-google-search-results-getting-worse-study
https://arxiv.org/html/2502.19298v1
https://theacsi.org/wp-content/uploads/2024/07/24jul_search-social-media_study.pdf
