AI meets the web: Technical SEO strikes back
12 January 2026·
Billy Cheung
In the early days of the web, technical SEO was how content became visible to search engines.
Then, over the last decade, SEO became content-driven and experience-led. "Write for humans, not search engines."
Higher rankings were awarded for better user experience, longer time spent on a page, and content of higher relevance, credibility, and quality - to name but a few signals.
(Or rather, a website's ranking was lowered or penalised for not complying with those best practices.)
And now, the playbook for gaining impressions and awareness is being rewritten again.
With the rise of ChatGPT and other AI-powered models, a fundamental assumption behind SEO is breaking: humans are no longer the only readers.
What has changed?
In traditional SEO, the flow was simple:
A human searched for something.
↓
The search engine presented ranked results.
↓
The human clicked, visited, and (hopefully) stayed on a website.
Visibility, ranking, and engagement all depended on how well a website performed for humans using a browser.
But with AI agents, that sequence is changing.
Now, AI stands in the middle.
When a human asks a question, the AI first checks its training data. If it needs fresh information, it performs a search and fetches content from websites - not by opening a link in a browser as a human would, but by parsing structured data, APIs, and context files.
There's no scrolling, no clicking, no time on page. The AI reads, interprets, and synthesises what it finds - often without the user ever visiting your site.
That shift explains why so many websites - from e-commerce platforms to news outlets, forums, and even Wikipedia - are seeing a decline in traffic and engagement metrics.
It's not that content is less visible. It's that humans are no longer the ones doing the viewing.
Why have metrics been affected?
Website analytics tools have always relied on client-side tracking - JavaScript snippets and cookies that record events when someone interacts through a browser.
Page views, clicks, session durations, conversions - all triggered by human activity.
But when an AI agent reads a webpage, it doesn't load it in a browser or execute client-side code. It simply parses the raw HTML and scripts to extract the information relevant to a given prompt.
So, while your content might be read - and even quoted elsewhere - the interaction isn't captured by Google Analytics or most analytics setups, at least not easily or by default.
(Some of the possible workarounds will be covered later in this article.)
In simple terms: AI can "consume" your content without leaving a trace in your metrics.
This doesn't mean the content is losing relevance - it simply means that visibility now extends beyond traditional analytics. It's time to view through the lens of AI and think about machine engagement alongside human engagement.
How AI reads a website
AI doesn't "visit" a website the way humans do. Again, it parses the underlying code that renders the page.
Here are some key components - familiar, perhaps forgotten, but now more relevant than ever if content is to be truly accessible to AI.
Metadata
Metadata helps AI (and search engines) understand the basic context of a webpage.
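For instance, a page's head might carry tags like these (the title and description values are illustrative, not from a real page):

```html
<head>
  <title>AI meets the web: Technical SEO strikes back</title>
  <meta name="description" content="How AI agents read websites, and what that means for SEO.">
  <meta property="og:title" content="AI meets the web: Technical SEO strikes back">
  <meta property="og:type" content="article">
</head>
```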
Structured Data / JSON-LD
Schema markup describes what the content is, not just what it says.
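For example, an article page could embed JSON-LD roughly like this (a sketch using schema.org's Article type, not this page's actual markup):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "AI meets the web: Technical SEO strikes back",
  "author": { "@type": "Person", "name": "Billy Cheung" },
  "datePublished": "2026-01-12"
}
</script>
```

This tells a machine "this is an article, by this person, on this date" - without it having to infer that from the layout.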
sitemap.xml
A sitemap guides crawlers (and now AI agents) through your website's structure.
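A minimal sitemap looks like this (the URL is a placeholder):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/blog/ai-meets-the-web</loc>
    <lastmod>2026-01-12</lastmod>
  </url>
</urlset>
```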
robots.txt
The robots.txt file tells crawlers (previously search engines, and now AI) what they can or can't access.
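A sketch of a robots.txt that explicitly addresses AI crawlers alongside everyone else (GPTBot and ClaudeBot are crawler tokens published by OpenAI and Anthropic; names change, so check each vendor's current documentation):

```
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: *
Allow: /

Sitemap: https://example.com/sitemap.xml
```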
The new player: Model Context Protocol (MCP)
Here's where the new meets the old.
MCP is an emerging standard that defines how AI models communicate directly with web data sources.
Instead of crawling, AI can query a website's data in a structured way - this is essentially the API for AI.
A simplified pseudocode illustration:
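(The sketch below uses plain Python as the pseudocode and a hypothetical in-memory "server"; the resource URIs and function names are illustrative, not a real MCP SDK API. Real MCP servers speak JSON-RPC over stdio or HTTP.)

```python
# Hypothetical, in-memory sketch of the MCP idea: an AI client asks a
# "server" for a named resource instead of scraping rendered HTML.

RESOURCES = {
    "site://products/123": {
        "name": "Ergonomic Chair",
        "price": "129.00",
        "currency": "GBP",
    },
}

def list_resources() -> list[str]:
    """Advertise what the site exposes (akin to MCP resource listing)."""
    return sorted(RESOURCES)

def read_resource(uri: str) -> dict:
    """Return structured data for one resource - no HTML parsing needed."""
    if uri not in RESOURCES:
        raise KeyError(f"unknown resource: {uri}")
    return RESOURCES[uri]

# An AI agent's side of the exchange:
available = list_resources()
product = read_resource(available[0])
print(product["name"])  # structured answer, no scraping
```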
Other emerging practices
- Feeds & APIs: Public APIs or Atom/RSS feeds are still valuable for discoverability.
- Other access files: New .well-known files (like .well-known/mcp) and .txt files (llms.txt and agents.txt) may become part of the technical SEO toolkit.
- Performance: Page speed and accessibility still matter - if AI struggles to parse a website, it may skip the data entirely.
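llms.txt is still an informal proposal, but a minimal file follows a simple shape - a title, a short summary, then curated links (paths here are placeholders):

```
# Example Site
> A short description of what this site offers and who it is for.

## Docs
- [Getting started](https://example.com/docs/start): setup guide
- [API reference](https://example.com/docs/api): endpoints and schemas
```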
In short, the mechanics are familiar. Apart from MCP and the standards yet to come, what we're really doing is rediscovering the old technical SEO fundamentals - with a 2025 upgrade.
The SEO landscape in the AI Era
Some are calling it AEO (Answer Engine Optimisation). But, stripping away the buzzwords, it's still the same foundation:
Quality content, clear structure, and accessible data.
The key difference is that machines are now part of the audience.
They're intermediaries - not replacing humans, but interpreting on their behalf.
To stay future-proof:
- Maintain the same focus on content quality.
- Re-invest in technical SEO to ensure content is machine-readable.
- Align teams: marketers, web developers, and data engineers all play a role in how visible information becomes to AI.
Bonus: Redefining reach and metrics
The question isn't "How do we increase traffic?" - it's "How do we measure visibility in an AI-mediated world?"
When AI agents read your content, they often don't trigger analytics. But you can still capture signals through server-side logic and new instrumentation.
Complement client-side analytics with server-side tracking
This intercepts traffic and determines whether AI is requesting data - even when no JavaScript runs.
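A rough sketch of the idea, assuming your server can see the User-Agent header on each request; the token list is illustrative and would need maintaining against the crawler names each vendor publishes:

```python
# Classify a request as a likely AI agent from its User-Agent header.
# Token list is illustrative - keep it in sync with vendor documentation.
AI_AGENT_TOKENS = ("gptbot", "claudebot", "perplexitybot", "google-extended")

def is_ai_agent(user_agent: str) -> bool:
    ua = user_agent.lower()
    return any(token in ua for token in AI_AGENT_TOKENS)

def log_visit(path: str, user_agent: str, store: list) -> None:
    """Record every hit server-side, tagged human vs machine,
    regardless of whether any client-side JavaScript ever ran."""
    store.append({"path": path, "ai": is_ai_agent(user_agent)})

hits: list[dict] = []
log_visit("/blog/post", "Mozilla/5.0 (compatible; GPTBot/1.1)", hits)
log_visit("/blog/post", "Mozilla/5.0 (Windows NT 10.0) Chrome/120", hits)
print(sum(h["ai"] for h in hits))  # one machine read captured
```

In practice this logic would live in web-server middleware or log processing, feeding the same warehouse as your client-side analytics.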
Integrate tracking with the MCP server
This way, the server can emit signals when AI agents are consuming structured content.
Redefine your visibility metrics
Traditionally, there are:
- Page views
- Bounce rate
- Average session duration
Now, it may be worth adding machine-specific metrics, like:
- AI reads - when an agent requests or parses your content.
- Machine referrals - mentions or citations by AI platforms.
- API usage rate - how often your structured data endpoints are hit.
- Hybrid reach - humans + machines combined.
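One way to make these concrete - a sketch that aggregates server-side hit records (each tagged human vs machine) into the experimental metrics above; the field names are illustrative:

```python
# Aggregate server-side hit records into experimental reach metrics.
# Each hit is {"path": ..., "ai": bool}, as server-side logging might produce.

def reach_metrics(hits: list[dict]) -> dict:
    ai_reads = sum(1 for h in hits if h["ai"])
    human_views = len(hits) - ai_reads
    return {
        "ai_reads": ai_reads,       # agent requested/parsed content
        "page_views": human_views,  # the classic human metric
        "hybrid_reach": len(hits),  # humans + machines combined
    }

sample = [
    {"path": "/a", "ai": True},
    {"path": "/a", "ai": False},
    {"path": "/b", "ai": False},
]
print(reach_metrics(sample))
# {'ai_reads': 1, 'page_views': 2, 'hybrid_reach': 3}
```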
None of these are formalised industry standards yet - but experimenting early helps you stay ahead of the next analytics wave.
While no one can predict how AI will evolve, one thing remains clear: data accessibility and quality are the cornerstones of visibility.
Everything else - algorithms, rankings, metrics - will keep shifting.
But the principles don't change: make your content understandable, structured, and accessible.
The rest is the same as it's always been: stay curious, keep learning, and stay relevant.
