What is Answer Engine Optimization (AEO)? AEO is the practice of structuring your website so that AI agents (Claude, ChatGPT, Perplexity, Gemini) can read, understand, and cite your content when answering user queries. Where SEO was about ranking in a list of blue links, AEO is about being the answer itself.
Okay, “SEO is dead” is a little dramatic. Google still drives traffic. Backlinks still matter. But here’s the uncomfortable truth: the way people find information is undergoing the fastest structural shift since the invention of the search box, and most businesses haven’t started optimizing for the new paradigm.
Third-party research from Semrush and Similarweb shows that roughly 60% of Google searches are now zero-click. Users ask a question, an AI Overview synthesizes an answer, and they never visit a single website. The ten blue links still exist. Most people just don’t scroll to them anymore.
Meanwhile, a generation of users has formed a new habit: they ask Claude, ChatGPT, or Perplexity directly. No SERP. No scrolling. No clicking. Just an answer, and somewhere in that answer, a citation. The question is whether that citation is your site or your competitor’s.
This is the AEO opportunity. And right now, most businesses haven’t started taking it seriously.
How Does Claude Actually “See” a Website?
Understanding how AI agents read the web is the foundation of everything that follows. It’s fundamentally different from how Google works, and conflating the two leads to a completely wrong optimization strategy.
How Google crawls: Googlebot fetches your HTML, executes JavaScript (Google is one of the very few crawlers that bothers), indexes your words, analyzes your links, and then runs everything through a ranking algorithm with hundreds of signals. It is, essentially, a massive relevance-sorting machine for a keyword query.
How Claude reads: When you search on Claude.ai, Claude queries Brave Search’s index as its backend. It receives top results and their text, then synthesizes an answer across those sources. Separately, Anthropic operates several crawlers: ClaudeBot gathers training data, Claude-SearchBot indexes content to improve search result quality, and Claude-User fetches pages when individual users direct Claude to browse specific URLs. These crawlers do not run JavaScript. They read raw HTML. If your content only exists after JavaScript renders it, it’s invisible to them.
The critical distinction is this: Google sorts pages by relevance. Claude reads pages for comprehension. It’s not trying to decide if your page is more relevant than 10 million other pages for “best project management software.” It’s trying to extract a coherent, citable answer to give a user. The signal it optimizes for isn’t keywords or backlinks. It’s clarity, structure, and extractability.
| Signal | Google SEO | Claude / AEO |
|---|---|---|
| Primary goal | Rank pages by relevance | Extract citable answers |
| JavaScript | Renders it | Does not execute it |
| Backlinks | Critical ranking signal | Minimal relevance |
| Keywords | Core signal | Understands semantics, not keywords |
| Content structure | Helpful | Critical |
| Schema markup | Moderate value | High value |
| Content depth | Values comprehensiveness | Values precision and directness |
| Update frequency | Rewards freshness | Rewards clarity |
The implications of this table are significant. You can have zero backlinks and still get cited by Claude, if your content clearly answers the question. And you can rank #1 on Google and never appear in an AI answer, if your content is buried in JavaScript or structured for keyword density rather than comprehension.
How to Structure Your Site for Agent Visibility
1. Fix Your robots.txt First
Before anything else, make sure AI crawlers can actually reach you. An alarming number of sites block these bots accidentally through overly aggressive robots.txt configurations.
Add explicit allow rules for the major AI user agents:
```txt
# Anthropic (Claude)
User-agent: ClaudeBot
Allow: /

User-agent: Claude-SearchBot
Allow: /

User-agent: Claude-User
Allow: /

# OpenAI (ChatGPT)
User-agent: GPTBot
Allow: /

# Perplexity
User-agent: PerplexityBot
Allow: /
```

A note on Google-Extended: this is Google’s user agent token for controlling whether your content is used to train Gemini and Vertex AI models. It does not affect your Google Search rankings or visibility. Allowing it is a separate decision from general SEO, and many publishers deliberately block it. If you’re comfortable with Google using your content for AI training, add Allow: / for Google-Extended as well. If not, you can block it without any impact on your search rankings.
If you block the other bots listed above, you are making yourself harder to find in the AI systems where your customers increasingly start their research.
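Before deploying a new robots.txt, you can confirm the rules do what you intend using Python's standard-library robot parser. A minimal sketch, evaluated offline against a hypothetical policy (the URLs and bot names besides ClaudeBot are placeholders):

```python
from urllib.robotparser import RobotFileParser

# A hypothetical policy: allow ClaudeBot everywhere,
# keep everyone else out of /private/.
robots_txt = """\
User-agent: ClaudeBot
Allow: /

User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# ClaudeBot matches its specific record, which allows everything.
print(rp.can_fetch("ClaudeBot", "https://example.com/blog/post"))      # True
# An unlisted bot falls through to the * record.
print(rp.can_fetch("SomeOtherBot", "https://example.com/private/x"))   # False
```

Running this check against your live file (via `RobotFileParser.set_url` and `.read()`) is a quick way to catch an overly broad Disallow before it hides you from AI crawlers.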
2. Add an llms.txt File
llms.txt is an emerging standard proposed by Jeremy Howard of Answer.AI in September 2024. Think of it as a curated table of contents for AI systems. Where robots.txt tells crawlers where not to go, llms.txt tells AI agents what your site is about and which pages matter most.
Place it at yourdomain.com/llms.txt. The format is simple markdown with links:
```markdown
# Your Company Name

> Brief description of what you do.

## Core Content

- [About](https://yoursite.com/about): Who we are
- [Services](https://yoursite.com/services): What we offer
- [Blog](https://yoursite.com/blog): Our writing
```

To be transparent about the current state of this standard: as of early 2026, no major LLM provider (including Anthropic, OpenAI, or Google) has officially confirmed that their systems read llms.txt during inference. Google has been openly skeptical. But the standard has real grassroots traction, with hundreds of thousands of sites implementing it, including the documentation sites of Anthropic, Cloudflare, and Stripe (typically auto-generated through their docs platform, Mintlify). The cost of creating this file is about 15 minutes of work, and if adoption continues on its current trajectory, you’ll be glad you did it early.
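If your key pages already live in a config or sitemap, generating the file takes only a few lines. A sketch with placeholder names and URLs (the function and its inputs are hypothetical, not part of any llms.txt tooling):

```python
def build_llms_txt(name, description, sections):
    """Render an llms.txt document from a site name, a one-line
    description, and a {section: [(title, url, note), ...]} mapping."""
    lines = [f"# {name}", "", f"> {description}", ""]
    for section, links in sections.items():
        lines.append(f"## {section}")
        for title, url, note in links:
            lines.append(f"- [{title}]({url}): {note}")
        lines.append("")  # blank line after each section
    return "\n".join(lines)

doc = build_llms_txt(
    "Acme Co",
    "We make widgets.",
    {"Core Content": [("About", "https://acme.example/about", "Who we are")]},
)
print(doc)
```

Write the result to a file served at your domain root, and regenerate it whenever your key pages change.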
3. Serve Content in Static HTML
Remember: AI crawlers don’t run JavaScript. If your page content loads dynamically after the initial HTML response, crawlers see an empty shell.
The test is simple: open your browser’s developer tools, disable JavaScript, and reload the page. If your content disappears, AI agents can’t read it.
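The same test can be scripted. The sketch below checks whether a phrase appears in a page's raw HTML response, which is all a non-JavaScript crawler ever sees; the fetch function is injectable so the demo runs on two hypothetical responses rather than live URLs:

```python
import urllib.request

def raw_html_contains(url, phrase, fetch=None):
    """Return True if `phrase` appears in the raw HTML response for
    `url` -- i.e. in what a non-JS crawler actually receives."""
    if fetch is None:
        def fetch(u):
            req = urllib.request.Request(u, headers={"User-Agent": "aeo-check/0.1"})
            with urllib.request.urlopen(req, timeout=10) as resp:
                return resp.read().decode("utf-8", errors="replace")
    return phrase in fetch(url)

# Hypothetical responses: a static page vs. a JS app shell.
static_page = "<html><body><h1>What is AEO?</h1><p>AEO is ...</p></body></html>"
spa_shell = '<html><body><div id="root"></div><script src="/app.js"></script></body></html>'

print(raw_html_contains("https://example.com/a", "What is AEO?",
                        fetch=lambda u: static_page))  # True
print(raw_html_contains("https://example.com/b", "What is AEO?",
                        fetch=lambda u: spa_shell))    # False
```

Run it against your own pages with a phrase from each page's opening paragraph; a False on a page that looks fine in the browser means the content only exists after JavaScript renders it.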
If you’re on a static site generator or framework with static export (Next.js, Astro, Hugo, etc.), you’re already in good shape. Keep it that way. Any new dynamic features should still render critical text content in the initial HTML.
4. Structure Content Around Questions, Not Keywords
Google rewarded you for knowing that someone searching “CRM software” was looking for a list of tools. Claude rewards you for directly and clearly answering the question “What CRM software is best for a 10-person sales team, and why?”
Practically, this means:
Write headers as questions. “What is AEO?” performs better than “AEO Overview.” The question matches how people actually prompt AI systems, and it makes your content directly extractable as an answer.
Lead with the answer, then explain. Traditional SEO content buried the answer to keep readers on page longer. AI agents reward the opposite pattern. State your conclusion clearly in the first paragraph, then support it. This is sometimes called “inverted pyramid” writing and it directly maps to how LLMs extract citable content.
Define your terms. AI systems build knowledge graphs. If your page clearly defines “what AEO is” at the top, that definition can be cited in any number of downstream queries.
5. Use Schema Markup
Schema markup is structured data embedded in your HTML that explicitly tells crawlers what type of content they’re reading. It’s been part of SEO for years but is increasingly critical for AI comprehension.
Key schema types for AEO:
- FAQPage: Mark up Q&A content so AI systems can extract individual answers directly.
- Article: Establishes author, publication date, and content context.
- HowTo: Step-by-step content that AI can extract and reformat for users.
- Organization: Tells AI systems who you are, what you do, and how to find you.
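FAQPage markup, for example, is typically embedded as JSON-LD in a script tag in your page's head. A minimal sketch that generates it in Python (the questions, answers, and helper name are placeholders, not an established API):

```python
import json

def faq_jsonld(qa_pairs):
    """Build a schema.org FAQPage object from (question, answer) pairs."""
    return {
        "@context": "https://schema.org",
        "@type": "FAQPage",
        "mainEntity": [
            {
                "@type": "Question",
                "name": q,
                "acceptedAnswer": {"@type": "Answer", "text": a},
            }
            for q, a in qa_pairs
        ],
    }

data = faq_jsonld([
    ("What is AEO?", "AEO is the practice of structuring your site so AI agents can cite it."),
])

# Embed in the page as a JSON-LD script tag.
script_tag = '<script type="application/ld+json">{}</script>'.format(json.dumps(data))
print(script_tag)
```

Each question in the markup should match the wording of the visible heading it annotates, so the structured data and the page text reinforce each other.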
The “Research Paper” Hack: Why Charts and Data Beat Blog Posts
Here’s the most underused content strategy in AEO: publishing substantive, data-rich content that looks more like a research finding than a blog post.
Think about how Claude generates answers. When a user asks a complex question, Claude synthesizes across multiple sources and tries to give a credible, cited response. It heavily favors content that reads as authoritative and specific. A post titled “5 Tips for B2B Lead Generation” competes with ten thousand identical posts. A post titled “B2B Lead Conversion Rates by Industry: An Analysis of 847 Small Businesses” gets cited because it’s a primary source.
The strategies that work:
Publish original data. Survey your customers. Analyze your own metrics. Publish your findings. Even a small data set (n=50) is more citable than another opinion piece. AI systems are voracious for statistics and will cite them repeatedly.
Create reference tables and comparison frameworks. The table above comparing SEO vs. AEO signals is exactly the kind of content AI agents extract and cite. It’s structured, clear, and directly answers a question.
Write “state of” or “benchmark” posts. “The State of Ops Hiring in 2026” or “Benchmark: What Does a Series B Operations Team Actually Look Like?” These titles signal primary research and get cited accordingly.
Define things nobody has clearly defined. If there’s a term in your industry that everyone uses but nobody has pinned down precisely, write the definitive definition. AI systems will cite you every time anyone asks about it.
The underlying insight is that AI agents are trained to cite sources that make their answers more credible and specific. Vague opinion content doesn’t serve that function. Precise, structured, data-backed content does.
Applying This Strategy: A Note on This Post Itself
This post was written to practice what it preaches. If you look at its structure, you’ll notice a few deliberate choices:
The definition of AEO appears in the opening callout in the format AI agents prefer: “What is X? X is Y.” This is directly extractable as a citation.
The comparison table is structured to be read without context. A crawler that fetches only that table has everything it needs to understand the signal differences between Google SEO and Claude/AEO.
Headers are questions. “How Does Claude Actually See a Website?” matches the natural language query someone would type into Claude. “What is AEO?” is both a heading and a Q&A pair.
Content leads with answers. Each section states its conclusion first and then explains. An AI agent extracting a snippet from the top of any section will get a usable answer.
The post defines its primary term in the opening and uses it consistently throughout, reinforcing the definition that AI systems can associate with a growing cluster of related queries.
None of this required sacrificing readability for humans. That’s the real point: AEO and good writing are largely the same thing. Write clearly. Answer questions directly. Structure your ideas so they’re findable. The main audience has changed (increasingly it’s an AI agent deciding whether to cite you) but the underlying craft is what it always was.
Quick Implementation Checklist
- Update `robots.txt` to explicitly allow ClaudeBot, GPTBot, and PerplexityBot
- Create `llms.txt` at your domain root with a brief description and your key pages
- Verify your content renders in static HTML (disable JS and check)
- Add `FAQPage` or `Article` schema to your most important pages
- Rewrite at least one content page so its first paragraph directly answers the question its title poses
That’s it. You’re ahead of most of the internet.
Related reading: How Do You Measure AEO?, the measurement playbook for tracking citation frequency, AI referral traffic, and share of voice. | The AI Canon, a curated reading list for understanding the current state of large language models and what they mean for business.