If you've spent any time in SEO circles lately, you've seen people talking about llms.txt. There's plenty of confusion about the name (llm.txt, llms.txt, llm.text), and while the file is real, the hype around it is getting ahead of the facts.
Here's what it actually is, what it does, and the honest answer on whether it's worth your time right now.
What is llms.txt?
llms.txt is a proposed standard, created by Jeremy Howard at fast.ai. The idea: put a plain-text Markdown file at the root of your website (/llms.txt) that tells AI crawlers which pages on your site matter most.
Think of it as robots.txt for the AI era. Robots.txt tells search engine crawlers which paths to stay out of; llms.txt tells AI systems what to pay attention to.
The problem it's solving is real. When AI crawlers visit a website, they see navigation menus, cookie banners, JavaScript-rendered content, sidebars, footers — a mess of HTML that's optimized for human readers, not for machines trying to extract useful information. llms.txt gives AI a clean, prioritized list of what actually matters on your site.
What does it look like?
It's a simple Markdown file. A basic example:
# Is It Good AI
> Reviews and comparisons of major AI models — Claude, ChatGPT, Gemini, and more.

## Best For
- [Best LLM for Writing](/best-llm-for/writing): Our pick for writing tasks
- [Best Free AI Chatbot](/best-for/free-web-chat): Top free options ranked

## Model Reviews
- [All Models](/models): Browse every model we've reviewed
- [Overall Rankings](/rankings/overall): Quality scores compared
That's it. H1 for your site name, a short description, then sections with links and brief descriptions of what they lead to. Valid Markdown, readable by both humans and machines.
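That simplicity is the point: the format is regular enough that a tool can pull out the site name, sections, and links with a few lines of code. Here's a minimal parser sketch in TypeScript; the `LlmsLink`/`LlmsDoc` shapes are illustrative assumptions, not part of the spec, which only requires valid Markdown.

```typescript
// Minimal sketch of parsing an llms.txt file into site name,
// sections, and links. The shapes below are illustrative only.
interface LlmsLink { title: string; url: string; description: string }
interface LlmsDoc { name: string; sections: Record<string, LlmsLink[]> }

function parseLlmsTxt(text: string): LlmsDoc {
  const doc: LlmsDoc = { name: "", sections: {} };
  let current = "";
  for (const line of text.split("\n")) {
    const h1 = line.match(/^# (.+)/);       // site name
    const h2 = line.match(/^## (.+)/);      // section heading
    const link = line.match(/^- \[(.+?)\]\((.+?)\)(?::\s*(.*))?/); // link item
    if (h1) {
      doc.name = h1[1].trim();
    } else if (h2) {
      current = h2[1].trim();
      doc.sections[current] = [];
    } else if (link && current) {
      doc.sections[current].push({
        title: link[1],
        url: link[2],
        description: link[3] ?? "",
      });
    }
  }
  return doc;
}
```

Real consumers may do more (follow links, fetch the .md versions), but the core structure is this easy to extract.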
Where does it go?
The root of your domain: https://yourdomain.com/llms.txt
Not in a subdirectory. Not at /docs/llms.txt. The root. Same convention as robots.txt and sitemap.xml.
There's also a companion convention: for any page on your site, you can provide a clean Markdown version at the same URL with .md appended. So yoursite.com/about becomes yoursite.com/about.md. This helps AI systems get clean, unformatted content without the HTML noise. It's optional but useful.
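If you want to link to those .md companions programmatically, the mapping is a one-line path rewrite. A sketch of a hypothetical helper (the proposal doesn't pin down what the root path maps to, so the `/index.md` fallback here is an assumption):

```typescript
// Sketch: map a page URL to its Markdown companion per the optional
// ".md appended" convention. Hypothetical helper, not part of any spec.
function markdownCompanion(pageUrl: string): string {
  const u = new URL(pageUrl);
  const path = u.pathname.replace(/\/$/, ""); // drop any trailing slash
  // Assumption: the bare root has no page segment, so fall back to /index.md.
  u.pathname = path === "" ? "/index.md" : `${path}.md`;
  return u.toString();
}
```

So `markdownCompanion("https://yoursite.com/about")` yields `https://yoursite.com/about.md`.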
The honest truth: major crawlers aren't reading it yet
Here's what the hype doesn't mention. Semrush ran a test from mid-August to late October 2025 and found that Google-Extended (Google's AI crawler), GPTBot (OpenAI's crawler), PerplexityBot, and ClaudeBot were not reading the llms.txt file on their test site. Zero visits from any of them to the llms.txt page during that period.
As of early 2026, fewer than 1,000 domains have published an llms.txt file — a tiny fraction of the web. Adoption is mostly limited to developer-focused tools, SaaS documentation sites, and early adopters in the SEO space.
None of the major AI companies have publicly committed to supporting the standard. It remains a proposal, not a requirement.
So is it worth doing?
Yes — for a specific reason.
The crawlers aren't reading it today. But implementing it takes about 30 minutes and costs nothing. When — if — the standard gets traction and crawlers start respecting it, you'll already have it in place. It's the kind of low-effort, future-proofing move that makes sense when the downside risk is zero.
There's also one category where it genuinely works right now: developer tools. Cursor, Continue, and other AI-powered coding assistants actively look for and use llms.txt files. If you're maintaining a software project or documentation site, llms.txt can meaningfully improve how AI coding tools understand your project.
The confusion about the file name
You'll see it written multiple ways: llm.txt, llms.txt, llm.text. The official name is llms.txt (with the “s”). Search traffic is split across the variations, which is what keeps the confusion alive.
Use llms.txt — that's the canonical name from llmstxt.org.
How to create one in 30 minutes
- Create a plain text file named llms.txt
- Add an H1 with your site or brand name
- Add a blockquote (>) with a one-line description of what your site does
- Add H2 sections for major content areas (products, documentation, blog, etc.)
- Under each section, list your most important pages as Markdown links with brief descriptions
- Upload it to your domain root so it's accessible at yourdomain.com/llms.txt
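The steps above can be sketched as a small script that assembles the file from a plain data structure. Everything here (the `Page` shape, the sample names and paths) is a placeholder, not anything the standard prescribes:

```typescript
// Sketch: build an llms.txt string from a plain data structure.
// The Page shape and all sample values are placeholders.
interface Page { title: string; path: string; note: string }

function buildLlmsTxt(
  name: string,
  description: string,
  sections: Record<string, Page[]>,
): string {
  // H1 with the site name, then a blockquote description, then a blank line.
  const lines = [`# ${name}`, `> ${description}`, ""];
  for (const [heading, pages] of Object.entries(sections)) {
    lines.push(`## ${heading}`); // H2 per content area
    for (const p of pages) {
      lines.push(`- [${p.title}](${p.path}): ${p.note}`); // link + brief note
    }
    lines.push("");
  }
  return lines.join("\n");
}
```

Write the result to a file named llms.txt and upload it to your domain root, and you're done.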
If you're on Next.js, you can serve it as a static file in the /public folder or generate it dynamically from a route handler. Both work.
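For the route-handler option, a minimal sketch assuming the App Router (a file at app/llms.txt/route.ts; the content string is a placeholder you'd replace or generate from your own data):

```typescript
// Sketch: app/llms.txt/route.ts — serve llms.txt from a Next.js
// App Router route handler. The content below is a placeholder.
const LLMS_TXT = `# Your Site
> One-line description of what your site does.

## Docs
- [Getting Started](/docs/getting-started): Setup guide
`;

export function GET(): Response {
  return new Response(LLMS_TXT, {
    headers: { "Content-Type": "text/plain; charset=utf-8" },
  });
}
```

The static-file route is even simpler (drop the file in /public), but generating it from a route handler keeps it in sync if your important pages come from a CMS or database.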
Bottom line
llms.txt is a real proposal solving a real problem — the mess of HTML that AI crawlers have to wade through to understand what's important on your site. The major crawlers aren't using it yet, which limits its immediate impact. But it's a 30-minute implementation with zero downside, and developer tools are already using it today.
If you have a documentation site or developer-facing product, do it now. If you have a general content site, put it on the list and come back to it in six months when adoption may have moved.