Quick answer
Create an llms.txt file in your website's root directory to give AI systems a structured summary of your site, improving both how well they understand your content and how visible it is in AI-generated answers.
The llms.txt file was proposed by Jeremy Howard, co-founder of Answer.AI, in September 2024. The problem he was solving is one I keep running into when building GEO strategies for B2B clients: AI engines struggle to understand websites. Context windows are limited, HTML is noisy, and most web pages are designed for humans, not machines. An llms.txt file gives AI a clean entry point.
I’ll be honest: this is still a proposed standard. No major AI provider has officially committed to using llms.txt files for ranking or citations. But the trajectory matters. Gartner predicts traditional search volume will drop 25% by 2026 as users shift to AI-powered answer engines (Gartner). The companies preparing for that shift now, even with imperfect tools, will be better positioned than those waiting for a finished standard.
How an llms.txt file differs from robots.txt
The simplest way to understand an llms.txt file is to compare it to files you probably already have on your website: robots.txt and sitemap.xml. They all live in your root directory, but they do completely different jobs.
robots.txt is a gatekeeper. It tells search engine crawlers what NOT to access. “Don’t crawl this staging area. Don’t index this admin page.” It uses a directive format (User-agent, Disallow, Allow) and has been a web standard since the mid-1990s.
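For contrast, here is what that directive format looks like in a minimal, generic robots.txt; the paths are placeholders, not a recommendation for your site:

```
User-agent: *
Disallow: /admin/
Disallow: /staging/

Sitemap: https://yoursite.com/sitemap.xml
```

Note that it only says what crawlers may or may not fetch; it carries no meaning about the content itself.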
sitemap.xml is a map. It lists every URL on your site so search engines can discover pages efficiently. It tells crawlers where things are, but says nothing about what those pages contain or why they matter.
llms.txt is a briefing document. It tells AI systems what TO read and why it is important. Instead of blocking or listing, it provides context: “Here is what this company does, here are the key pages, and here is how they connect.” It uses Markdown, not XML or directives, because large language models parse Markdown far more efficiently than HTML.
The critical difference: robots.txt and sitemaps were designed for traditional search engine indexing. llms.txt was designed for AI comprehension. They are complementary, not competing.
| File | Purpose | Format | Designed for |
|---|---|---|---|
| robots.txt | Controls crawler access | Directives (Allow/Disallow) | Search engine bots |
| sitemap.xml | Lists URLs for discovery | XML | Search engine indexers |
| llms.txt | Provides structured context | Markdown | Large language models |
If you already have robots.txt and a sitemap, an llms.txt file is the third piece: not replacing what you have, but adding a layer specifically for AI consumption.
How llms.txt helps AI engines find and cite your content
It helps with AI search optimisation by giving language models a structured, low-noise version of your website content. Instead of parsing thousands of HTML pages, menus, footers, and cookie banners, an AI system can read a single Markdown file that explains who you are, what you do, and where your best content lives.
The shift towards AI search is already measurable. AI Overviews now appear in 25.11% of Google searches, up from 13.14% in March 2025 (Conductor). That means a quarter of all Google results now include an AI-generated answer at the top of the page. If your content is not structured for AI consumption, you are missing a growing share of visibility.
The conversion data makes the case even stronger. AI referral traffic converts at twice the rate of traditional organic traffic (Conductor). The visitors who arrive via AI search already have a clear intent and a specific question answered, so they are further along the buying journey when they land on your site.
I see this pattern across the B2B tech companies I work with. The ones investing in AI visibility now, even before standards are finalised, are the ones building an advantage. When I built a GEO strategy for Avalara, the llms.txt file was one small piece of a much larger approach that included onsite optimisation, offsite amplification, and content restructured for AI extractability. No single tactic wins on its own. But the companies treating AI readiness as a system, not a checkbox, are the ones seeing results.
What an llms.txt file looks like in practice
The format follows a straightforward structure defined by the llmstxt.org specification. The file is written in Markdown and contains these sections, in order:
- H1 header: Your company or project name (the only required element)
- Blockquote: A short summary of what you do, containing the key information an AI needs to understand everything else in the file
- Content paragraphs: Additional context, such as your positioning, key differentiators, or important notes
- H2 sections with links: Grouped lists of your most important pages, each with a descriptive name, URL, and optional annotation
An optional “Optional” H2 section can contain secondary resources that AI systems can skip when working with limited context windows.
Here is what a B2B company’s llms.txt file might look like:
# Acme Cloud Solutions

> Acme Cloud Solutions helps mid-market companies migrate to
> Azure and build cloud-native applications. Based in London,
> serving clients across the UK and Europe since 2015.

Acme specialises in Azure infrastructure, application
modernisation, and managed cloud services. Key industries
include financial services, healthcare, and manufacturing.

## Services

- [Azure Migration](https://acme.example.com/services/azure-migration): End-to-end cloud migration planning and execution
- [Application Modernisation](https://acme.example.com/services/app-modernisation): Legacy application refactoring for cloud-native architectures
- [Managed Cloud](https://acme.example.com/services/managed-cloud): Ongoing management, monitoring, and optimisation

## Case studies

- [NHS Trust Cloud Migration](https://acme.example.com/work/nhs-trust): Migrated 200+ workloads to Azure in 6 months
- [FinCo Data Platform](https://acme.example.com/work/finco): Built a real-time analytics platform processing 2M transactions daily

## Resources

- [Cloud Migration Guide](https://acme.example.com/resources/migration-guide): Step-by-step guide for planning an Azure migration
- [Blog](https://acme.example.com/blog): Latest thinking on cloud strategy and AI

## Optional

- [About Us](https://acme.example.com/about): Team, history, and credentials
- [Contact](https://acme.example.com/contact): Get in touch
The file is concise: 10-20 links, not hundreds. Each link has a clear description explaining what the page contains. The structure follows a logical hierarchy: who you are, what you do, proof it works, resources for learning more. That structure is deliberate. You are curating what AI systems see, not dumping your entire sitemap.
For most B2B companies, focus on your highest-value pages: services, case studies with measurable outcomes, key blog posts that demonstrate expertise, and any technical documentation. Leave out generic pages like privacy policies, team bios, and cookie notices.
llms.txt vs llms-full.txt: when to use each version
The llms.txt standard actually describes two files, and knowing which one you need matters.
llms.txt is the summary. It contains your company description, key context, and links to important pages. AI systems read the file, understand what your site covers, and can follow individual links when they need deeper detail on a specific topic. Most websites should start here.
llms-full.txt is the complete picture. It compiles all of your site’s important content into a single Markdown file. Instead of links that AI needs to follow, the full version contains the actual text of each page inline. An AI system can paste this single URL into its context window and immediately access everything.
The trade-offs are straightforward:
- llms.txt is easier to maintain, smaller in file size, and gives you more control over what AI systems prioritise. For a B2B company with 20-50 important pages, the summary version keeps things focused.
- llms-full.txt is more useful for developer documentation, API references, and technical products where an AI needs to understand interconnected concepts. If a developer asks Claude “how do I authenticate with the Acme API?”, having the full authentication docs available in one file means a more accurate answer.
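To make the inline-content idea concrete, here is a minimal Python sketch that assembles an llms-full.txt from Markdown page bodies you already have on hand. The function name and the sample content are my own illustrations, not part of the spec:

```python
def build_llms_full(site_name: str, summary: str, pages: dict[str, str]) -> str:
    """Assemble an llms-full.txt: page content inlined, not linked.

    `pages` maps a section title (e.g. "Azure Migration") to that
    page's Markdown body. Hypothetical helper, not an official tool.
    """
    parts = [f"# {site_name}", "", f"> {summary}", ""]
    for title, body in pages.items():
        parts += [f"## {title}", "", body.strip(), ""]
    return "\n".join(parts)

full = build_llms_full(
    "Acme Cloud Solutions",
    "Acme helps mid-market companies migrate to Azure.",
    {"Azure Migration": "End-to-end migration planning.\n\nStep one: audit workloads."},
)
```

The structure mirrors the summary file exactly; the only difference is that each H2 section carries the full page text instead of a link.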
There is also a related convention: providing Markdown versions of individual pages by appending .md to the URL (e.g., yoursite.com/services/azure-migration.md). Some developer documentation platforms support this natively. For most marketing sites, this is not practical to implement and not worth the effort yet.
My honest recommendation for most B2B companies: start with llms.txt. It takes 30 minutes to create, covers the essentials, and is easy to keep updated. Add llms-full.txt only if you have substantial technical documentation or API references where inline content genuinely helps AI systems give better answers. Do not overcomplicate it.
How to create an llms.txt file for your website
Creating an llms.txt file takes less time than most people expect. The process involves four steps: auditing your content, writing the file, uploading it, and keeping it maintained.
1. Audit your content
Start by listing the 10-20 pages that best represent your business. For a B2B company, that typically means:
- Your core service or product pages
- 2-5 case studies with quantified outcomes
- Your most authoritative blog posts or guides
- Technical documentation or API references (if applicable)
- Your about page (for company context)
The goal is curation, not comprehensiveness. You are deciding what an AI system should prioritise. A tightly focused llms.txt with 15 links will serve you better than one with 200.
2. Write the file in Markdown
Follow the spec from llmstxt.org:
- Start with an H1 containing your company name
- Add a blockquote summarising what you do in 2-3 sentences
- Optionally add a paragraph with additional context (positioning, key differentiators, industries)
- Create H2 sections grouping your links by category (Services, Case Studies, Resources)
- Each link should include a descriptive name, the URL, and a colon followed by a brief annotation
3. Upload to your root directory
Save the file as llms.txt and upload it to your website’s root directory, so it is accessible at yoursite.com/llms.txt. Most hosting platforms and CMS tools make this straightforward. If you use WordPress, you can upload via the file manager or an FTP client. If your site is built on a static site generator, add the file to your public directory.
4. Verify and maintain
Check that the file loads correctly by visiting the URL in your browser. Then keep it updated. When you publish a significant new case study, add a major service page, or retire outdated content, update your llms.txt file. Treat it like you would a sitemap: set a reminder to review it quarterly.
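Beyond loading the URL in a browser, you can sanity-check the file's structure against the llmstxt.org spec before uploading. This is a rough Python sketch of such a check; the function is my own, not an official validator:

```python
def validate_llms_txt(text: str) -> list[str]:
    """Return a list of problems; an empty list means the basics look right."""
    problems = []
    lines = [ln for ln in text.splitlines() if ln.strip()]
    if not lines or not lines[0].startswith("# "):
        problems.append("file must start with a single H1 (# Your Name)")
    if not any(ln.startswith("> ") for ln in lines[1:3]):
        problems.append("add a blockquote summary right after the H1")
    if not any(ln.startswith("## ") for ln in lines):
        problems.append("add at least one H2 section of links")
    return problems

draft = """\
# Acme Cloud Solutions

> Acme helps mid-market companies migrate to Azure.

## Services
- [Azure Migration](https://acme.example.com/services/azure-migration): Migration planning
"""
print(validate_llms_txt(draft))  # []
```

Run it against your draft before upload, and again against the live URL's content after, to catch anything a CMS template wraps around the raw file.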
If you prefer not to write the file manually, generator tools exist. Rankability’s llms.txt generator can create a spec-compliant file from your site. Plugins for static site generators such as VitePress and Docusaurus, and modules for CMSs such as Drupal, can auto-generate the file from your existing content structure.
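If you would rather script generation yourself, a small Python sketch along these lines produces a file matching the structure described above; all names, URLs, and descriptions here are placeholders:

```python
def generate_llms_txt(name: str, summary: str, sections: dict) -> str:
    """Build an llms.txt string.

    `sections` maps an H2 heading to a list of (title, url, note)
    tuples. Hypothetical helper for illustration only.
    """
    out = [f"# {name}", "", f"> {summary}", ""]
    for heading, links in sections.items():
        out.append(f"## {heading}")
        for title, url, note in links:
            out.append(f"- [{title}]({url}): {note}")
        out.append("")
    return "\n".join(out).rstrip() + "\n"

txt = generate_llms_txt(
    "Acme Cloud Solutions",
    "Acme helps mid-market companies migrate to Azure.",
    {"Services": [
        ("Azure Migration",
         "https://acme.example.com/services/azure-migration",
         "End-to-end cloud migration planning"),
    ]},
)
```

Feed the result to whatever writes files in your build pipeline, and the quarterly review becomes a matter of editing one data structure.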
So what is an llms.txt file, and should you create one?
An llms.txt file is a Markdown summary of your website, placed in your root directory, that gives AI systems a clean and structured way to understand your content. Proposed by Jeremy Howard in September 2024, it addresses a real problem: AI engines struggle to make sense of complex HTML pages, and a focused summary helps them represent your business accurately.
Should you create one? For most B2B companies, yes. It takes under an hour to implement, costs nothing, and positions your site for a shift that is already underway. The key points:
- llms.txt is a proposed standard that helps AI systems understand your website without parsing complex HTML
- It complements robots.txt and sitemaps rather than replacing them, adding an AI-specific layer
- Current adoption sits around 10%, with early movers concentrated in tech, developer tools, and SaaS
- No AI provider officially uses it for ranking yet, but companies like Anthropic, Cloudflare, and Stripe have implemented one, signalling where the standard is heading
- It should be part of a broader GEO strategy, not a standalone tactic, covering onsite structure, offsite engagement, and content designed for AI extractability
The companies I work with across B2B tech, cloud, and cybersecurity are all grappling with the same question: how do we show up in AI search? An llms.txt file is not the whole answer. But it is a practical first step, and the earlier you implement it, the more prepared you will be when AI engines start paying closer attention.
If you want help making your content visible in AI search, take a look at how we approach it.
