llms.txt: The New robots.txt for AI — Complete Guide
Learn what llms.txt is, why your website needs one, and how to create it. The essential guide to making your site visible to AI assistants.
Just as robots.txt tells search-engine crawlers which parts of your site they may crawl, llms.txt tells AI assistants what your site is about and how to reference it.
What is llms.txt?
llms.txt is a Markdown-formatted text file placed at the root of your website (e.g., yoursite.com/llms.txt) that provides structured information for Large Language Models (LLMs) like ChatGPT, Claude, and Perplexity. It is a proposed convention rather than an enforced standard, so support varies by AI provider.
Why Do You Need One?
AI assistants are becoming major traffic sources. When someone asks ChatGPT "what's the best tool for X?", the AI draws from its training data and any accessible information about your site. An llms.txt file helps ensure the AI has accurate, up-to-date information about:
- What your business does
- What products or services you offer
- Your unique value proposition
- Key pages and content
How to Create llms.txt
Create a file called llms.txt in your website's root directory with this structure:
# Your Site Name
> A one- or two-sentence summary of what your site does.
## What is [Your Site]?
A brief description of what your site/business does.
## Key Features
- Feature 1
- Feature 2
- Feature 3
## Who is it for?
Describe your target audience.
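Before uploading the file, you can sanity-check its structure with a short script. This is a minimal sketch that mirrors the outline above (an H1 title, a `>` summary line, and `##` sections), not a full parser for the format:

```python
def validate_llms_txt(text: str) -> list[str]:
    """Return a list of structural problems found in an llms.txt document."""
    problems = []
    lines = [line for line in text.splitlines() if line.strip()]
    if not lines or not lines[0].startswith("# "):
        problems.append("missing '# ' title on the first line")
    if not any(line.startswith("> ") for line in lines[:3]):
        problems.append("missing '>' summary line near the top")
    if not any(line.startswith("## ") for line in lines):
        problems.append("no '##' sections found")
    return problems

sample = """# Your Site Name
> A one-line summary of what your site does.

## Key Features
- Feature 1
"""
print(validate_llms_txt(sample))  # → []
```

An empty list means the basic skeleton is in place; anything else points at the missing piece.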
Check Your AI Visibility
After adding llms.txt, check your AI Visibility Score on SiteGlint to see if AI assistants are more likely to recommend your site.