LLMs.txt: Why It’s Not Useful Right Now
- Aizaz Ahsan
- Apr 19
- 1 min read
LLMs.txt is a proposed file meant to give AI bots a page's main content without the surrounding ads and navigation. It's written in Markdown and is often mistaken for a replacement for robots.txt, but the two do different jobs: robots.txt controls crawling, while LLMs.txt is only about presenting content to AI in a cleaner form.
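For context, the proposal puts the file at the site root and suggests a simple Markdown layout: an H1 title, a blockquote summary, and H2 sections listing links. A rough sketch, with placeholder names and URLs:

```markdown
# Example Project

> A short, plain-language summary of what this site offers.

## Docs

- [Getting started](https://example.com/docs/start.md): installation and setup
- [API reference](https://example.com/docs/api.md): endpoints and parameters

## Optional

- [Changelog](https://example.com/changelog.md): release history
```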
However, Google’s John Mueller recently compared LLMs.txt to the long-deprecated keywords meta tag, calling it ineffective. He explained that major AI providers, including Google, OpenAI, and Anthropic, neither use nor even check for LLMs.txt files. Even operators hosting thousands of domains report that no AI bots are fetching the file.
This raises a few problems. If AI bots still have to crawl the full website to verify accuracy, what's the point of a separate file? It can also be abused: a site could serve one version of its content to AI bots and a different one to human visitors, which amounts to cloaking and opens the door to spam.
In short, LLMs.txt is not a standard, is not used by any major platform, and adds no real value right now. Experts suggest focusing on structured data, robots.txt, and proper sitemaps instead.
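Unlike LLMs.txt, robots.txt is a long-established convention that crawlers actually honor. A minimal sketch of the advice above (the paths and sitemap URL are placeholders; GPTBot is OpenAI's crawler, but check each vendor's documentation for its current user-agent names):

```
# Block OpenAI's crawler from a private section
User-agent: GPTBot
Disallow: /private/

# Allow everyone else everywhere
User-agent: *
Allow: /

Sitemap: https://example.com/sitemap.xml
```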