What if tomorrow's SEO goes through llms.txt?

Today, it is becoming crucial to tell AIs precisely which content on our sites deserves their attention, thanks to two still little-known files: llms.txt and llms-full.txt.

For a few months now, I've been using them on my web projects. Their goal is simple: guide language models (LLMs) like those behind ChatGPT or Gemini toward the content I actually want to highlight.

Where robots.txt serves to exclude certain pages, llms.txt does exactly the opposite:

Here is what I want you to read, understand, and analyze.

It's still a marginal approach, but it could well change the game at a time when AIs generate answers from web content without necessarily pointing back to their sources.

How does llms.txt work?

llms.txt is a file placed at the site root that explicitly lists the pages and content to highlight for LLMs. You can also create an llms-full.txt containing complete or partial Markdown versions of that content, for more precision.
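
For reference, the proposed llms.txt specification (llmstxt.org) suggests a simple Markdown structure: an H1 title, an optional blockquote summary, then H2 sections containing link lists. A minimal sketch (all names and paths below are placeholders, not from a real site):

```markdown
# Site name

> One-sentence summary of what the site is about.

## Section name

- [Page title](/path/to/page): optional short description of the page
```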

Why has this approach become strategic?

With the arrival of Google's AI Mode (May 2025), search logic is changing. The question is no longer just about ranking in classic results, but about answers generated directly in the search interface.

The mechanism relies on a principle of intelligent synthesis. Google no longer just displays links: it produces summaries and cites excerpts, all drawn from content it deems relevant. This implies two things:

  1. A large part of potential traffic shifts to these AI answers.
  2. Content must be structured to be readable and usable by AIs.

In this context, llms.txt plays a key role. It serves to clearly declare what deserves to be read, understood, or even cited.

Concretely, how does it work?

I've set up two files at the root of my blog:

  • llms.txt contains a selected list of important pages: articles, case studies, projects.
  • llms-full.txt contains, for each listed page, a Markdown version of the content (partial or complete). This gives AIs more context and structure.

Here's an example of what I declare in my llms.txt:

llms.txt
```markdown
# Blog
- [Building a Composable Audio Player: My Journey in Web Audio Development](/posts/construire-un-lecteur-audio-composable)
- [What if tomorrow's SEO goes through llms.txt?](/posts/et-si-le-referencement-de-demain-passait-par-llms-txt)
- [UX Case Study: Analysis of the SociumJob Platform](/posts/etude-de-cas-ux-analyse-de-la-plateforme-socium-job)
```

This format allows AIs to analyze well-structured content, designed for synthesis.
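
The companion llms-full.txt then inlines the content itself. A hypothetical excerpt (the headings and text below are illustrative, not the real article body):

```markdown
# Building a Composable Audio Player: My Journey in Web Audio Development

## Context

A Markdown version (partial or complete) of the article body goes here,
so an LLM can read the content directly without crawling the HTML page.
```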

What are the concrete benefits of this approach?

  1. Increased visibility in AI answers: AIs like Gemini, Perplexity, and ChatGPT increasingly draw on external sources. By offering them a clear entry point, we increase the probability of appearing in a generated answer. It's a form of alternative SEO, oriented toward citation rather than ranking.
  2. Anticipating the new SEO: SEO is no longer limited to ranking well in classic SERPs. It extends to AI-generated results, where content clarity and structure are paramount. A well-designed llms.txt file acts as an editorial map dedicated to AI engines.
  3. Simple implementation, without technical dependency: No plugin, no API is needed. Just create two static files, accessible at the root of the site. This gives total control over what we want to expose to AIs.
  4. A structuring editorial exercise: to create a good llms.txt, you have to make the effort to select, prioritize, and document your content. That forces you to think about what you want to convey, about each piece's added value, and about its readability. An excellent habit.
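
The selection step above is easy to automate. A minimal sketch in Python, assuming posts are kept as (title, path) pairs; the `build_llms_txt` helper and the `posts` data are hypothetical, not part of any standard or tool:

```python
def build_llms_txt(site_name: str, posts: list[tuple[str, str]]) -> str:
    """Render a minimal llms.txt: one H1 section header,
    then one Markdown link per curated post."""
    lines = [f"# {site_name}"]
    for title, path in posts:
        lines.append(f"- [{title}]({path})")
    return "\n".join(lines) + "\n"

# Only the posts worth surfacing to LLMs go in this list.
posts = [
    ("Building a Composable Audio Player",
     "/posts/construire-un-lecteur-audio-composable"),
    ("UX Case Study: Analysis of the SociumJob Platform",
     "/posts/etude-de-cas-ux-analyse-de-la-plateforme-socium-job"),
]
print(build_llms_txt("Blog", posts))
```

The output would then be written to a static llms.txt file at the site root, for example as part of the build step of a static-site generator.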

Where is adoption today?

There is no official recognition of llms.txt by Google yet. John Mueller, Search Advocate at Google, recently described it as "an interesting experiment", comparable to the meta keywords tag in its early days.

But some tools are already taking an interest. Mintlify, FastHTML, and even Yoast are exploring uses for these files. Open-source initiatives such as LangChain even encourage adopting them on documentation-heavy or technical sites.

Conclusion

llms.txt guarantees nothing in the short term. But it is a good practice: explicit, simple, proactive. It helps guide how AIs read a site, structure content better, and prepare for the future of semantic SEO.

I'm continuing my experiments and documenting what I observe. If these files become standards in the coming months, we might as well master them now.

Lucien Loua – UX/UI Developer