March 28  SEONews

Meet LLMs.txt, a proposed standard for AI website content crawling – Search Engine Land

Australian technologist Jeremy Howard has put forward a new standards proposal for AI/LLMs, aimed at meeting the web content crawlability and indexability needs of large language models.

His proposed llms.txt works somewhat like the robots.txt and XML sitemap protocols: it makes an entire website easier to crawl and read, reducing the resource strain on LLMs that crawl and discover your site's content.
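Per Howard's proposal, llms.txt is a Markdown file served at a site's root: an H1 title, an optional blockquote summary, then H2 sections listing curated links. The file below is an illustrative sketch, not an example from the article:

```markdown
# Example Site

> One-sentence summary of the site, written for LLM consumption.

## Docs
- [Getting started](https://example.com/start.md): setup guide
- [API reference](https://example.com/api.md)

## Optional
- [Changelog](https://example.com/changelog.md)
```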

But it also offers an additional benefit, full content flattening, and this may be a good thing for brands and content creators.
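To make the consumption side concrete, here is a minimal sketch (not from the article) of how a tool might parse an llms.txt-style file into its sections and links, assuming the Markdown layout described above:

```python
import re

# Matches llms.txt link entries of the form "- [title](url)" with an
# optional ": description" suffix (ignored here).
LINK = re.compile(r"^- \[(?P<title>[^\]]+)\]\((?P<url>[^)]+)\)")

def parse_llms_txt(text):
    """Return {section_name: [(title, url), ...]} from an llms.txt body."""
    sections = {}
    current = None
    for line in text.splitlines():
        if line.startswith("## "):          # H2 opens a new link section
            current = line[3:].strip()
            sections[current] = []
        elif current is not None:
            m = LINK.match(line.strip())
            if m:
                sections[current].append((m.group("title"), m.group("url")))
    return sections

sample = """# Example Site

> A short summary of the site for LLM consumption.

## Docs
- [Getting started](https://example.com/start.md): setup guide

## Optional
- [Changelog](https://example.com/changelog.md)
"""

print(parse_llms_txt(sample))
```

The H1 title and blockquote summary are skipped here; a fuller parser might keep them as site-level metadata.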

While many content creators are interested in the proposal’s potential merits, it also has detractors.

But given the rapidly changing landscape for content produced in a world of artificial intelligence, llms.txt is certainly worth discussing.

The new proposed standard for AI accessibility to website content

Bluesky CEO Jay Graber propelled the discussion of content creator rights and data control, as it relates to being used for training…

Read Full Story: https://news.google.com/rss/articles/CBMib0FVX3lxTE0tSHNqM1VhQWZORFpWUkZqaEtOX0ZoXzhNRDA1NFBIZU9STFJCRV9EMTdWRFZ0bmI5MWJibi1ISHg1S3NaVmpfcERieUppbTZYdFZRZzE0TE5wUmRwWW5sZ25lSFRiSVI2NllNa251OA?oc=5

The post Meet LLMs.txt, a proposed standard for AI website content crawling – Search Engine Land first appeared on One SEO Company News.



source: https://news.oneseocompany.com/2025/03/28/meet-llmstxt-a-proposed-standard-for-ai-website-content-crawling-search-engine-land_2025032860567.html
