
LLMs.txt Curator — Free WordPress Plugin by SEO Strategy Ltd

LLMs.txt Curator is a free WordPress plugin that generates, curates and validates your llms.txt file — the emerging standard for telling AI systems which pages on your site are most important. Curation-first, quality-scored, and built by an SEO practitioner who needed it to exist.

9 min read · 1,836 words · Updated Apr 2026
LLMs.txt Curator — Official WordPress plugin by SEO Strategy Ltd

LLMs.txt Curator is a free WordPress plugin by Sean Mullins of SEO Strategy Ltd that generates, curates and validates your llms.txt file — the emerging standard for telling AI systems which pages on your site matter most. Unlike auto-generators that dump every URL into a flat list, LLMs.txt Curator takes a curation-first approach: you choose the pages, organise them into topical sections, write or approve descriptions, and see a quality score before the file goes live. The plugin is available at wordpress.org/plugins/llms-txt-curator/ and is free under the GPL v2 licence.

- 90%+ quality score target — the Coverage Report in every generated file shows what percentage of pages have descriptions; files below 50% produce noise, not signal (LLMs.txt Curator Coverage Report, SEO Strategy Ltd, 2026)
- 12 AI bots tracked in the Crawler Analytics tab — including GPTBot (OpenAI), ClaudeBot (Anthropic), PerplexityBot, Google-Extended, and Applebot-Extended (LLMs.txt Curator v1.4.6, SEO Strategy Ltd, 2026)
- 5-step description fallback chain — Schema markup, SEO plugin meta, WordPress excerpt, Open Graph description, first 160 characters of page content (LLMs.txt Curator v1.4.6, SEO Strategy Ltd, 2026)

In addition to llms.txt, the plugin also generates and maintains llms-full.txt, so AI systems (ChatGPT, Claude, Perplexity, Google AI Overviews, and others) can see both which pages on your site matter most and what they contain. This page explains why we built it, what it does, and how it fits into a broader LLM Optimisation strategy.

AI systems are reading your site differently now

Search has changed faster in the last two years than in the previous ten. ChatGPT, Claude, Perplexity, and Google AI Overviews do not crawl your site the way Googlebot does — they pull structured signals, look for authoritative sources, and build answers from a much smaller pool of content than you might expect. The llms.txt standard exists to give you a direct channel of communication with those systems. It is a plain-text Markdown file that sits at your site root and tells AI agents: here is who we are, here is what we do, and here are the pages that matter most. Think of it as a robots.txt for AI — but with actual substance. For the full technical guide, including implementation for WordPress, HubSpot, and other platforms, see the complete llms.txt guide.
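Per the format documented at llmstxt.org, the file is an H1 title, a blockquote summary, and H2 sections containing link lists with short descriptions. A minimal example — the site name, URLs, and descriptions below are purely illustrative:

```markdown
# Example Dental Software

> Example Dental Software builds practice-management tools for UK dental clinics.

## Products

- [Appointment Scheduler](https://example.com/scheduler): Cloud scheduling for multi-site practices, with NHS and private booking flows.

## Guides

- [Getting Started](https://example.com/guides/getting-started): Installation and first-week setup walkthrough.
```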

Auto-generation is not curation

Most llms.txt plugins take the same approach: scan every published post and page, dump them into a flat list, write the file. It takes thirty seconds and requires no thought. That is precisely the problem. An auto-generated file from a typical WordPress site will include your privacy policy, your cookie notice, your Terms and Conditions, your tag archives, your author pages, and dozens of thin or utility pages that have no business being in an AI system’s understanding of what your site is about.

It will group your content arbitrarily — or not at all. It will use no descriptions, or it will pull whatever the SEO plugin meta description happens to be, regardless of whether that description tells an AI anything useful. A poorly curated llms.txt does not just fail to help — it can actively mislead AI systems. Research into agentic AI retrieval systems confirms this quantitatively: auto-generated context files degrade agent performance by 0.5–3% while simultaneously inflating processing costs by up to 20%. The noise is not neutral. It is a measurable drag on both accuracy and efficiency. If your most important pages are buried under fifty utility URLs with no descriptions, the signal you are sending is noise, not authority.

Curation-first, quality-scored, always validated

LLMs.txt Curator is built around one principle: the file you publish should be something you made a decision about, not something that happened automatically. That means you choose which pages go in. You organise them into topical sections — which is how AI systems internally cluster knowledge. You write or approve descriptions. You see a quality score before you publish. The plugin validates for conflicts (noindex pages, canonical mismatches, deleted posts) and tells you in plain English what needs fixing before anything goes live.

Everything the plugin does

Drag-and-drop section builder

Your llms.txt is organised into ## sections — each one represents a topical cluster. AI systems parse these sections as distinct subjects, so grouping matters. The section builder lets you create named sections, search for and add any published page or post type, reorder everything by drag-and-drop, and build a structure that reflects how your site is actually organised. Five pre-built section templates (Business, E-commerce, SaaS, Blog, Local Business) give you a starting skeleton. A Rescan option auto-categorises your existing content by parent hierarchy and WordPress category, which you can then edit and refine.

Five-step description fallback chain

The description next to each URL tells an AI system what it will learn from a page — and it is what most auto-generators either skip or fill with a generic SEO meta description written for a search snippet, not for AI comprehension. LLMs.txt Curator pulls descriptions through a five-step priority chain: Schema markup → SEO plugin meta description (Rank Math, Yoast, AIOSEO, SEOPress, The SEO Framework) → WordPress excerpt → Open Graph description → first 160 characters of page content. You can override any description manually — overrides are never touched by the suggestion engine. Click Generate Missing Descriptions and the plugin fills all gaps automatically, showing which source it used for each entry so you can review and refine.
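The selection logic is simple to picture. Here is an illustrative Python sketch of a first-non-empty fallback chain — the plugin itself is PHP, and the dictionary keys and function name below are hypothetical, not the plugin's actual API:

```python
def pick_description(page: dict) -> tuple[str, str]:
    """Return (description, source) using the first non-empty candidate.

    Mirrors the five-step fallback order; keys are hypothetical.
    """
    chain = [
        ("schema", page.get("schema_description")),
        ("seo_meta", page.get("seo_meta_description")),
        ("excerpt", page.get("excerpt")),
        ("og", page.get("og_description")),
        # Last resort: first 160 characters of raw page content.
        ("content", (page.get("content") or "")[:160] or None),
    ]
    for source, text in chain:
        if text and text.strip():
            return text.strip(), source
    return "", "none"

page = {"schema_description": None, "excerpt": "A guide to llms.txt curation."}
desc, source = pick_description(page)
# desc == "A guide to llms.txt curation.", source == "excerpt"
```

Reporting the source alongside the text is what makes the review step practical: you can immediately see which entries were scraped from raw content and deserve a manual rewrite.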

Quality Score and Coverage Report

Every generated file includes a Coverage Report in the footer — a running summary showing how many pages have descriptions, the overall quality score as a percentage, and the pages still missing descriptions. This is appended to your llms.txt so you can see it at a glance. The target is a score above 90%. A file where fewer than half the pages have descriptions is producing noise, not signal.
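As a sketch, the quality score is the share of curated pages that carry a description, presented as a percentage. This is an assumption-laden Python illustration, not the plugin's code — the field names are hypothetical:

```python
def coverage_report(pages: list[dict]) -> dict:
    """Summarise description coverage for a curated page list.

    Assumes each page is a dict with 'url' and an optional 'description'.
    """
    missing = [p["url"] for p in pages if not p.get("description")]
    total = len(pages)
    described = total - len(missing)
    score = round(100 * described / total) if total else 0
    return {"total": total, "described": described,
            "score_pct": score, "missing": missing}

report = coverage_report([
    {"url": "/guides/llms-txt", "description": "Full implementation guide."},
    {"url": "/services/seo", "description": "LLM Optimisation services."},
    {"url": "/about"},  # no description yet
])
# report["score_pct"] == 67, report["missing"] == ["/about"]
```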

Per-page AI title overrides

The title that appears in your llms.txt does not have to match the title on your website. On-page titles are often written for click-through rate. The AI-facing title should answer a different question: what is this page about, precisely? The Tag icon next to each page opens a title override modal. Set a different title for AI consumption without touching anything on-site. Overridden titles display in italic blue in the section builder so you always know which pages have been tuned.

Validation engine and Safety Mode

Before anything goes live, the validator runs a full audit of your configuration. It flags errors (pages that no longer exist or have been unpublished), warnings (noindex conflicts, canonical mismatches, thin content, password-protected pages, files over 50KB, duplicate URLs), and informational notes. Safety Mode — on by default — blocks generation entirely if there are errors. Your settings are saved, but the file on disk is not updated until the issues are resolved. If a page you had curated is deleted or unpublished, the validator finds it and — in most cases — suggests a replacement based on a title match. A one-click fix swaps the old page for the suggested replacement across all sections simultaneously.
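The split between blocking errors and advisory warnings is the core design decision here. A minimal Python sketch of a subset of these checks — severities, field names, and thresholds below model the behaviour described above, not the plugin's real implementation:

```python
def validate(pages: list[dict], file_bytes: int, safety_mode: bool = True):
    """Illustrative subset of the validator's checks.

    Errors block generation when Safety Mode is on; warnings are advisory.
    """
    issues = []
    seen = set()
    for p in pages:
        if p.get("status") != "publish":
            issues.append(("error", f"{p['url']} is no longer published"))
        if p.get("noindex"):
            issues.append(("warning", f"{p['url']} is set to noindex"))
        if p["url"] in seen:
            issues.append(("warning", f"duplicate URL {p['url']}"))
        seen.add(p["url"])
    if file_bytes > 50 * 1024:  # the 50KB file-size warning
        issues.append(("warning", "generated file exceeds 50KB"))
    errors = [i for i in issues if i[0] == "error"]
    blocked = safety_mode and bool(errors)
    return issues, blocked

issues, blocked = validate(
    [{"url": "/old-page", "status": "trash"},
     {"url": "/guide", "status": "publish", "noindex": True}],
    file_bytes=12_000,
)
# blocked is True: Safety Mode refuses to write while errors remain
```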

Change detection

Every time the plugin regenerates your llms.txt, it takes a snapshot of the last-modified timestamp for every curated page. When you next visit the admin, it checks those timestamps against the current state of your content. If pages have been updated since the last generation, a yellow banner appears above the tab nav with a one-click Regenerate Now button. This means you are never unknowingly serving a stale file.
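The mechanism is a straightforward timestamp comparison, sketched here in Python under the assumption that the snapshot maps curated URLs to their last-modified times (the plugin stores WordPress post-modified timestamps; the function name is hypothetical):

```python
def stale_pages(snapshot: dict[str, float],
                current: dict[str, float]) -> list[str]:
    """Return curated URLs modified since the last generation.

    Only pages present in the snapshot are compared — pages not yet
    curated never trigger the banner.
    """
    return [url for url, ts in current.items()
            if url in snapshot and ts > snapshot[url]]

snapshot = {"/guide": 1_700_000_000.0, "/about": 1_700_000_000.0}
current = {"/guide": 1_700_050_000.0, "/about": 1_700_000_000.0}
# stale_pages(snapshot, current) == ["/guide"] -> show the Regenerate banner
```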

AI Crawler Analytics

The Crawler Analytics tab answers the most common question about llms.txt: are AI systems actually reading this? When crawler logging is enabled, the plugin monitors requests to your llms.txt and llms-full.txt files, identifying visits from 12 known AI bots including GPTBot (OpenAI), ClaudeBot (Anthropic), PerplexityBot, Google-Extended, Applebot-Extended, and Meta-ExternalAgent. All IP addresses are anonymised. No data leaves your server. The tab shows a 30-day stacked bar chart of daily visits by bot, and an All-Time Summary table with First Seen and Last Seen timestamps for each bot. You can export the full log as a CSV for client reporting.

Import and export as JSON

Your LLMs.txt Curator configuration — your sections, your pages, your descriptions, your title overrides — can be exported at any time as a JSON file. It can be imported back in, in full, on any WordPress site running the plugin. This enables curating with AI assistance (export your config to Claude or ChatGPT, ask it to evaluate your descriptions and suggest improvements, import the refined JSON back), moving configuration between sites, and backing up your editorial work before major content changes.
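What makes this portable is that the whole editorial state serialises losslessly. A Python sketch of the round-trip — the JSON keys below are a hypothetical shape for illustration; the plugin's actual export schema may differ:

```python
import json

# Hypothetical shape of an exported configuration.
config = {
    "site_name": "SEO Strategy",
    "sections": [
        {"title": "Guides",
         "pages": [{"url": "/llms-txt-guide",
                    "title_override": "Complete llms.txt implementation guide",
                    "description": "How to create and validate llms.txt."}]},
    ],
}

exported = json.dumps(config, indent=2)   # what you paste into an LLM
restored = json.loads(exported)           # what you import back in
assert restored == config  # lossless round-trip: portable between sites
```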

Prompt to try: export your current configuration, paste it into Claude or ChatGPT alongside your sitemap, and ask it to review descriptions for clarity and AI-usefulness, suggest any sections you are missing, and return the updated JSON in the same format. The result is a file that has had both your editorial judgment and an AI structural analysis applied to it.

WP-CLI, REST API, and Multisite

For developers and agencies managing multiple sites, WP-CLI commands are available: wp llms-txt regenerate, wp llms-txt status, wp llms-txt crawler-log, and wp llms-txt crawler-clear. REST API endpoints cover status, regeneration, crawler statistics, and page data — all requiring manage_options capability. The plugin is network-activation ready for WordPress Multisite: each site manages its own independent llms.txt with no cross-site bleed, and the Network Admin overview shows all sites with per-site regenerate buttons and a bulk Regenerate All action.

Why manual curation produces better outcomes

AI systems weight topical clusters. A flat list of fifty URLs gives an AI system no structural information about how your content relates to itself. Sections provide that structure. An AI reading a well-organised llms.txt can understand your site’s topic coverage far more efficiently than one reading a flat dump. Descriptions tell AI what a page is for, not what it ranks for. SEO meta descriptions are written for click-through rate — they often assume context the reader already has. An AI-facing description should explain what the page contains and what an AI agent will learn from it. What you exclude matters as much as what you include: a privacy policy in your llms.txt tells an AI that the page helps define what your site is about.

If your site has ten substantive pages and no utility pages, a simple auto-generator will do the job. But if your site has more than twenty pages — which most business sites do — the editorial choices become significant. LLMs.txt Curator includes an auto-rescan feature that builds your initial section structure from your WordPress content hierarchy. Use it to get your first draft fast — then edit it. The best output is a rescan followed by deliberate curation, not either one alone.

llms.txt in the context of AI-first SEO

LLMs.txt Curator is one tool in a broader discipline. At SEO Strategy we call this space LLM Optimisation — the practice of ensuring your brand, your expertise, and your content are correctly understood by AI systems at every layer. That broader discipline includes entity SEO (ensuring your brand exists as a coherent, verifiable entity in knowledge graphs and Wikidata), schema markup (giving AI systems structured signals about what each page is and who wrote it), E-E-A-T authorship systems, and Generative Engine Optimisation (GEO) — formatting and positioning content so that it is preferentially retrieved by RAG systems.

llms.txt sits at the discovery layer of this stack. It is the first thing a well-behaved AI agent looks for when it wants to understand a site efficiently. Getting it right is a foundation — not a finish line. For the full strategic picture, see our LLM Optimisation services and the complete llms.txt guide. For the broader AI visibility framework that determines whether you are cited at all, see the AI Discovery Stack and the CITATE framework.

Download LLMs.txt Curator from WordPress.org — free, GPL v2, requires WordPress 6.0+ and PHP 7.4+.

Key Definitions

llms.txt
A plain-text Markdown file placed at a site root that tells AI systems which pages are most important and what they contain. The emerging standard for AI-readable site summaries, proposed by Jeremy Howard and documented at llmstxt.org.
LLMs.txt Curator
A free WordPress plugin by Sean Mullins of SEO Strategy Ltd that takes a curation-first approach to llms.txt generation — letting site owners choose pages, organise sections, write descriptions, validate quality, and track AI crawler visits from 12 known bots.
Safety Mode
A feature in LLMs.txt Curator that blocks file generation if the validator finds errors — preventing a broken or contradictory llms.txt from going live until all issues are resolved.

How to Set Up LLMs.txt Curator

From installation to a quality-scored, validated llms.txt file in under 20 minutes.

  1. Install and activate

    Install LLMs.txt Curator from the WordPress plugin repository. On activation, the plugin creates default settings and runs an initial rescan of your published content. Navigate to Settings → LLMs.txt Curator.

  2. Set your site name and description

    The site name becomes the H1 entity name in your llms.txt file. Write the description as two sentences: who you are, what you do, and why you are authoritative on your subject. This is what AI agents read first.

  3. Build your sections

    Use a template preset as a starting point, or click Rescan to auto-categorise your existing content by WordPress hierarchy and category. Then edit: rename sections to reflect your actual topic clusters, remove pages that do not belong, and search for any pages the rescan missed.

  4. Fill descriptions

    Click Generate Missing Descriptions to auto-fill using the five-step fallback chain. Review what was generated — especially anything pulled from raw page content rather than a deliberate excerpt or meta description. Edit any descriptions that read like marketing copy rather than informational summaries.

  5. Run validation

    Click Validate. Review all errors (which block generation in Safety Mode) and warnings (advisory). Resolve blocking errors before proceeding — the validator will suggest replacement pages for any that have been deleted or unpublished.

  6. Generate and verify

    Click Generate. Your llms.txt and llms-full.txt files are written to your site root. Visit yoursite.com/llms.txt in a browser to confirm the file is live, readable, and showing the Coverage Report with your quality score.

  7. Use AI assistance to refine

    Export your configuration as JSON. Paste it into Claude, ChatGPT, or another LLM alongside your sitemap. Ask it to evaluate your descriptions, suggest sections you are missing, and identify pages to add or remove. Import the refined JSON back in to apply the improvements.

Frequently Asked Questions

What is llms.txt and why does it matter for AI visibility?

llms.txt is a plain-text Markdown file placed at your site root (e.g. yoursite.com/llms.txt) that tells AI systems which pages on your site are most important and what they contain. It is the emerging standard for AI-readable site summaries, proposed by Jeremy Howard and documented at llmstxt.org. AI agents — including GPTBot, ClaudeBot, and PerplexityBot — look for this file when they want to understand a site efficiently. A well-curated llms.txt helps ensure your most important content is understood in context, rather than being retrieved piecemeal or missed entirely. For the full implementation guide, see the complete llms.txt guide on SEO Strategy.

What is the difference between curation and auto-generation?

Auto-generation scans your published content and dumps everything into a flat list — including privacy policies, cookie notices, tag archives, author pages, and utility pages. Curation means choosing which pages go in, organising them into topical sections, writing descriptions that tell AI systems what each page contains, and seeing a quality score before the file goes live. A curated file sends a cleaner signal about what your site is actually about. LLMs.txt Curator supports both approaches — you can use the Rescan feature to get an auto-categorised first draft, then edit and refine it.

How does the quality score work?

Every generated llms.txt file includes a Coverage Report appended to the footer. This shows the total number of pages included, how many have descriptions, the percentage with descriptions as a quality score, and the list of pages still missing descriptions. The target is a quality score above 90%. A file where fewer than half the pages have descriptions is providing minimal structured information to AI systems — the URLs are present but the context that makes them useful is absent. The quality score is visible in the plugin admin and in the generated file itself.

Which SEO plugins does LLMs.txt Curator integrate with?

LLMs.txt Curator integrates with five major WordPress SEO plugins for the description fallback chain: Rank Math, Yoast SEO, AIOSEO, SEOPress, and The SEO Framework. When you click Generate Missing Descriptions, the plugin checks for a schema article description first (Rank Math schema data or custom _schema_json), then the SEO plugin meta description, then the WordPress excerpt, then the Open Graph description, then the first 160 characters of page content. The source used for each description is shown in the admin so you can review and override anything that does not read well for AI comprehension.

Does the plugin support WordPress Multisite?

Yes. LLMs.txt Curator is network-activation ready for WordPress Multisite. Each site in a network manages its own independent llms.txt file — there is no shared file and no cross-site bleed. Subdirectory sub-sites write a physical file to the site root. Subdomain and domain-mapped sub-sites serve the file via WordPress rewrite rule from the database, preventing filesystem collisions. The Network Admin overview shows all sites with per-site regenerate buttons and a bulk Regenerate All action.

Sean Mullins

Founder of SEO Strategy Ltd with 20+ years in SEO, web development and digital marketing. Specialising in healthcare IT, legal services and SaaS — from technical audits to AI-assisted development.

Ready to improve your search visibility?

Book a free 30-minute consultation and let's discuss your SEO strategy.

Get in Touch