
Generate robots.txt Files

Generate a robots.txt file with custom rules for search engine crawlers and bots. Everything runs client-side, so nothing you type is sent to a server.

1. Set rules — configure allow and disallow rules for each user-agent.
2. Add sitemap — optionally specify your sitemap URL.
3. Copy or download — copy the robots.txt content or download the file.
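Following those three steps, the generated file might look like this (the paths, crawl delay, and sitemap URL are illustrative placeholders, not defaults of the tool):

```
User-agent: *
Disallow: /admin/
Allow: /admin/public/
Crawl-delay: 10

Sitemap: https://example.com/sitemap.xml
```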

robots.txt Builder

Crawl-delay (seconds)

Generated robots.txt

User-agent: *
Generated · valid · 1 line
  • Built-in validator runs every keystroke. Errors block syntactically broken output; warnings flag conflicting Allow/Disallow on the same path within a UA group.
  • Use the Block AI bots preset to add a User-agent block for GPTBot / ClaudeBot / CCBot / Google-Extended / anthropic-ai / ChatGPT-User in one click.
  • Sitemap URLs must be absolute (https://example.com/sitemap.xml).
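For reference, the Block AI bots preset described above would emit a single group along these lines (exact ordering and formatting may differ in the tool's output):

```
User-agent: GPTBot
User-agent: ClaudeBot
User-agent: CCBot
User-agent: Google-Extended
User-agent: anthropic-ai
User-agent: ChatGPT-User
Disallow: /
```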

Frequently Asked Questions

What is robots.txt?

A text file that tells search engine crawlers which pages or sections of your site to crawl or skip.
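To see how a crawler actually interprets these directives, you can evaluate a file with Python's standard-library `urllib.robotparser`. The rules below are a made-up example, not output of this tool:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules: block /admin/ for everyone, block GPTBot entirely.
rules = """\
User-agent: *
Disallow: /admin/

User-agent: GPTBot
Disallow: /
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# GPTBot matches its own group, so it is blocked everywhere.
print(rp.can_fetch("GPTBot", "https://example.com/blog/post"))        # False
# Other bots fall back to the * group: /admin/ is blocked, the rest allowed.
print(rp.can_fetch("Googlebot", "https://example.com/admin/settings"))  # False
print(rp.can_fetch("Googlebot", "https://example.com/blog/post"))       # True
```

Note that a specific `User-agent` group fully replaces the `*` group for that bot; the rules are not merged.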

Does it validate the syntax?

Yes. A validator runs on every keystroke: errors block syntactically broken output, and warnings flag conflicting Allow/Disallow rules. The generated file follows the Robots Exclusion Protocol (RFC 9309).

Is it safe to paste production data?

Yes. The robots.txt is generated client-side. Internal paths, staging hostnames, and any directives you do not want logged stay on your machine.
