Valid
- User-agent: *
- Disallow: /private
- Allow: /public
- Sitemap: https://example.com/sitemap.xml
web validator
Validate robots.txt syntax, user-agents, and allow/disallow rules locally.
Requires at least one User-agent directive.
Counts Allow and Disallow directives per User-agent block and keeps them grouped by agent.
Parses Sitemap lines to confirm absolute URLs.
Runs fully client-side; no fetches or uploads. A minimal sketch of these checks follows below.
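Under the hood, these checks amount to a simple line parser. The sketch below, in TypeScript, assumes a line-oriented "Field: value" grammar; validateRobotsTxt and its helper types are illustrative names, not the tool's actual implementation.

```typescript
// Minimal robots.txt syntax check (sketch). Names are illustrative.

interface AgentGroup {
  agents: string[];    // User-agent lines that open the group
  allow: string[];     // Allow paths in the group
  disallow: string[];  // Disallow paths in the group
}

interface Report {
  groups: AgentGroup[];
  sitemaps: string[];
  issues: string[];
}

function validateRobotsTxt(text: string): Report {
  const groups: AgentGroup[] = [];
  const sitemaps: string[] = [];
  const issues: string[] = [];
  let current: AgentGroup | null = null;

  for (const rawLine of text.split(/\r?\n/)) {
    const line = rawLine.replace(/#.*$/, "").trim(); // strip comments
    if (!line) continue;

    const sep = line.indexOf(":");
    if (sep === -1) {
      issues.push(`Missing ":" separator: "${rawLine.trim()}"`);
      continue;
    }
    const field = line.slice(0, sep).trim().toLowerCase();
    const value = line.slice(sep + 1).trim();

    switch (field) {
      case "user-agent":
        // Consecutive User-agent lines share one group; a rule line closes the run.
        if (!current || current.allow.length || current.disallow.length) {
          current = { agents: [], allow: [], disallow: [] };
          groups.push(current);
        }
        current.agents.push(value);
        break;
      case "allow":
      case "disallow":
        if (!current) {
          issues.push(`"${field}" appears before any User-agent line`);
          break;
        }
        (field === "allow" ? current.allow : current.disallow).push(value);
        break;
      case "sitemap":
        // Sitemap URLs must be absolute; URL() throws on relative input.
        try {
          new URL(value);
          sitemaps.push(value);
        } catch {
          issues.push(`Sitemap URL is not absolute: "${value}"`);
        }
        break;
      default:
        // Other fields (e.g. Crawl-delay) are out of scope for this sketch.
        issues.push(`Unrecognized field "${field}"`);
    }
  }

  if (groups.length === 0) {
    issues.push("At least one User-agent directive is required");
  }
  return { groups, sitemaps, issues };
}
```

Because consecutive User-agent lines are collected into one group and any rule line closes that run, the Allow/Disallow counts stay grouped per agent block.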
robots.txt content
Paste the full robots.txt file. Keep User-agent blocks clearly separated and include absolute Sitemap URLs where applicable.
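For illustration, a well-structured file with two clearly separated User-agent blocks and an absolute Sitemap URL might look like this (example.com is a placeholder):

```
User-agent: *
Disallow: /private
Allow: /public

User-agent: Googlebot
Disallow: /drafts

Sitemap: https://example.com/sitemap.xml
```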
This robots.txt validator checks user-agent blocks, allow/disallow directives, and sitemap lines entirely in your browser.
Use it to QA CMS/CDN-generated robots.txt files before shipping so crawlers receive clean directives.
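As a hypothetical example of the kind of issue this catches, a generated file like the one below would be flagged twice: the first Disallow appears before any User-agent block, and the Sitemap URL is relative rather than absolute.

```
Disallow: /preview
User-agent: *
Disallow: /private
Sitemap: /sitemap.xml
```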
All validation happens in your browser. No data is sent, logged, or stored.
Syntax-focused; it does not simulate full crawler behavior or rule precedence.