robots.txt web validator

Example of a robots.txt this validator reports as Valid:
- User-agent: *
- Allow: /
- Sitemap: https://example.com/sitemap.xml
Validate robots.txt with directive-by-directive checks and clear, line-level issue reporting.
- Requires at least one User-agent directive.
- Flags directives that appear before any User-agent is defined.
- Validates that Sitemap is an absolute URL and that Crawl-delay is a number.
- Flags unknown directives and missing values.
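Conceptually, these checks amount to a single pass over the file with one piece of state: whether a User-agent line has been seen yet. Below is a minimal TypeScript sketch of that logic; the names (validateRobots, Issue, KNOWN_DIRECTIVES), the directive list, and the exact rule set are illustrative assumptions, not this tool's actual implementation.

```typescript
interface Issue {
  line: number; // 1-based line number in the pasted input
  message: string;
}

// Directive names this sketch recognizes (an assumption; the tool's
// list may differ). robots.txt directive names are case-insensitive.
const KNOWN_DIRECTIVES = new Set([
  "user-agent", "allow", "disallow", "crawl-delay", "sitemap",
]);

function validateRobots(text: string): Issue[] {
  const issues: Issue[] = [];
  let sawUserAgent = false;

  text.split(/\r?\n/).forEach((raw, i) => {
    const line = raw.replace(/#.*$/, "").trim(); // drop comments
    if (line === "") return;

    const colon = line.indexOf(":");
    if (colon === -1) {
      issues.push({ line: i + 1, message: "Missing ':' separator" });
      return;
    }
    const directive = line.slice(0, colon).trim().toLowerCase();
    const value = line.slice(colon + 1).trim();

    if (!KNOWN_DIRECTIVES.has(directive)) {
      issues.push({ line: i + 1, message: `Unknown directive "${directive}"` });
      return;
    }
    // An empty Disallow is conventionally valid (disallows nothing);
    // every other directive needs a value.
    if (value === "" && directive !== "disallow") {
      issues.push({ line: i + 1, message: `Missing value for "${directive}"` });
      return;
    }

    if (directive === "user-agent") {
      sawUserAgent = true;
    } else if (directive !== "sitemap" && !sawUserAgent) {
      // Sitemap is a standalone directive; group directives must
      // follow a User-agent line. (The real tool may be stricter.)
      issues.push({ line: i + 1, message: "Directive before any User-agent" });
    }

    if (directive === "crawl-delay" && Number.isNaN(Number(value))) {
      issues.push({ line: i + 1, message: "Crawl-delay must be a number" });
    }
    if (directive === "sitemap" && !/^https?:\/\//i.test(value)) {
      issues.push({ line: i + 1, message: "Sitemap must be an absolute URL" });
    }
  });

  if (!sawUserAgent) {
    issues.push({ line: 1, message: "No User-agent directive found" });
  }
  return issues;
}
```

Run against the valid example above, validateRobots returns an empty array; each violation produces one issue carrying the offending line number.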
robots.txt
Paste the full robots.txt file. Keep directive blocks grouped by User-agent and use absolute Sitemap URLs.
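For contrast, a hypothetical input the validator would flag on three lines:
- Allow: /private (directive appears before any User-agent)
- User-agent: *
- Crawl-delay: fast (Crawl-delay must be a number)
- Sitemap: /sitemap.xml (Sitemap must be an absolute URL)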
This strict robots.txt validator enforces User-agent requirements, Sitemap URL validity, and directive ordering, entirely in your browser.
Use it to harden robots.txt files generated by a CMS or CDN before they are shipped to crawlers.
All validation happens in your browser. No data is sent, logged, or stored.
Syntax-level validation only; does not simulate crawlers.