Robots.txt Tester
Paste your robots.txt file and test whether specific URLs are allowed or blocked for any crawler. The tool validates syntax against RFC 9309, highlights errors, and extracts sitemap references.
How to Use
Paste your robots.txt content into the input field, enter a URL path to test, and select a User-agent. The tool instantly shows whether the URL is allowed or blocked, which rule matched, and any syntax issues in your robots.txt file.
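For instance, you might paste a file like the following (a made-up example) and test the path /private/press/kit.html as Googlebot; the tool would report it as allowed, because the longer, more specific Allow rule takes precedence over the shorter Disallow rule.

```text
User-agent: Googlebot
Disallow: /private/
Allow: /private/press/

User-agent: *
Disallow: /tmp/
Crawl-delay: 10

Sitemap: https://example.com/sitemap.xml
```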
Features
- RFC 9309 Compliant Parsing – Spec-compliant robots.txt parser with proper directive precedence
- URL Pattern Matching – Full wildcard support with * and $ patterns
- User-Agent Selection – Test against Googlebot, Bingbot, GPTBot, ClaudeBot, and more
- Syntax Validation – Highlights malformed directives, unknown fields, and common mistakes
- Matching Rule Display – Shows exactly which Allow/Disallow rule matched and why
- Sitemap Extraction – Lists all Sitemap directives found in the file
- Crawl-Delay Detection – Identifies crawl-delay settings per User-agent
- Example Robots.txt – Pre-loaded examples for quick testing
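To illustrate the precedence rules the parser applies, here is a minimal sketch of RFC 9309 matching in Python. The function names and rule representation are our own for illustration; the key behaviors shown are that `*` matches any character sequence, `$` anchors a pattern to the end of the path, the longest matching pattern wins, and Allow beats Disallow on a tie.

```python
import re

def rule_to_regex(path: str) -> re.Pattern:
    """Translate a robots.txt path pattern ('*' wildcard, '$' end anchor)
    into an anchored regular expression."""
    anchored = path.endswith("$")
    if anchored:
        path = path[:-1]
    # Escape regex metacharacters, then restore '*' as '.*'.
    regex = re.escape(path).replace(r"\*", ".*")
    return re.compile("^" + regex + ("$" if anchored else ""))

def is_allowed(rules, url_path: str) -> bool:
    """rules: list of (directive, path) pairs for one User-agent group,
    e.g. ("Disallow", "/private/"). Per RFC 9309, the longest matching
    pattern wins; if an Allow and a Disallow tie, Allow wins. A URL that
    matches no rule is allowed by default."""
    best = None  # (pattern length, is_allow)
    for directive, path in rules:
        if not path:  # "Disallow:" with an empty value matches nothing
            continue
        if rule_to_regex(path).match(url_path):
            candidate = (len(path), directive.lower() == "allow")
            if best is None or candidate > best:
                best = candidate
    return True if best is None else best[1]
```

For example, with the rules `[("Disallow", "/private/"), ("Allow", "/private/press/")]`, the path `/private/press/kit.html` matches both patterns, but the Allow pattern is longer, so the URL is allowed.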