Robots.txt Generator
Create and validate robots.txt files for your website. Control how search engine crawlers access your content.
Crawler Rules
Define rules for different user agents
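For example, a rule set that blocks every crawler from a private directory while explicitly allowing Googlebot to crawl everything might look like this (the /private/ path is illustrative):

User-agent: *
Disallow: /private/

User-agent: Googlebot
Allow: /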
Additional Settings
Optional directives such as Sitemap and Crawl-delay
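For instance, a Sitemap directive points crawlers to your XML sitemap, and Crawl-delay asks crawlers to wait between requests (Bingbot honors it; Googlebot ignores it). The URL and delay value below are placeholders:

Sitemap: https://example.com/sitemap.xml
Crawl-delay: 10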
URL Tester
Test whether a URL is allowed or blocked by your rules
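A minimal sketch of how such a tester can work, using Python's standard urllib.robotparser module. The rules and URLs here are examples, not this tool's implementation, and note that urllib.robotparser only does prefix matching; it does not implement the * and $ wildcard extensions:

from urllib.robotparser import RobotFileParser

# Example rules; in practice these come from the generated file.
# The Allow line is listed first because urllib.robotparser
# applies the first matching rule, not the most specific one.
rules = """\
User-agent: *
Allow: /private/public-page.html
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# can_fetch(user_agent, url) returns True if the URL is allowed.
print(parser.can_fetch("*", "https://example.com/private/secret.html"))       # False
print(parser.can_fetch("*", "https://example.com/private/public-page.html"))  # True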
Preview
User-agent: *
Allow: /
Quick Reference
User-agent: *
Applies to all crawlers
Disallow: /path/
Block access to a directory
Allow: /path/
Explicitly allow a path (a more specific Allow overrides a broader Disallow)
Disallow: /*.pdf$
Block all PDF files
Disallow: /*?*
Block URLs with query strings
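Combining the patterns above, a file that blocks an admin directory, all PDFs, and query-string URLs for every crawler could look like this (the /admin/ path is illustrative). Note that the * wildcard and the $ end-of-URL anchor are extensions honored by major crawlers such as Googlebot and Bingbot, not part of the original robots.txt standard:

User-agent: *
Disallow: /admin/
Disallow: /*.pdf$
Disallow: /*?*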
File Location
Place your robots.txt file in your website's root directory; crawlers only look for it there, and a robots.txt in a subdirectory is ignored:
https://example.com/robots.txt
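Once deployed, you can verify that the file is served correctly and test URLs against the live copy. A minimal sketch using Python's urllib.robotparser, with example.com standing in for your domain:

from urllib.robotparser import RobotFileParser

parser = RobotFileParser()
parser.set_url("https://example.com/robots.txt")
parser.read()  # fetches and parses the live robots.txt

# Test any crawler/URL pair against the live rules.
print(parser.can_fetch("Googlebot", "https://example.com/some-page.html"))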