Robots.txt Generator
Create a robots.txt file to guide search engine crawlers.
Each rule in the generator specifies disallow and allow paths for a set of user agents; with no disallow rules all paths are allowed, and no allow rules is the default behavior. An optional sitemap URL can be added, and the generated robots.txt appears in the output panel.
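As a point of reference, a minimal permissive robots.txt for that default state (all crawlers, nothing disallowed) typically looks like this:

    # Applies to all crawlers; an empty Disallow value allows every path
    User-agent: *
    Disallow: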
How to use the Robots.txt Generator
1. Configure user agents: set which search engine crawlers the rules apply to (use '*' for all crawlers).
2. Add allow/disallow rules: specify which paths crawlers can or cannot access. Use '/' to allow or disallow everything.
3. Add sitemap URL: optionally include your sitemap URL to help search engines discover your pages (a complete example follows these steps).
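Putting the three steps together, a generated file for a hypothetical site might look like the following (the /admin/ paths and the example.com sitemap URL are placeholders, not values produced by the tool):

    # Rule for all crawlers
    User-agent: *
    Disallow: /admin/
    Allow: /admin/public/

    # Sitemap location (placeholder URL)
    Sitemap: https://example.com/sitemap.xml

Because the Allow path is more specific (longer) than the Disallow path, crawlers that follow the standard precedence rules may crawl /admin/public/ while the rest of /admin/ stays blocked.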