Robots.txt Generator
Free robots.txt generator. Create a robots.txt file with allow/disallow rules, crawl-delay, and sitemap URL — then copy or download it.
A robots.txt file tells search engine crawlers which parts of your site they can and can’t crawl. Use it to block private areas, staging paths, or faceted pages that shouldn’t be indexed.
How to use the robots.txt generator
- Pick a mode (allow all, block all, or custom rules).
- Add allow/disallow paths if needed.
- Optionally add a sitemap URL.
- Copy or download the robots.txt file.
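For example, a custom-rules configuration with one disallowed path, a crawl-delay, and a sitemap URL (all values here are placeholders) might produce a file like:

```
User-agent: *
Disallow: /admin/
Crawl-delay: 10
Sitemap: https://yoursite.com/sitemap.xml
```

Note that Crawl-delay is honored by some crawlers (such as Bing) but ignored by Google, which manages crawl rate through Search Console instead.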
Notes
Robots.txt doesn’t remove indexed pages: it controls crawling, not indexing.
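To keep a page out of search results, the usual approach is a noindex directive on the page itself rather than a robots.txt rule, for example:

```html
<!-- Placed in the page's <head>: tells crawlers not to index this page -->
<meta name="robots" content="noindex">
```

The page must remain crawlable for this to work — if robots.txt blocks the URL, crawlers never see the noindex tag.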
Frequently Asked Questions
What is robots.txt?
A plain-text file at the root of your site that tells search engine crawlers which URLs they may request. Crawlers read it before fetching other pages and follow its allow/disallow rules.
Where do I put the robots.txt file?
At the root of your domain: yoursite.com/robots.txt. This is the only location crawlers check. Placing it elsewhere or using a different filename means search engines won't find it.
Should I block crawlers from my entire site?
Usually not. Blocking the whole site stops search engines from crawling any of your content. Reserve a full block for staging or development environments that shouldn't show up in search.
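A full-site block — the tool's "block all" mode — disallows every path for every crawler:

```
User-agent: *
Disallow: /
```

This is typically reserved for staging or development environments that shouldn't appear in search results.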
What's the difference between Allow and Disallow?
Disallow blocks crawlers from a path (e.g., Disallow: /admin/). Allow explicitly permits access to paths within a disallowed area (e.g., allowing /admin/public/ while blocking /admin/private/). Allow is optional and only needed for exceptions.
Should I include my sitemap in robots.txt?
Yes, it's a good idea. A line like Sitemap: https://yoursite.com/sitemap.xml helps search engines discover your sitemap automatically. While you can also submit sitemaps via Google Search Console and Bing Webmaster Tools, including it in robots.txt provides a fallback.
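Combining these directives, a file that blocks an admin area, carves out one public subfolder, and declares a sitemap (paths here are illustrative) could look like:

```
User-agent: *
Disallow: /admin/
Allow: /admin/public/
Sitemap: https://yoursite.com/sitemap.xml
```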