About Robots.txt Generator
Robots.txt Generator creates a properly formatted robots.txt file for your website through a visual interface — specify which crawlers to allow or block, set crawl delay, and define which directories to exclude from indexing. Download the finished file and upload it to your website's root.
How to Use Robots.txt Generator
1. Configure crawl permissions
Select which user-agents (Googlebot, Bingbot, or all bots) to write rules for, and set their allowed and disallowed paths.
2. Add your sitemap URL
Enter your XML sitemap URL to include a Sitemap directive, which helps search engines discover your pages.
3. Download and upload the file
Download or copy the generated robots.txt content and upload it to the root of your website, so it is reachable at yoursite.com/robots.txt.
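Following the steps above produces a file like the one below; the paths, sitemap URL, and crawl delay are illustrative placeholders:

```txt
# Block all crawlers from the admin area
User-agent: *
Disallow: /admin/

# Ask Bingbot to wait 10 seconds between requests
# (note: Googlebot ignores Crawl-delay)
User-agent: Bingbot
Crawl-delay: 10

Sitemap: https://yoursite.com/sitemap.xml
```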
Common Use Cases
- Blocking search engines from crawling admin, login, or staging pages
- Specifying a sitemap URL for faster search engine discovery
- Setting crawl delay rules to prevent overly aggressive bot crawling
- Allowing all bots to crawl all content on a newly launched site
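Before uploading, you can sanity-check a generated file against URLs you care about. A minimal sketch using Python's standard-library robot parser; the rules and URLs are illustrative:

```python
from urllib.robotparser import RobotFileParser

# A generated robots.txt, supplied as lines of text
# (RobotFileParser.parse accepts an iterable of lines).
rules = """
User-agent: *
Disallow: /admin/
Disallow: /login/
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# Check what a generic crawler may fetch.
print(parser.can_fetch("*", "https://yoursite.com/admin/settings"))  # False
print(parser.can_fetch("*", "https://yoursite.com/blog/post"))       # True
```

This catches typos like a missing leading slash in a Disallow path before the file goes live.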
Frequently Asked Questions
What is a robots.txt file?
A robots.txt file is a plain-text file placed at the root of a website that tells crawlers, via the Robots Exclusion Protocol, which paths they may or may not request.
Does robots.txt prevent pages from appearing in search results?
No. robots.txt controls crawling, not indexing. A blocked page can still appear in search results if other sites link to it. To keep a page out of results, use a noindex meta tag or X-Robots-Tag header, and leave the page crawlable so search engines can see that directive.
Should I block any directories by default?
There is no universal default. Commonly blocked paths include admin areas, login pages, internal search results, and staging environments. Avoid blocking CSS and JavaScript files that search engines need to render your pages.
Related Tools
.htaccess Generator
Generate Apache .htaccess rules for redirects, security, and caching.
Meta Tag Generator
Generate HTML meta tags for SEO, social sharing, and more.
Google SERP Preview
Preview how your page will look in Google search results.
Password Generator
Generate strong, secure random passwords with customizable options.
UUID Generator
Generate universally unique identifiers (UUID v4) instantly.
QR Code Generator
Generate QR codes from any text or URL for easy sharing.