Robots.txt Generator
Generate a valid robots.txt file with crawler-specific rules, common templates, and real-time syntax preview.
Quick Templates
Crawler Rules
Warnings
- No Sitemap directive: add your sitemap URL for faster indexing.
robots.txt Preview
```
User-agent: *
Allow: /
```
File Information: 1 rule · 2 lines · 22 B
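The warning above is easy to resolve: append a Sitemap line to the generated file. A minimal sketch, assuming the sitemap sits at the domain root (replace example.com with your own domain):

```
User-agent: *
Allow: /

Sitemap: https://example.com/sitemap.xml
```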
Understanding Robots.txt
The robots.txt file is the first thing search engine crawlers look for when they visit your website. It tells them which areas of your site they can access and which they should avoid.
In 2026, robots.txt has become even more critical with the rise of AI crawlers like GPTBot (OpenAI), Google-Extended (Gemini), and CCBot (Common Crawl). Our generator includes templates to block these bots while keeping your site accessible to search engines.
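As a sketch of what such a template produces (bot names taken from the paragraph above; adjust the list to taste), a file like this blocks AI training crawlers while leaving search crawlers unrestricted:

```
# Block AI training crawlers
User-agent: GPTBot
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: CCBot
Disallow: /

# All other crawlers (including Googlebot and Bingbot) keep full access
User-agent: *
Allow: /
```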
Features That Set Us Apart
Per-Bot Configuration
Create separate rules for Googlebot, GPTBot, Bingbot, and more. Most tools only offer a single rule set.
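For example, a per-bot configuration might let Googlebot crawl everything except search results, throttle Bingbot, and shut out GPTBot entirely (the paths here are illustrative):

```
User-agent: Googlebot
Disallow: /search/

User-agent: Bingbot
Crawl-delay: 5

User-agent: GPTBot
Disallow: /
```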
6 Ready Templates
One-click templates for WordPress, E-Commerce, SPA, Block AI, and more — no manual typing needed.
Smart Warnings
Real-time validation catches mistakes like blocking CSS/JS files or missing sitemap directives before you deploy.
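A common instance of the CSS/JS pitfall: WordPress sites that disallow /wp-includes/ also block the scripts and styles Google needs to render pages. A sketch of the kind of pattern such validation flags, along with the usual fix:

```
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/    # flagged: blocks theme JS/CSS needed for rendering
Allow: /wp-includes/js/    # fix: re-allow the asset directories
Allow: /wp-includes/css/
```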
Robots.txt Syntax Guide
| Directive | Example | Meaning |
|---|---|---|
| User-agent | User-agent: * | Applies to all crawlers |
| Disallow | Disallow: /admin/ | Block access to /admin/ |
| Allow | Allow: /public/ | Explicitly allow access |
| Crawl-delay | Crawl-delay: 10 | Wait 10 s between requests (not supported by Googlebot) |
| Sitemap | Sitemap: https://example.com/sitemap.xml | Points to your XML sitemap (full URL required) |
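Putting the table together, a complete file might read as follows (domain and paths are placeholders):

```
User-agent: *
Disallow: /admin/
Allow: /public/
Crawl-delay: 10

Sitemap: https://example.com/sitemap.xml
```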