Your robots.txt file tells search engine bots which parts of your website to crawl - and which to leave alone. A misconfigured file can accidentally block your entire site from Google. Our Robots.txt Generator lets you build the right file visually, without touching a single line of code. Choose a preset, customise your rules, and download a ready-to-upload file in seconds.
Robots.txt Generator
Build a complete, valid robots.txt file without touching any code. Choose a preset, customise your rules, then copy or download the finished file.
Quick Start Preset
Global Settings
Most WordPress sites use /sitemap.xml or /sitemap_index.xml
Tells Yandex the preferred version of your domain (www vs non-www) via the Host directive. Yandex has since deprecated Host, so treat this setting as optional.
User-Agent Rules
Each block applies to one bot (or all bots with *). Crawlers obey the most specific matching User-agent block, so a block for a named bot overrides the * block for that bot.
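For instance, a file with one block for all bots and one for a specific bot might look like this (the paths and the choice of Bingbot are illustrative):

```txt
# First block - applies to every crawler
User-agent: *
Disallow: /private/

# Second block - applies only to Bingbot, overriding the * block for it
User-agent: Bingbot
Disallow: /private/
Crawl-delay: 5

Sitemap: https://yourdomain.com/sitemap.xml
```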
robots.txt Preview
Configure your rules and click Generate robots.txt to see the output.
How to Deploy
- Generate your robots.txt using the builder on the left.
- Click Download robots.txt to save the file.
- Upload the file to the root directory of your website via FTP/SFTP - the same folder that contains your index.php or index.html.
- Verify it is live by visiting https://yourdomain.com/robots.txt in your browser.
- Submit the URL to Google Search Console under Settings → robots.txt so Google picks up the new rules promptly.
⚠️ Replace any existing robots.txt file carefully - a misconfigured file can block your entire site from search engines.
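Before replacing a live file, you can sanity-check a draft locally. Python's standard-library urllib.robotparser evaluates robots.txt rules, so a short sketch like the one below (using hypothetical WordPress-style rules) confirms that public paths stay crawlable while admin paths are blocked. One caveat: Python's parser applies rules within a block in source order, whereas Google uses longest-path matching, so list more specific Allow lines first.

```python
from urllib.robotparser import RobotFileParser

# Draft robots.txt content - hypothetical WordPress-style rules
ROBOTS_TXT = """\
User-agent: *
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-admin/
Disallow: /wp-includes/
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Public content should remain crawlable
print(parser.can_fetch("*", "/blog/my-post/"))            # True
# Admin paths should be blocked
print(parser.can_fetch("*", "/wp-admin/options.php"))     # False
# The Allow rule carves out admin-ajax.php (listed first, so it wins here)
print(parser.can_fetch("*", "/wp-admin/admin-ajax.php"))  # True
```

Running this against your draft before uploading catches the classic mistake of accidentally disallowing your whole site.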
Why Choose Our Robots.txt Generator?
- Five quick-start presets: Allow All, Block All, WordPress Standard, Block AI Crawlers, and Custom.
- Full support for all major bots - Google, Bing, Yahoo, Yandex, Baidu, and more.
- Dedicated Block AI Crawlers preset covering GPTBot, anthropic-ai, Claude-Web, Google-Extended, PerplexityBot, Bytespider, and CCBot.
- Add unlimited User-agent blocks, each with its own Disallow, Allow, and Crawl-delay settings.
- Live syntax-highlighted preview updates as you make changes.
- One-click Download - produces a correctly named robots.txt file ready to upload.
- Completely free. No account or login needed.
Our robots.txt generator is perfect for:
- WordPress site owners wanting to block admin and login pages from crawlers.
- Website owners who want to prevent AI companies from training on their content.
- SEO professionals building robots.txt files for client websites quickly and accurately.
- Developers who want a visual starting point before fine-tuning a file manually.
- Anyone moving a site from development to production who needs to remove a blanket block.
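On that last point, a site in development typically ships with a blanket block that must be swapped out at launch. The two states look like this (a sketch - the exact files depend on your setup):

```txt
# Before launch - robots.txt that blocks all crawlers:
User-agent: *
Disallow: /

# After launch - replace it with an allow-all file (an empty Disallow permits everything):
User-agent: *
Disallow:
```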
How to Use Our Robots.txt Generator:
- Choose a preset that best matches your needs or select Custom to start from scratch.
- Enter your Sitemap URL in the Global Settings section (highly recommended).
- Review the User-Agent rules added by the preset and adjust the paths as needed.
- Add extra User-Agent blocks for specific bots using the “Add User-Agent Rule” button.
- Click “Generate robots.txt” to see the live, syntax-highlighted preview.
- Click “Download robots.txt” and upload the file to the root of your website.
Once uploaded, verify your file is live by visiting https://yourdomain.com/robots.txt in your browser. Then submit it to Google Search Console under Settings → robots.txt so Google picks up the new rules promptly.
Frequently Asked Questions
What is a robots.txt file and why does it matter?
A robots.txt file sits in the root directory of your website and gives instructions to web crawlers about which pages they can and cannot access. It does not prevent pages from being indexed if other sites link to them, but it does tell well-behaved bots such as Googlebot and Bingbot to skip certain sections - like admin pages, login screens, or duplicate content areas.
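A minimal file illustrating the idea - the paths are placeholders for whatever sections you want crawlers to skip:

```txt
User-agent: *
Disallow: /wp-admin/
Disallow: /drafts/
```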
Will blocking bots in robots.txt hurt my SEO?
Only if you block the wrong paths. Blocking Googlebot from crawling your core pages will prevent them from being indexed. Our WordPress Standard preset blocks commonly crawled but SEO-irrelevant paths - like /wp-admin/ and /wp-includes/ - while leaving all public content open. Always check your generated file carefully before uploading.
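As a rough sketch, WordPress-style rules like those described above produce a file along these lines (the exact preset output may differ):

```txt
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Allow: /wp-admin/admin-ajax.php
```

The Allow line keeps admin-ajax.php reachable, since some themes and plugins rely on it for front-end functionality.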
Can I block AI crawlers from scraping my content?
Yes. Our “Block AI Crawlers” preset adds Disallow: / rules for all major known AI training and browsing bots, including GPTBot (OpenAI), anthropic-ai (Anthropic/Claude), Google-Extended (Gemini), PerplexityBot, Bytespider (TikTok), and CCBot (Common Crawl). Note that not all AI crawlers respect robots.txt - this covers the ones that do.
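In robots.txt terms, the preset's output is simply a Disallow: / block per bot - something like:

```txt
User-agent: GPTBot
Disallow: /

User-agent: anthropic-ai
Disallow: /

User-agent: Claude-Web
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: PerplexityBot
Disallow: /

User-agent: Bytespider
Disallow: /

User-agent: CCBot
Disallow: /
```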
Where do I upload my robots.txt file?
The robots.txt file must be placed in the root directory of your website - the same folder that contains your index.php or index.html. For most hosting providers this is the public_html or www folder. Once uploaded, you can verify it by visiting https://yourdomain.com/robots.txt in any browser.
What is the Crawl-delay directive?
Crawl-delay tells a bot how many seconds to wait between requests to your server. Setting a crawl delay of 10, for example, means the bot will wait 10 seconds between each page it crawls. This is useful for servers with limited resources. Note that Googlebot ignores Crawl-delay - Google manages its crawl rate automatically, and Search Console no longer offers a manual crawl-rate setting.
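For example, to ask a bot to pause ten seconds between requests (Bingbot here is just an illustration - any bot that honours the directive works the same way):

```txt
User-agent: Bingbot
Crawl-delay: 10
```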
