Free Tool

AI robots.txt Generator

Choose which AI crawlers to allow or block, then copy-paste the generated rules into your robots.txt file.

Training Crawlers

Collect data for AI model training

Search Crawlers

Index pages for AI search citations

Browsing Crawlers

Fetch pages on-demand when users ask

Generated robots.txt Rules

# AI Crawler Rules
# Generated by BotView (https://botview.app/ai-robots-txt-generator)
# 2026-04-16

# Allowed AI crawlers
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: Google-Extended
Allow: /

User-agent: CCBot
Allow: /

User-agent: Diffbot
Allow: /

User-agent: cohere-ai
Allow: /

User-agent: anthropic-ai
Allow: /

User-agent: Applebot-Extended
Allow: /

User-agent: FacebookBot
Allow: /

User-agent: OAI-SearchBot
Allow: /

User-agent: PerplexityBot
Allow: /

User-agent: ChatGPT-User
Allow: /

User-agent: Claude-Web
Allow: /

# Blocked AI crawlers
User-agent: Bytespider
Disallow: /

Add these rules to your existing robots.txt file. They won't affect Googlebot, Bingbot, or any other search engine crawlers.

How robots.txt Works for AI Crawlers

The robots.txt file sits at the root of your website (e.g., yoursite.com/robots.txt). Well-behaved crawlers check this file before accessing your pages. Each AI company publishes documented user-agent strings for its bots.

The key thing to understand: a wildcard User-agent: * group with Disallow: / blocks every AI crawler that lacks its own User-agent group. Crawlers follow the most specific group that matches their name, so a dedicated group for a crawler overrides the wildcard. Many websites accidentally block all AI crawlers this way.
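For example, here is a sketch of how a blanket block interacts with a dedicated group (the paths are illustrative):

```
# Blocks every crawler that has no dedicated group of its own
User-agent: *
Disallow: /

# GPTBot matches this more specific group, so the wildcard
# block above no longer applies to it
User-agent: GPTBot
Allow: /
```

Any crawler not named in a dedicated group, such as ClaudeBot in this sketch, still falls under the wildcard and stays blocked.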

Each AI crawler's rules are independent. Allowing GPTBot doesn't allow ClaudeBot. Blocking PerplexityBot doesn't block Google-Extended. That's why you need a separate rule block for each crawler you want to control.
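As a quick way to check this independence locally, here is a short Python sketch using the standard library's urllib.robotparser; the rules and URL are illustrative, not a recommendation:

```python
from urllib import robotparser

# Rules for two crawlers; every other user-agent is unmatched
rules = """\
User-agent: GPTBot
Allow: /

User-agent: Bytespider
Disallow: /
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# Each crawler is judged only by its own group
print(rp.can_fetch("GPTBot", "https://example.com/page"))      # True
print(rp.can_fetch("Bytespider", "https://example.com/page"))  # False

# No group matches ClaudeBot and there is no wildcard group,
# so the parser falls back to allowing access
print(rp.can_fetch("ClaudeBot", "https://example.com/page"))   # True
```

Note the last result: with no wildcard group and no matching dedicated group, access defaults to allowed, which is exactly why allowing one crawler says nothing about the others.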

Frequently Asked Questions

Do I need separate robots.txt rules for each AI crawler?

Yes. Each AI crawler has its own user-agent string. A rule for GPTBot doesn't affect ClaudeBot or PerplexityBot. You need a separate User-agent block for each crawler you want to control. This generator creates all the rules you need.

Where do I put these rules in my robots.txt?

Add the generated rules to your existing robots.txt file, which lives at your domain root (e.g., example.com/robots.txt). You can add AI crawler rules anywhere in the file. They won't affect your existing Googlebot or Bingbot rules.
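For instance, a hypothetical combined file might look like this, with the existing search-engine rules untouched and the AI rules appended below them:

```
# Existing search engine rules, unchanged
User-agent: Googlebot
Disallow: /private/

# AI crawler rules, appended below
User-agent: GPTBot
Disallow: /

User-agent: OAI-SearchBot
Allow: /
```

Because each User-agent group is independent, the order of the groups within the file doesn't matter.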

Will blocking AI crawlers affect my Google rankings?

No. Blocking AI crawlers like GPTBot, ClaudeBot, or PerplexityBot has zero effect on your Google Search rankings. Even blocking Google-Extended only affects Gemini AI training, not your search position.

Can I allow ChatGPT Search but block ChatGPT training?

Yes. Allow OAI-SearchBot (used for search citations) while blocking GPTBot (used to collect training data). This is a popular configuration: your site can be cited in ChatGPT answers without contributing to model training.
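In robots.txt terms, that configuration looks like this:

```
# Allow ChatGPT Search to index and cite your pages
User-agent: OAI-SearchBot
Allow: /

# Block OpenAI's training-data crawler
User-agent: GPTBot
Disallow: /
```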

How long until crawlers notice my robots.txt changes?

Most AI crawlers re-check robots.txt every few days to weeks. There's no instant effect. After updating your robots.txt, use BotView to verify the rules are correctly formatted.

Verify Your robots.txt After Updating

After adding rules, scan your site with BotView to confirm all 14 AI crawlers see the right access level.

Scan Your Site Free

No signup needed