robots.txt for AI
Configure robots.txt to allow AI crawlers.
Last updated: January 20, 2026
Your robots.txt file controls which AI crawlers can access your content. Many sites unknowingly block AI systems.
AI Crawler User Agents
- GPTBot - OpenAI's crawler for ChatGPT
- ChatGPT-User - ChatGPT's browsing feature
- ClaudeBot - Anthropic's crawler for Claude
- PerplexityBot - Perplexity's crawler
- Google-Extended - Google's control token for AI training use (honored by Google's existing crawlers rather than a separate bot)
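To see how a given robots.txt treats these user agents, you can test it with Python's standard-library `urllib.robotparser`. The sketch below (the robots.txt body and the `yoursite.com` URL are placeholder assumptions) checks each AI agent against a blanket `Disallow: /` rule:

```python
from urllib import robotparser

# Hypothetical robots.txt that blocks all unknown bots.
ROBOTS_TXT = """\
User-agent: *
Disallow: /
"""

AI_AGENTS = ["GPTBot", "ChatGPT-User", "ClaudeBot", "PerplexityBot", "Google-Extended"]

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

for agent in AI_AGENTS:
    # can_fetch() reports whether this agent may request the URL.
    allowed = rp.can_fetch(agent, "https://yoursite.com/blog/post")
    print(f"{agent}: {'allowed' if allowed else 'blocked'}")
```

Because none of the AI agents are named explicitly, they all fall under `User-agent: *` and are blocked; swap in your site's actual robots.txt to audit your own rules.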
Recommended Configuration
# Allow all AI crawlers
User-agent: GPTBot
Allow: /
User-agent: ChatGPT-User
Allow: /
User-agent: ClaudeBot
Allow: /
User-agent: PerplexityBot
Allow: /
User-agent: Google-Extended
Allow: /
# Traditional search engines
User-agent: Googlebot
Allow: /
User-agent: Bingbot
Allow: /
# Sitemap
Sitemap: https://yoursite.com/sitemap.xml
Selective Access
Allow AI crawlers on public content while blocking sensitive areas:
User-agent: GPTBot
Allow: /blog/
Allow: /docs/
Disallow: /admin/
Disallow: /api/
Disallow: /dashboard/
Check your current robots.txt! Many sites have a blanket Disallow: / for unknown bots that blocks AI crawlers.
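You can verify a selective configuration the same way before deploying it. This sketch (paths and domain are illustrative) parses the GPTBot rules above with `urllib.robotparser`; note that Python's parser applies rules in order of appearance, which can differ from the longest-match semantics some crawlers use:

```python
from urllib import robotparser

# The selective-access rules from the example above.
SELECTIVE = """\
User-agent: GPTBot
Allow: /blog/
Allow: /docs/
Disallow: /admin/
Disallow: /api/
Disallow: /dashboard/
"""

rp = robotparser.RobotFileParser()
rp.parse(SELECTIVE.splitlines())

for path in ["/blog/post", "/docs/intro", "/admin/settings", "/api/v1/users"]:
    allowed = rp.can_fetch("GPTBot", "https://yoursite.com" + path)
    print(f"{path}: {'allowed' if allowed else 'blocked'}")
```

Public content under /blog/ and /docs/ stays reachable while /admin/ and /api/ are refused, confirming the rules do what the section describes.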