AI Crawler Visibility

Check if AI bots can access your content.

Why It Matters

AI assistants like ChatGPT, Claude, and Perplexity are increasingly used to find and recommend content. If your site blocks AI crawlers, your content may not appear in AI-generated responses.

SlyDuck analyzes your robots.txt file to show which AI crawlers can access your site.

Growing Importance

As AI assistants become more common, visibility to AI crawlers is becoming as important as traditional SEO. Users asking AI for recommendations may never find sites that block these crawlers.

AI Crawlers Detected

SlyDuck checks for access by these AI crawlers:

  • GPTBot: OpenAI's crawler (powers ChatGPT)
  • ChatGPT-User: The user agent for ChatGPT's browsing feature
  • Claude-Web: Anthropic's crawler for the Claude assistant
  • PerplexityBot: Perplexity AI's search crawler
  • Google-Extended: Google's robots.txt token for controlling AI training use
  • Bytespider: ByteDance's AI crawler
  • CCBot: Common Crawl's crawler (its data is widely used for AI training)
  • FacebookBot: The crawler behind Meta's AI systems
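
If you want to reproduce this check yourself, Python's standard-library robots.txt parser can evaluate the same rules. This is a minimal sketch, assuming https://example.com stands in for your own domain and only testing the homepage path (unlike a full per-path scan):

from urllib import robotparser

# Stand-in domain; replace with your own site.
SITE = "https://example.com"

# The AI user agents listed above.
AI_CRAWLERS = [
    "GPTBot", "ChatGPT-User", "Claude-Web", "PerplexityBot",
    "Google-Extended", "Bytespider", "CCBot", "FacebookBot",
]

parser = robotparser.RobotFileParser()
parser.set_url(f"{SITE}/robots.txt")
parser.read()  # fetch and parse the live robots.txt

for agent in AI_CRAWLERS:
    # can_fetch() applies the Allow/Disallow rules for this user agent.
    status = "allowed" if parser.can_fetch(agent, f"{SITE}/") else "blocked"
    print(f"{agent}: {status}")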

Understanding Your Status

For each AI crawler, SlyDuck shows:

  • Allowed (Green): This AI crawler can access your site
  • Blocked (Red): Your robots.txt blocks this crawler
  • Partial (Yellow): Some paths are blocked
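
For example, a robots.txt like the following (the /private/ path is just an illustration) would typically be reported as Partial for GPTBot, since the crawler can reach everything except that directory:

User-agent: GPTBot
Disallow: /private/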

What "Allowed" Means

If a crawler is allowed, it can:

  • Read your public pages
  • Include your content in AI responses
  • Potentially use your content for training (depending on the crawler)

What "Blocked" Means

If a crawler is blocked:

  • It won't read your pages (robots.txt is a directive that compliant crawlers respect)
  • Your content won't appear in that AI's responses
  • Your content won't be newly collected for training by that company

Managing AI Access

To control AI crawler access, edit your robots.txt file:

Block All AI Crawlers

Add a User-agent / Disallow pair for each crawler you want to block. For example, to block the most common ones (the same pattern applies to Google-Extended, Bytespider, CCBot, and FacebookBot):

User-agent: GPTBot
Disallow: /

User-agent: ChatGPT-User
Disallow: /

User-agent: Claude-Web
Disallow: /

User-agent: PerplexityBot
Disallow: /

Allow All AI Crawlers

Remove any User-agent rules for AI crawlers from your robots.txt, or explicitly allow them:

User-agent: GPTBot
Allow: /

User-agent: ChatGPT-User
Allow: /
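
After editing, it's worth confirming that the file you publish is what crawlers will actually see. A quick sanity check (example.com is a placeholder for your domain):

curl https://example.com/robots.txt

Then re-run the SlyDuck analysis to confirm each crawler's status has updated.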

Recommendation

For most sites, we recommend allowing AI crawlers. This increases your visibility in AI-generated responses, which is becoming an important discovery channel.