Three root causes explain 90% of AI crawlability failures:
Without noai or positive directives, AI behaviour is undefined.

Check your https://yoursite.com/robots.txt for these user-agents:
User-agent: GPTBot
User-agent: Google-Extended
User-agent: ClaudeBot
User-agent: PerplexityBot
User-agent: ChatGPT-User
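The user-agent list above can be audited programmatically. A minimal sketch using Python's standard-library urllib.robotparser — the audit_robots helper and the sample rules are illustrative, not part of any real tool; for a live site you would call rp.set_url() and rp.read() instead of parsing a string:

```python
from urllib.robotparser import RobotFileParser

AI_AGENTS = ["GPTBot", "Google-Extended", "ClaudeBot", "PerplexityBot", "ChatGPT-User"]

def audit_robots(robots_txt: str, agents=AI_AGENTS, path="/"):
    """Return {agent: True/False}; True means the agent may fetch `path`."""
    rp = RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return {agent: rp.can_fetch(agent, path) for agent in agents}

# Hypothetical robots.txt: GPTBot blocked from everything, others unrestricted.
sample = """\
User-agent: GPTBot
Disallow: /
"""
print(audit_robots(sample))
```

Agents with no matching group (and no `User-agent: *` fallback) default to allowed, which is why the sample reports only GPTBot as blocked.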
What to look for:
Disallow: / — Blocked from all content (FAIL)
Allow: / — Full access (PASS)

Inspect the <head> of key pages:
<!-- Bad: blocks AI -->
<meta name="robots" content="noai, noimageai">
<!-- Good: allows AI -->
<meta name="robots" content="index, follow">
<meta name="GPTBot" content="index, follow">
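The meta-tag inspection can also be scripted. A sketch using Python's standard-library html.parser — the MetaAudit class and blocks_ai helper are hypothetical names for illustration:

```python
from html.parser import HTMLParser

class MetaAudit(HTMLParser):
    """Collect robots-style meta directives from a page's markup."""
    def __init__(self):
        super().__init__()
        self.directives = {}  # e.g. {"robots": ["noai", "noimageai"]}

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        a = dict(attrs)
        name = (a.get("name") or "").lower()
        if name in ("robots", "gptbot"):
            content = (a.get("content") or "").lower()
            self.directives[name] = [d.strip() for d in content.split(",")]

def blocks_ai(html: str) -> bool:
    """True if the page carries a noai or noimageai directive."""
    parser = MetaAudit()
    parser.feed(html)
    bad = {"noai", "noimageai"}
    return any(bad & set(values) for values in parser.directives.values())

print(blocks_ai('<meta name="robots" content="noai, noimageai">'))  # True -> FAIL
print(blocks_ai('<meta name="robots" content="index, follow">'))    # False -> PASS
```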
Common failures:
Setting noai without realising it blocks all AI crawlers
Missing index directives for AI bots

AI crawlers prioritise pages with schema markup. Use the Schema Markup Validator to test:
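The external validator is the authoritative check, but a quick local test for JSON-LD presence can be sketched with the standard library. The JSONLDExtractor class and schema_types helper below are illustrative names, not a real API:

```python
import json
from html.parser import HTMLParser

class JSONLDExtractor(HTMLParser):
    """Pull out the contents of <script type="application/ld+json"> blocks."""
    def __init__(self):
        super().__init__()
        self._in_jsonld = False
        self.blocks = []

    def handle_starttag(self, tag, attrs):
        if tag == "script" and dict(attrs).get("type") == "application/ld+json":
            self._in_jsonld = True

    def handle_endtag(self, tag):
        if tag == "script":
            self._in_jsonld = False

    def handle_data(self, data):
        if self._in_jsonld:
            self.blocks.append(data)

def schema_types(html: str):
    """Return the @type value of every parseable JSON-LD block on the page."""
    parser = JSONLDExtractor()
    parser.feed(html)
    types = []
    for raw in parser.blocks:
        try:
            doc = json.loads(raw)
        except ValueError:
            continue  # malformed JSON-LD would also fail the validator
        if isinstance(doc, dict):
            types.append(doc.get("@type", "unknown"))
    return types

page = ('<script type="application/ld+json">'
        '{"@context": "https://schema.org", "@type": "Article"}'
        '</script>')
print(schema_types(page))  # ['Article']
```

A page that returns an empty list here has no machine-readable schema for AI crawlers to prioritise.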
Check for these files at your root:
/ai.txt — AI crawler instructions (emerging IETF draft)
/llms.txt — LLM-friendly site summary

Manual checks are time-consuming. Use UltraScout's AI SEO Score to get a complete audit in 10 seconds:
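If you do want to script the well-known-file check yourself, a sketch using urllib — probe and audit_wellknown are hypothetical names, and the injectable probe parameter is a design choice that keeps the function testable without network access:

```python
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

WELLKNOWN = ["/ai.txt", "/llms.txt"]

def probe(url: str) -> bool:
    """True if the URL answers an HTTP HEAD request with status 200."""
    try:
        with urlopen(Request(url, method="HEAD"), timeout=10) as resp:
            return resp.status == 200
    except (HTTPError, URLError):
        return False

def audit_wellknown(base: str, probe=probe):
    """Map each well-known path to whether it exists at the site root."""
    return {path: probe(base.rstrip("/") + path) for path in WELLKNOWN}

# Example (requires network access):
# audit_wellknown("https://yoursite.com")
```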
If your score is below 70, upgrade to the AI Search Readiness Audit (£29) for:
Check your AI SEO Score — free, instant, no sign-up required
Get My Free AI SEO Score →
View Full Audit Report