Can ChatGPT Actually Read Your Website?
We Scanned 20 Popular SaaS Sites to Find Out
AI crawlers like GPTBot, ClaudeBot, and PerplexityBot don't render JavaScript. We scanned 20 well-known SaaS homepages 3 times each to measure what they actually see.
Average score: 62/100 · Sites that timed out: 4 of 20 · Sites scoring below 60: 9 of 20 · Sites with bot walls: 6 of 20
The Short Version
We picked 20 well-known SaaS companies (SEO tools, website builders, productivity apps, content platforms) and scanned each homepage 3 times using BotView. Every site in this study depends on organic search traffic, so they should be optimized for crawlers.
The average AI visibility score was 62 out of 100. That means most SaaS homepages show AI crawlers a degraded version of their site. Four sites didn't load at all. Their pages took over 30 seconds and the crawler gave up.
Here's what surprised us: it's not about whether you allow AI crawlers in your robots.txt. 19 of 20 sites do. The real question is whether bots can actually load and read your content. Most can't.
Methodology
20 sites across 6 categories: SEO tools (HubSpot, Ahrefs, Semrush, Moz), website builders (Webflow, Squarespace, Wix, Framer), content platforms (Substack, Medium, WordPress.com), productivity SaaS (Notion, Canva, Monday.com, Zapier, Calendly), e-commerce (Shopify), and communication tools (Intercom, Mailchimp, Zendesk).
3 scans per site (60 total) to reduce noise from CDN caching, A/B tests, or one-off timeouts. Scores were averaged across all 3 rounds. Where scores varied between rounds, we note it.
What we measured: Overall bot visibility score, page load time, AI crawler access (robots.txt for 14 crawlers), bot wall detection, JavaScript rendering gap (initial HTML vs rendered DOM), image alt text coverage, structured data presence, and content structure.
Each scan renders the page using a headless Chromium browser with a crawler user agent, then compares what a human sees versus what a bot sees.
Full Rankings
Sorted by AI crawler visibility score, averaged across 3 scans
| # | Site | Score | Load Time |
|---|---|---|---|
| 1 | Squarespace | 89 | 1.6s |
| 2 | Intercom | 82 | 2.9s |
| 3 | Shopify | 79 | 1.6s |
| 3 | Semrush | 79 | 2.7s |
| 3 | Moz | 79 | 4.2s |
| 3 | Framer | 79 | 4.1s |
| 7 | Notion | 74 | 4.5s |
| 8 | HubSpot | 64 | 5.3s |
| 9 | Canva | 63 | 2.8s |
| 10 | WordPress.com | 62 | 2.4s |
| 10 | Substack | 62 | 3.0s |
| 12 | Webflow | 59 | 7.1s |
| 13 | Ahrefs | 55 | 3.0s |
| 14 | Zapier | 54 | Timeout |
| 14 | Medium | 54 | Timeout |
| 16 | Zendesk | 52 | 14.3s |
| 17 | Wix | 49 | Timeout |
| 17 | Mailchimp | 49 | Timeout |
| 19 | Monday.com | 34 | 19.0s |
| 20 | Calendly | 28 | 12.8s |
Scores averaged across 3 scans per site. Scanned February 2026 using BotView.
Finding 1: Four Sites Never Loaded
Zapier, Wix, Medium, and Mailchimp timed out on every single scan, with all three attempts averaging over 30 seconds. When a page takes that long, AI crawlers just give up. As far as ChatGPT, Claude, or Perplexity are concerned, these sites don't exist.
According to SEO.ai's research, AI crawlers like GPTBot only see raw HTML. They don't render JavaScript. Heavy JS bundles, third-party scripts, and client-side rendering push load times past what crawlers will tolerate.
Sites that timed out (30s+ on every scan): Zapier, Wix, Medium, Mailchimp
Finding 2: Bot Walls Are Everywhere
6 of 20 sites triggered a bot wall. These are security layers (usually Cloudflare) that detect and challenge automated visitors. They're good at stopping attacks, but they also block legitimate AI crawlers.
The one that stood out: Ahrefs, one of the most popular SEO tools in the world, has a Cloudflare bot wall on its own homepage. On top of that, 52% of its content is loaded via JavaScript. So even if a crawler gets past the wall, it only sees about half the page.
Finding 3: JavaScript Makes Sites Invisible to AI
Unlike Googlebot, which can render JavaScript, most AI crawlers only see the initial HTML response. As OpenAI's own documentation confirms, GPTBot does not use a full browser to render pages. If your content is loaded via JavaScript, ChatGPT literally cannot see it.
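The gap is easy to demonstrate. Here is a minimal stdlib-only Python sketch that extracts visible text from two toy HTML snippets: the initial HTML a non-rendering crawler receives, and the DOM a browser builds after running JavaScript. The snippets and headline copy are illustrative, not taken from any site in this study.

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects the visible text nodes of an HTML document."""
    def __init__(self):
        super().__init__()
        self.chunks = []
    def handle_data(self, data):
        if data.strip():
            self.chunks.append(data.strip())

def visible_text(html: str) -> str:
    parser = TextExtractor()
    parser.feed(html)
    return " ".join(parser.chunks)

# Toy example: the initial HTML (what GPTBot sees) is an empty app shell,
# while the rendered DOM (what a browser builds after running JS) has content.
initial_html = '<html><body><div id="root"></div></body></html>'
rendered_html = ('<html><body><div id="root"><h1>Pricing that scales</h1>'
                 '<p>Start free, upgrade anytime.</p></div></body></html>')

bot_len = len(visible_text(initial_html))
human_len = len(visible_text(rendered_html))
gap = 1 - (bot_len / human_len if human_len else 0)
print(f"Bots see {100 * (1 - gap):.0f}% of the text")  # 0% here: everything is JS-rendered
```

In this extreme case the crawler receives an empty `<div id="root">` and sees 0% of the text; real sites usually land somewhere in between, which is exactly the "rendering gap" the scans measure.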
Worst JS rendering gaps
- Bots see 28% of the page; even the H1 heading is rendered via JavaScript.
- Combined with a bot wall, AI crawlers see only a fraction of the actual content.
Several other sites load key elements via JavaScript too, including HubSpot (208 of 242 internal links injected by JS), Zapier, Moz, Shopify, and Wix. Any crawler that doesn't execute JavaScript won't see these elements at all.
Finding 4: Almost Nobody Blocks AI Crawlers (But One Site Does)
19 of 20 sites allow all major AI crawlers in their robots.txt. The SaaS industry has overwhelmingly decided that AI visibility is worth it.
The exception is Canva, which actively blocks GPTBot (ChatGPT) and ClaudeBot (Claude), allowing only 5 of 14 AI crawlers through. Calendly blocks CCBot (Common Crawl), the dataset used to train many large language models.
Bottom line: the bottleneck for AI visibility isn't permissions, it's performance and rendering. Most sites have the right robots.txt settings. Most sites still fail the actual visibility test because the content never loads fast enough or lives behind JavaScript.
Finding 5: Alt Text Is a Universal Failure
16 of 20 sites have images with missing alt text. Without alt text, AI crawlers have no idea what an image shows. Product screenshots, diagrams, feature demos: all invisible.
| Site | Images | Missing Alt | % Missing |
|---|---|---|---|
| Canva | 13 | 13 | 100% |
| Squarespace | 83 | 81 | 97% |
| Monday.com | 505 | 400 | 79% |
| Webflow | 227 | 159 | 70% |
| Shopify | 69 | 44 | 63% |
| Substack | 17 | 10 | 58% |
| Notion | 30 | 16 | 53% |
| Zendesk | 97 | 41 | 42% |
Monday.com's homepage has 505 images. 400 of them have no alt text. Webflow, the website builder, has 159 images with no alt text on its own homepage. These are well-funded companies with dedicated marketing teams.
Finding 6: Nobody Gets Structured Data Right
Structured data (JSON-LD schema markup) helps search engines and AI crawlers understand what a page is about. It's the difference between a crawler seeing raw text and understanding that this is an organization, a product, or an article.
Zero JSON-LD (4 sites)
- Substack
- WordPress.com
- Notion
- Zapier
Incomplete Schema (16 sites)
Missing BreadcrumbList, incomplete Organization schema, or structured data blocks without @type properties.
Not a single site in our study had fully complete structured data. This is one of the easiest wins in both GEO and traditional SEO. It takes a few minutes to add and helps every crawler that visits your site.
What You Can Do About It
Based on what we found across these 20 sites, here are the highest-impact actions you can take to improve your AI crawler visibility:
Check your actual load time for bots
Not your Lighthouse score. Your actual page load as seen by a crawler user agent. If it's over 5 seconds, you're losing visibility. Over 30 seconds and you're invisible.
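One rough way to check, sketched below in Python. The hypothetical `timed_fetch` helper fetches raw HTML with a crawler-style user agent (the UA string here is an approximation; check OpenAI's published GPTBot string before relying on it), and `verdict` applies the thresholds above. Network timing varies run to run, so treat any single measurement as indicative.

```python
import time
import urllib.request

# Approximate crawler user agent -- verify against OpenAI's published GPTBot string.
BOT_UA = "Mozilla/5.0; compatible; GPTBot/1.0; +https://openai.com/gptbot"

def verdict(seconds: float) -> str:
    """Rough visibility verdict using the thresholds from this article."""
    if seconds > 30:
        return "invisible"          # crawlers give up entirely
    if seconds > 5:
        return "losing visibility"  # many crawls will be partial or skipped
    return "ok"

def timed_fetch(url: str, timeout: float = 35.0) -> float:
    """Fetch the raw HTML (no JS execution) as a crawler would; return elapsed seconds."""
    req = urllib.request.Request(url, headers={"User-Agent": BOT_UA})
    start = time.monotonic()
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        resp.read()
    return time.monotonic() - start
```

Usage would be something like `verdict(timed_fetch("https://www.example.com"))`; run it a few times and look at the worst case, since that is what a crawler on a bad day experiences.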
Reduce JavaScript dependency for key content
Your H1, main copy, and navigation should be in the initial HTML. AI crawlers don't execute JavaScript. Server-side rendering or static generation solves this.
Check if your bot wall blocks AI crawlers
If you use Cloudflare, Akamai, or similar: verify that GPTBot, ClaudeBot, and PerplexityBot can get through. Many WAF configurations block them by default.
Add alt text to every image
AI crawlers can't see images. They can only read alt text. Every image without it is invisible content, especially product screenshots and diagrams.
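Auditing this takes a few lines of stdlib Python; the sample HTML below is hypothetical. Note that `alt=""` is counted as missing here, even though an empty alt is legitimate for purely decorative images, so treat the count as an upper bound.

```python
from html.parser import HTMLParser

class AltAudit(HTMLParser):
    """Counts <img> tags whose alt attribute is absent or empty."""
    def __init__(self):
        super().__init__()
        self.total = 0
        self.missing = 0
    def handle_starttag(self, tag, attrs):
        if tag == "img":
            self.total += 1
            alt = dict(attrs).get("alt")
            # alt="" is flagged too, though it is valid for decorative images.
            if not (alt and alt.strip()):
                self.missing += 1

sample = """
<img src="hero.png" alt="Dashboard showing weekly signups">
<img src="feature.png" alt="">
<img src="logo.svg">
"""
audit = AltAudit()
audit.feed(sample)
print(f"{audit.missing} of {audit.total} images lack alt text")  # 2 of 3
```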
Add structured data
At minimum: Organization schema on your homepage, BreadcrumbList for navigation. It takes 10 minutes and helps every crawler understand your site.
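For illustration, a minimal Organization block as it would appear in your homepage's `<head>`; the company name and URLs are placeholders, and your own schema should list your real profiles under `sameAs`.

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example SaaS Inc.",
  "url": "https://www.example.com",
  "logo": "https://www.example.com/logo.png",
  "sameAs": ["https://www.linkedin.com/company/example"]
}
</script>
```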
Allow AI crawlers in robots.txt
Check that GPTBot, ChatGPT-User, ClaudeBot, and PerplexityBot are not blocked. If you want to appear in AI-powered search results, these bots need access.
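You can verify this with Python's stdlib `urllib.robotparser`. The robots.txt below is a made-up example; in practice, point the parser at your own file with `set_url("https://yoursite.com/robots.txt")` followed by `read()` instead of `parse()`.

```python
from urllib.robotparser import RobotFileParser

AI_CRAWLERS = ["GPTBot", "ChatGPT-User", "ClaudeBot", "PerplexityBot"]

# Hypothetical robots.txt: GPTBot is kept out of /private/, everything else is open.
robots_txt = """\
User-agent: GPTBot
Disallow: /private/

User-agent: *
Disallow:
""".splitlines()

rp = RobotFileParser()
rp.parse(robots_txt)
for bot in AI_CRAWLERS:
    allowed = rp.can_fetch(bot, "/")
    print(f"{bot}: {'allowed' if allowed else 'blocked'} on /")
```

Here every crawler can fetch the homepage, and GPTBot is only excluded from `/private/` paths, which is the kind of targeted rule that preserves AI visibility without exposing everything.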
Frequently Asked Questions
Can ChatGPT read my website?
It depends. ChatGPT uses GPTBot to crawl websites, and GPTBot only sees raw HTML. It does not render JavaScript. If your site relies on client-side rendering, ChatGPT may only see a fraction of your content. Our study found the average SaaS site scores 62 out of 100 for AI crawler visibility, meaning bots see a degraded version of most sites.
What AI crawlers does this study cover?
We check 14 AI crawlers including GPTBot (ChatGPT training), ChatGPT-User (ChatGPT browsing), OAI-SearchBot (ChatGPT search), ClaudeBot (Anthropic), PerplexityBot, CCBot (Common Crawl, used by many AI companies), Google-Extended, Bytespider, and others.
Why do some sites time out for AI crawlers?
Heavy JavaScript, large bundles, slow server responses, and third-party scripts can push page load times past 30 seconds. When that happens, GPTBot and similar bots just give up and move on. The site effectively doesn't exist for that AI system. In our study, 4 of 20 sites timed out on every single scan.
What is a bot wall?
A bot wall is a security measure (often Cloudflare, Akamai, or similar) that detects and blocks automated visitors. While bot walls protect against malicious bots, they can also block legitimate AI crawlers like GPTBot and ClaudeBot. We detected bot walls on 6 of 20 sites in this study.
Does blocking AI crawlers help or hurt my site?
It depends on your goals. Blocking GPTBot means your content won't appear in ChatGPT answers. With AI-powered search growing rapidly, that means potential customers may never find you through AI channels. In our study, only 1 of 20 SaaS sites actively blocks major AI crawlers. The industry overwhelmingly chooses visibility.
How can I check what AI crawlers see on my site?
BotView scans your site from an AI crawler's perspective. It renders the page as Googlebot would, checks robots.txt for all 14 AI crawlers, measures load times, detects bot walls, and shows how much content is actually visible in the initial HTML vs what gets loaded by JavaScript. The free plan includes 3 scans per month.
Check Your Own Site
See what AI crawlers actually see when they visit your website. Free scan, no credit card required.
Checks 14 AI crawlers including GPTBot, ClaudeBot, and PerplexityBot