How BotView Scores Work

Every page starts at 100 and loses points only for issues that directly impact whether crawlers can see your content.

What We Measure

BotView measures crawler visibility — can Googlebot and AI crawlers actually see and understand your content? This is different from traditional SEO audits that check meta tag length or keyword density.

We render each page using Playwright with a Googlebot user agent, then compare the initial HTML (what crawlers see first) against the fully rendered DOM (what JavaScript adds later). The gap between these two tells us how much content depends on JavaScript to appear.
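The gap measurement above can be sketched with a minimal, stdlib-only text comparison. BotView's actual pipeline renders with Playwright; the `visible_text` and `content_gap` helpers below are illustrative assumptions, not the real implementation:

```python
from html.parser import HTMLParser


class TextExtractor(HTMLParser):
    """Collect visible text, ignoring script/style contents."""

    def __init__(self):
        super().__init__()
        self.chunks = []
        self._skip_depth = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip_depth += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        if not self._skip_depth and data.strip():
            self.chunks.append(data.strip())


def visible_text(html: str) -> str:
    """Extract the text a crawler could read from an HTML snapshot."""
    parser = TextExtractor()
    parser.feed(html)
    return " ".join(parser.chunks)


def content_gap(initial_html: str, rendered_html: str) -> float:
    """Fraction of rendered text that is missing from the initial HTML."""
    initial_len = len(visible_text(initial_html))
    rendered_len = len(visible_text(rendered_html))
    if rendered_len == 0:
        return 0.0
    return max(0.0, (rendered_len - initial_len) / rendered_len)
```

An SPA shell like `<div id="root"></div>` whose content only appears after JavaScript runs would score a gap near 1.0 — everything the crawler needs arrives late.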

Traditional SEO factors like meta description length or keyword usage are reported in the scan but do not affect the score. Only issues that directly prevent crawlers from seeing content count against you.

How the Score Is Calculated

Each page starts at 100. Issues deduct points based on severity.

Critical (-15): Content is invisible or inaccessible to crawlers. Bot walls, heavy JS dependency, error status codes.

Warning (-5): Content is partially visible but degraded. Moderate JS dependency, slow rendering, blocked resources.

Info (-1): Minor visibility concerns. Small rendering differences, missing alt text on decorative images.
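A minimal sketch of the deduction model. The severities and point values come from the list above; the `page_score` name and the floor at 0 (implied by the D grade's 0–49 range) are assumptions for illustration:

```python
# Points deducted per issue, by severity (from the table above).
DEDUCTIONS = {"critical": 15, "warning": 5, "info": 1}


def page_score(issue_severities: list[str]) -> int:
    """Start at 100, subtract per issue, and floor at 0."""
    score = 100 - sum(DEDUCTIONS[sev] for sev in issue_severities)
    return max(0, score)
```

For example, one critical, one warning, and one info issue would leave a page at 79.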

Multi-Page Scoring

When scanning multiple pages, the overall site score uses a weighted average: 70% straight average across all pages, 30% weighted toward the lowest-scoring pages. This means your worst pages pull the score down more than your best pages pull it up — because a single broken page can hurt your site's crawl visibility disproportionately.
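The blend can be sketched as follows. The text doesn't specify exactly how many of the lowest-scoring pages feed the 30% component, so the bottom-quartile choice here is our assumption:

```python
def site_score(page_scores: list[float]) -> float:
    """70% straight average, 30% weighted toward the worst pages.

    Assumption: the 30% component is the mean of the lowest
    quartile of pages (at least one page).
    """
    if not page_scores:
        raise ValueError("need at least one page score")
    ordered = sorted(page_scores)
    worst_count = max(1, len(ordered) // 4)
    straight_avg = sum(ordered) / len(ordered)
    worst_avg = sum(ordered[:worst_count]) / worst_count
    return 0.7 * straight_avg + 0.3 * worst_avg
```

Under this sketch, a site with pages scoring 100, 100, 100, and 40 lands at 71.5 rather than the straight average of 85 — the broken page drags the total down.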

The Five Visibility Groups

Every visibility issue falls into one of five groups. Each group gets its own sub-score in the report.

1. Rendering

Can crawlers see your content without executing JavaScript?

  • JS content dependency
  • SPA shell detection
  • Client-side rendering
  • Loading states in HTML
  • JS-rendered headings & links

2. Crawler Access

Are crawlers allowed to reach your pages and their resources?

  • AI crawler blocking (robots.txt)
  • Blocked JavaScript/CSS/fonts
  • Bot walls & WAFs
  • HTTPS availability

3. Indexability

Will search engines actually add your page to their index?

  • noindex directives
  • Soft 404 detection
  • Error status codes
  • Redirect chains
  • Hreflang errors

4. Performance

Can crawlers finish rendering your page within their time budget?

  • Page load time
  • DOM ready timing
  • Render budget exceeded
  • First & Largest Contentful Paint

5. Content Visibility

Is your important content actually visible in the DOM?

  • Hidden content (display: none)
  • Modal/dialog content
  • Infinite scroll
  • Missing internal links
  • Empty anchor text

Score Grades

A (90–100): Excellent. Crawlers can see virtually all content.

B (70–89): Good. Most content is visible, minor issues.

C (50–69): Needs Work. Significant visibility gaps.

D (0–49): Poor. Crawlers can't see most content.
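The grade boundaries reduce to a simple threshold check (a sketch; the `grade` function name is ours):

```python
def grade(score: float) -> str:
    """Map a 0-100 visibility score to the letter grades above."""
    if score >= 90:
        return "A"
    if score >= 70:
        return "B"
    if score >= 50:
        return "C"
    return "D"
```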

What Doesn't Affect the Score

BotView reports on many traditional SEO factors, but these are informational only. They appear in your report but don't change your visibility score:

  • Meta description length or content
  • Title tag length
  • Open Graph / social tags
  • Image file formats
  • Keyword density or usage
  • Number of headings
  • Structured data completeness
  • Mobile viewport tag

These factors matter for SEO, but they don't determine whether crawlers can see your content — which is what BotView measures.

See Your Score

Run a free scan to see exactly what Google and AI crawlers can see on your website.

Free scan — no account required. Takes 30 seconds.