ScanGov Standards
Guidance
Government digital experience standards
Based on public policy, web protocols, guidelines, and best practices.
Categories: Accessibility, AI-friendly, Content, Domain, Performance, Security, SEO, Social
Indicators
Sitemap status
Standard: The HTTP status of /sitemap.xml is OK.
Why: Confirms the sitemap is accessible, ensuring search engines can easily find and index all pages on the site.
Guidance: sitemaps.org
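A minimal sketch of how this check could be automated with the Python standard library; the domain below is a placeholder, and this is not ScanGov's own scanner code.

```python
# Illustrative only: confirm /sitemap.xml answers with HTTP 200 ("OK").
from urllib import error, request

def sitemap_status_ok(base_url: str) -> bool:
    """Return True when base_url + /sitemap.xml responds with status 200."""
    try:
        with request.urlopen(f"{base_url}/sitemap.xml", timeout=10) as response:
            return response.status == 200
    except error.URLError:
        # Covers 4xx/5xx responses (HTTPError) and connection failures alike.
        return False

print(sitemap_status_ok("https://example.gov"))  # placeholder domain
```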
Sitemap XML
Standard: The sitemap file type is XML.
Why: Stores site structure in a machine-readable format, helping search engines efficiently crawl and index all website pages.
Guidance: sitemaps.org
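One way to verify the XML requirement, sketched with the standard library's ElementTree parser; the sample document is illustrative, not a real agency sitemap.

```python
# Illustrative only: check that a sitemap body is well-formed XML in the
# sitemaps.org format (a <urlset> of pages or a <sitemapindex> of sitemaps).
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def is_xml_sitemap(body: str) -> bool:
    try:
        root = ET.fromstring(body)
    except ET.ParseError:
        return False
    return root.tag in (f"{{{SITEMAP_NS}}}urlset", f"{{{SITEMAP_NS}}}sitemapindex")

sample = (
    f'<urlset xmlns="{SITEMAP_NS}">'
    "<url><loc>https://example.gov/</loc></url>"
    "</urlset>"
)
print(is_xml_sitemap(sample))  # True
```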
Robots valid
Standard: The site has a valid robots policy.
Why: Guides search engines on which pages to crawl or avoid, ensuring important content is indexed and irrelevant pages aren't.
Guidance: The Web Robots Pages
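A rough sketch of one possible validity check, assuming "valid" means every directive line uses a recognized "field: value" form; the field list here is illustrative and narrower than the full protocol.

```python
# Illustrative only: a loose robots.txt validity check.
KNOWN_FIELDS = {"user-agent", "allow", "disallow", "sitemap", "crawl-delay"}

def robots_txt_is_valid(body: str) -> bool:
    for raw in body.splitlines():
        line = raw.split("#", 1)[0].strip()  # drop comments and whitespace
        if not line:
            continue
        field, sep, _value = line.partition(":")
        if not sep or field.strip().lower() not in KNOWN_FIELDS:
            return False
    return True

print(robots_txt_is_valid(
    "User-agent: *\nDisallow: /admin/\nSitemap: https://example.gov/sitemap.xml"
))  # True
```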
Robots allowed
Standard: The robots policy allows access to crawlers and scrapers.
Why: Permits search engines and web tools to access content, helping improve search visibility and gather relevant data.
Guidance: The Web Robots Pages
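The standard library's robots.txt parser can express this kind of check directly; the policies and URL below are placeholders that sketch the idea rather than ScanGov's rule.

```python
# Illustrative only: confirm a generic crawler ("*") may fetch the homepage.
from urllib import robotparser

def robots_allow_crawling(robots_txt: str, url: str) -> bool:
    parser = robotparser.RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch("*", url)

print(robots_allow_crawling("User-agent: *\nDisallow: /admin/", "https://example.gov/"))  # True
print(robots_allow_crawling("User-agent: *\nDisallow: /", "https://example.gov/"))        # False
```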
Sitemap in robots.txt
Standard: The robots.txt file points to a sitemap file.
Why: Helps search engines find the sitemap quickly, improving how they discover and index website pages.
Guidance: Google Search Central
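A sketch of detecting the Sitemap directive with the standard library (RobotFileParser.site_maps, available since Python 3.8); the robots.txt content is illustrative.

```python
# Illustrative only: does robots.txt declare at least one "Sitemap:" line?
from urllib import robotparser

def robots_lists_sitemap(robots_txt: str) -> bool:
    parser = robotparser.RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return bool(parser.site_maps())

print(robots_lists_sitemap(
    "User-agent: *\nAllow: /\nSitemap: https://example.gov/sitemap.xml"
))  # True
```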
Canonical
Standard: Use preferred page URLs to avoid duplication.
Why: Prevents duplicate content issues by telling search engines which version of a page is the main one.
Guidance: The Web Robots Pages; Google Search Central
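One way a page's canonical declaration might be detected, sketched with the standard-library HTML parser; the markup and URL are illustrative.

```python
# Illustrative only: find a <link rel="canonical"> element in page HTML.
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attr = dict(attrs)
        if tag == "link" and (attr.get("rel") or "").lower() == "canonical":
            self.canonical = attr.get("href")

finder = CanonicalFinder()
finder.feed('<head><link rel="canonical" href="https://example.gov/services/"></head>')
print(finder.canonical)  # https://example.gov/services/
```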
Link text
Standard: Links have descriptive text.
Why: Describes the link’s purpose clearly, helping users know where it leads and improving navigation for everyone.
Guidance: Google Search Central
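A sketch of how vague link text could be flagged, assuming a small deny-list of generic phrases; the phrase list and markup are illustrative, not ScanGov's actual heuristic.

```python
# Illustrative only: flag anchors whose visible text is empty or generic.
from html.parser import HTMLParser

GENERIC_TEXT = {"click here", "here", "read more", "learn more", "link"}

class LinkTextChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self._in_link = False
        self._text = ""
        self.vague_links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._in_link, self._text = True, ""

    def handle_data(self, data):
        if self._in_link:
            self._text += data

    def handle_endtag(self, tag):
        if tag == "a" and self._in_link:
            text = self._text.strip().lower()
            if not text or text in GENERIC_TEXT:
                self.vague_links.append(self._text.strip())
            self._in_link = False

checker = LinkTextChecker()
checker.feed('<a href="/forms">Download the benefits form</a> <a href="/faq">click here</a>')
print(checker.vague_links)  # ['click here']
```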
hreflang
Standard: Specifies language and region for webpages.
Why: Indicates page language and region, helping users see the right version and improving search results in different countries.
Guidance: Google Search Central
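A sketch that collects declared hreflang alternates from page markup so a check can confirm language and region variants are present; the markup and URLs are illustrative.

```python
# Illustrative only: gather <link rel="alternate" hreflang="..."> entries.
from html.parser import HTMLParser

class HreflangCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.alternates = {}

    def handle_starttag(self, tag, attrs):
        attr = dict(attrs)
        if tag == "link" and (attr.get("rel") or "").lower() == "alternate" and attr.get("hreflang"):
            self.alternates[attr["hreflang"]] = attr.get("href")

collector = HreflangCollector()
collector.feed(
    '<link rel="alternate" hreflang="en-us" href="https://example.gov/">'
    '<link rel="alternate" hreflang="es-us" href="https://example.gov/es/">'
)
print(collector.alternates)  # {'en-us': 'https://example.gov/', 'es-us': 'https://example.gov/es/'}
```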