Sitemap status
The HTTP status of /sitemap.xml is OK.
(How ScanGov measures tasklist priorities.)
A sitemap is a file that lists all of the pages on a website, along with information about each page, so that search engines can crawl the site more intelligently.
Web crawlers usually discover pages from links within the site and from other sites. Sitemaps supplement this data: crawlers that support sitemaps can pick up all Uniform Resource Locators (URLs) listed in the sitemap and learn about them from the associated metadata. Using the Sitemap protocol doesn't guarantee that web pages will be included in search engines, but it gives web crawlers hints that help them crawl your site more effectively.
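To illustrate how a crawler reads URL entries and their metadata, here is a minimal sketch using Python's standard library, assuming the standard sitemaps.org XML namespace (the function name is illustrative, not part of any ScanGov tooling):

```python
import xml.etree.ElementTree as ET

# The namespace defined by the Sitemap protocol (sitemaps.org).
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def extract_urls(sitemap_xml: str):
    """Return (loc, lastmod) pairs from sitemap XML text, as a crawler would."""
    root = ET.fromstring(sitemap_xml)
    entries = []
    for url in root.findall("sm:url", NS):
        loc = url.findtext("sm:loc", namespaces=NS)
        lastmod = url.findtext("sm:lastmod", namespaces=NS)
        entries.append((loc, lastmod))
    return entries
```

A real crawler would fetch the file over HTTP first; this sketch only covers the parsing step.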
The sitemap is an Extensible Markup Language (XML) file in the website's root directory that includes metadata for each URL, such as:
- when the page was last updated (lastmod)
- how often the page changes (changefreq)
- the page's priority relative to other URLs on the site (priority)
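A minimal sitemap.xml showing these per-URL fields; the domain and values below are placeholders, not a real site:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.gov/</loc>
    <lastmod>2024-01-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
</urlset>
```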
As a search engine bot or user, I want the website's sitemap to be easily accessible so that I can discover all the pages and content on the site efficiently and improve the website's discoverability.
(ScanGov messaging when a site fails a standard)
Sitemap.xml is missing or inaccessible.