
Understanding the Site Scanning program

This program automatically generates data about the health and best practices of federal websites.

Overview


The Site Scanning program automates a wide range of scans of public federal websites and generates data about website health, policy compliance, and best practices.

The program is a shared service provided at no cost to federal agencies and the public. At its core is the Federal Website Index, a reference dataset that lists all public federal .gov sites by agency and department. Daily scans generate over 1.5 million fields of data about 26,000 federal .gov websites, all made publicly available via API and bulk download.

We scan federal domains for:

  • The presence of agency websites and subdomains
  • Digital Analytics Program participation
  • Use of the U.S. Web Design System
  • Search engine optimization
  • Third-party services
  • IPv6 compliance
  • Other best practices
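
To make these categories concrete, here is a minimal Python sketch of the kind of check such a scan might perform. It is an illustration only, not the program's actual scan code, and the has_ipv6 helper is hypothetical: it tests whether a domain publishes IPv6 (AAAA) records, one signal of IPv6 compliance.

```python
# Illustration only: the kind of check an IPv6 scan might perform.
# This is not the Site Scanning program's actual scan code, and the
# has_ipv6 helper is hypothetical.
import socket

def has_ipv6(domain: str) -> bool:
    """Return True if the domain resolves to at least one IPv6 (AAAA) address."""
    try:
        infos = socket.getaddrinfo(domain, 443, family=socket.AF_INET6)
        return len(infos) > 0
    except socket.gaierror:
        return False

print(has_ipv6("gsa.gov"))
```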

Access the data directly

All scan data can be downloaded directly as a CSV or JSON file or accessed through the Site Scanning API.
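
For example, the sketch below queries the API with Python. It assumes the v1 base URL (https://api.gsa.gov/technology/site-scanning/v1), a /websites endpoint, an api.data.gov API key, and an "items" list in the JSON response; verify all of these against the current API documentation before relying on them.

```python
# Minimal sketch of querying the Site Scanning API.
# Assumptions (verify against the API docs): the v1 base URL, the
# /websites endpoint, the api_key/page parameters, and an "items"
# list in the JSON response. An api.data.gov key is required.
import requests

BASE_URL = "https://api.gsa.gov/technology/site-scanning/v1"
API_KEY = "DEMO_KEY"  # replace with your own api.data.gov key

resp = requests.get(
    f"{BASE_URL}/websites",
    params={"api_key": API_KEY, "page": 1},
    timeout=30,
)
resp.raise_for_status()
payload = resp.json()

# Print a few scanned target URLs from the first page of results.
for site in payload.get("items", [])[:5]:
    print(site.get("target_url"))
```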

Learn more about the program, the scans, and the underlying data

Much deeper program detail can be found in the program’s documentation hub.

The creation of the underlying website index is explained in the separate Federal Website Index repository. It includes links to the original datasets, as well as descriptions of how they are assembled and filtered to create the list of URLs that are then scanned.

Contact the Site Scanning team

Questions? Email the Site Scanning team at site-scanning@gsa.gov.