Evaluating Digital Content for Accessibility

Evaluating accessibility requires a combination of methods to ensure your digital content meets the needs of all users. Automated tools can quickly identify common issues, manual checks address those requiring human judgment, and user testing provides real-world feedback from people with disabilities. Together, these methods form a robust approach to accessibility evaluation and improvement.

Automated Checks

Automated accessibility evaluation tools are useful for quickly identifying common issues in digital content, but they have limitations and cannot fully replace human evaluation. Explore what an automated accessibility checker can and cannot do.
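
For illustration, the sketch below shows one way to run an automated scan from a script. It assumes Node.js with the playwright and @axe-core/playwright packages installed (axe-core is the rule engine behind many automated checkers), and the URL is a placeholder.

```typescript
// Minimal automated scan sketch: load a page in a headless browser,
// run the axe-core rule set against it, and print any violations found.
import { chromium } from 'playwright';
import AxeBuilder from '@axe-core/playwright';

async function scan(url: string): Promise<void> {
  const browser = await chromium.launch();
  const page = await browser.newPage();
  await page.goto(url);

  // analyze() returns axe-core results, including a list of violations.
  const results = await new AxeBuilder({ page }).analyze();

  for (const violation of results.violations) {
    console.log(`${violation.id} (${violation.impact}): ${violation.help}`);
    for (const node of violation.nodes) {
      // node.target holds CSS selectors for the affected elements.
      console.log(`  ${node.target.join(' ')}`);
    }
  }

  await browser.close();
}

// Placeholder URL; point this at the page you want to check.
scan('https://example.com');
```

A scan like this surfaces machine-detectable problems such as missing alternative text or insufficient color contrast, but it cannot judge whether alt text is meaningful or whether the reading order makes sense, which is why the manual checks below are still needed.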

Manual Review

Manual accessibility checks are an essential complement to automated tools. They rely on human judgment to assess aspects of accessibility that tools can't fully evaluate. Explore key manual checks and guidance for evaluating headings, color contrast, link text, keyboard navigation, forms, and tables.
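
One of these checks, color contrast, rests on a precise formula: WCAG 2.x defines contrast as the ratio of the relative luminances of the foreground and background colors, with a 4.5:1 minimum for normal-size text at Level AA. The TypeScript sketch below computes that ratio so you can spot-check colors found during a manual review; the sample colors are placeholders.

```typescript
// WCAG 2.x contrast ratio between two sRGB colors given as [r, g, b]
// values in the 0-255 range.

// Convert an 8-bit sRGB channel to its linearized value (WCAG formula).
function linearize(channel: number): number {
  const c = channel / 255;
  return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
}

// Relative luminance per the WCAG definition.
function relativeLuminance([r, g, b]: [number, number, number]): number {
  return 0.2126 * linearize(r) + 0.7152 * linearize(g) + 0.0722 * linearize(b);
}

// Contrast ratio ranges from 1:1 (identical colors) to 21:1 (black on white).
function contrastRatio(
  fg: [number, number, number],
  bg: [number, number, number]
): number {
  const l1 = relativeLuminance(fg);
  const l2 = relativeLuminance(bg);
  const [lighter, darker] = l1 >= l2 ? [l1, l2] : [l2, l1];
  return (lighter + 0.05) / (darker + 0.05);
}

// Example: dark gray text (#595959) on white is roughly 7:1, which meets
// the 4.5:1 minimum for normal-size text under WCAG 2.1 Level AA.
const ratio = contrastRatio([89, 89, 89], [255, 255, 255]);
console.log(ratio.toFixed(2), ratio >= 4.5 ? 'passes AA' : 'fails AA');
```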

Engaging Users with Disabilities

Involving people with disabilities in accessibility testing is an essential step in creating digital content that truly meets the needs of all users. While automated tools and manual reviews identify many issues, user testing provides critical insights into how real users interact with your content or platform.

Templates for outreach and engagement are available to support this work.

Evaluating an Accessibility Conformance Report (ACR)

Voluntary Product Accessibility Templates (VPATs), also known as Accessibility Conformance Reports (ACRs), are documents provided by vendors to outline how their products meet accessibility standards. While VPATs can offer valuable insights, understanding their limitations and knowing how to evaluate them critically is essential to ensure the product meets your accessibility requirements. See the Project and Procurement page for additional information and resources.
