TrustedTester

A standardized approach for manual inspection of Web content for conformance with the Revised Section 508 Standards.

DRAFT Trusted Tester - Section 508 Conformance Test Process for Web

United States Federal Chief Information Officers Council (CIOC) Accessibility Community of Practice (ACOP)

November 2018 | Version 5.0 DRAFT

About This Document

Who Should Use This Document

This document has been designed for and is intended for use by Trusted Testers.

A Trusted Tester is a person certified to provide accurate and repeatable Revised 508 conformance test results for web content. A Trusted Tester follows the Revised Section 508 Conformance Test Process for Web, uses approved testing tools, and evaluates web applications for conformance with Revised 508 standards. Trusted Testers are those who have passed the Trusted Tester Certification Exam.

For more information on the Revised 508 Trusted Tester Training Course and Exam, contact the Department of Homeland Security Office of Accessible Systems & Technology (OAST) Accessibility Helpdesk at accessibility@dhs.gov.

Harmonized Baseline Alignment

This test process incorporates all tests in the "Harmonized Processes for Revised 508 Testing: Baseline Tests for Web Accessibility" version 3.0. The baseline tests established the minimum steps required to determine compliance with Revised 508 standards and WCAG 2.0 Level A and AA. Test instructions that are specific to Trusted Tester only are identified with *TT-specific* or "[no baseline]." The outcomes of these tests will be reflected only in Revised 508 test results.

Baseline test results will be reported separately and are not affected by Trusted Tester-specific tests.

How This Document is Structured

Web Content Tests Only

This test process covers web content only. Trusted Tester versions 4.x and older were developed for the original Section 508 standards, which had separate requirements for web and software. With the Revised 508 standards applying WCAG 2.0 Level A and AA to web, software, and other electronic content, the original plan was to combine software and web in one test process.

However, because the Revised 508 Standards have other requirements for software in addition to WCAG 2.0, and software testing tools were not yet available, the software test process will be separate. While less common due to HTML5 capabilities, software elements are still found alongside web content in web applications. To test those software elements, use the software test process.

Similarly, content delivered on other operating systems, browsers, or platforms, such as mobile tablets, must be evaluated using other testing procedures.

Testing Order

The numbering of the tests within the test process does not necessarily indicate the order in which tests must be performed. Each tester and each application may determine the optimal testing order for coverage and productivity.

It is recommended, however, that the first test performed is Test 1 Conforming Alternate Version. Identifying conforming alternate versions of content helps define the scope of testing and avoids unnecessary testing. Non-conforming content that has a conforming alternate version is excluded from testing.

Test 2 Auto-Playing and Auto-Updating Content and Test 3 Flashing are next in the test process, followed by Test 4 Keyboard Access and Focus. These tests cover WCAG success criteria included in Conformance Requirement 5, Non-Interference. Failure to meet these success criteria could interfere with any use of the page and may indicate critical accessibility issues.

The test process was designed to streamline the sequence for testers; however, testers may choose to perform the tests (after Test 1) in their preferred order. Each Test Condition (after Test 1) is independent of the tests that precede and follow it.

Issues Not Covered in This Test Process

Problems may be found during testing that affect accessibility but are simply coding errors, and these often affect general usability for all users. An example is a link that leads to the wrong target website. Testers may notify a developer of these issues as a comment on a report, but such issues do not typically result in a compliance failure because they are beyond the scope of the Revised Section 508 Standards.

The Rationale for Each Test

Previous versions of the Trusted Tester process document provided a rationale for each test based on interpretation of the Section 508 standards. With the Section 508 standards refresh and adoption of the WCAG 2.0 Success Criteria, this version of the test process relies principally on the rationale provided in Understanding WCAG 2.0: A guide to understanding and implementing Web Content Accessibility Guidelines 2.0. The test process also relies on accompanying Trusted Tester training to provide additional description and guidance for understanding the logic that drives each test. Each step included in this test process document includes only the information necessary to execute the test. However, the Applicable Standards section of each test references the applicable WCAG 2.0 or Section 508 standard along with a link to the applicable article from the Understanding WCAG 2.0 document.

Test Environment

At the initial release of this document, only the operating systems and browsers specified below were validated with the test process and tools to ensure that results were consistent and accurate. The list of supported operating systems and browsers is expected to grow. Please refer to the Trusted Tester Test Environment Installation and Configuration Guide at https://www.dhs.gov/dhs-section-508-compliance-testing-tools for the most up-to-date test environment information.

Testing Tools

The tools used in the Test Process (and Baseline tests) have been chosen based on several factors including ease of use, ease of teaching, and accuracy of results. They are also free to install and use. The tools assist the tester with verification of accessibility properties. This test process is essentially a code inspection for accessibility properties, but the tools reduce the need for a tester to view source code or have in-depth knowledge of programming languages.

ANDI

ANDI (Accessible Name & Description Inspector) is a free open-source tool to test web content for accessibility. Developed by the US Social Security Administration, ANDI is available at https://www.ssa.gov/accessibility/andi/help/install.html. ANDI is a bookmarklet that can be installed easily on multiple browsers.

ANDI issues may be reported to the ANDI GitHub page: https://github.com/SSAgov/ANDI/issues.

Color Contrast Analyzer

The Color Contrast Analyzer (CCA) is a free open-source tool that displays the contrast ratio for two selected colors. Developed by Steve Faulkner and the Paciello Group, CCA is available at the following links:

Operating Systems

The following operating systems were validated:

  • Windows 10

  • macOS

Although Windows 10 and macOS are the only operating systems listed, no issues attributable to the use of another operating system have been identified. The operating system has little to no impact on web testing results, which depend more on the browser.

Browsers

The following browsers were validated:

On Windows 10:

On macOS:

Use of newer versions of these browsers is acceptable unless otherwise specified in the Trusted Tester Test Environment Installation and Configuration Guide at https://www.dhs.gov/dhs-section-508-compliance-testing-tools.

As browsers are frequently updated, an update may create critical issues for test procedures or results. Critical issues and modifications will be published as quickly as possible. Known issues and differences in browser support will be reported in the Trusted Tester Test Environment Installation and Configuration Guide at https://www.dhs.gov/dhs-section-508-compliance-testing-tools.

Conformance Reporting Requirements

Test outcomes or results are the primary output of Section 508 conformance testing. Trusted Tester results may have multiple audiences, including developers, purchasers, internal IT management personnel, and IT project management. Each audience has different uses for Trusted Tester results, so a sufficient set of information must be included to support all audiences to the extent possible. Given that this Trusted Tester process provides a set of evaluations that can be used to determine WCAG 2.0 Level A and AA conformance, Trusted Tester results may also be used outside the U.S. Federal Section 508 conformance scope. However, such use, while not incompatible with the Trusted Tester process, is not the primary purpose of this document.

One of the primary objectives of the Section 508 law is to promote improved IT accessibility through Federal agencies selecting “more accessible” over “less accessible” ICT over time. Consistent, well-documented use of the Accessibility Conformance Report (ACR) format (an update to the VPAT) from the Information Technology Industry Council (ITI) supports evaluating overall conformance to make such selections and therefore supports this primary objective of the Section 508 law. Trusted Tester results must, at minimum, be provided following the ITI Accessibility Conformance Report format. However, the ACR format must be supplemented with specific Trusted Tester test outcomes, which must then be aggregated to determine the “supported” and “not supported” outcomes for individual WCAG Success Criteria.

Each ACR must provide:

In general, individual Trusted Tester results should provide specific Trusted Tester process outcomes: PASS, FAIL, DOES NOT APPLY, or NOT TESTED. FAIL results should also include clear information identifying the location of the failure and, when feasible, clear information illustrating the content or information that resulted in the FAIL result. When multiple instances of the same failure are found, they may be flagged as global or included individually within test results. The test results must include, at minimum, a reference to the cross-reference appendix in this document to explain how individual Trusted Tester process tests and Baseline tests affect “supported” or “not supported” results for each of the WCAG Success Criteria.

Section 508 Conformance Tests

Each of the Test Conditions included in each test section below is a statement that can be evaluated as TRUE or FALSE. The “How to Test” content included under each Test Condition provides instructions on how to evaluate whether content PASSES the condition.

1. Conforming Alternate Version and Non-Interference

The Conforming Alternate Version Test Conditions deviate slightly from the remaining tests in a few important ways:

Alternate Version Accessibility

Identify Content

Identify multiple versions of the same content, i.e., content that has been provided in more than one way with the explicit or implicit intent that one or more of the versions serves as an accessible alternative.

  1. Alternate versions may be provided for a part of the page, entire pages, or an entire site.

  2. Various methods may indicate that an alternate version is available, including:

    • Instructions that describe how to enable accessibility

    • Content is identified as the accessible version

    • Multiple methods are provided to complete a task (e.g. a calendar widget and a text field are provided for a user to enter a date)

    • A link’s destination is an accessible alternate version or a version for assistive technology (e.g. screen reader version)

    • User preferences or settings to enable accessibility

    • User controls to modify colors and text appearance

  3. The alternate version(s) does not need to reside within the scope of conformance, or even on the same web site, as long as it is as freely available as the non-conforming version. For this test process, scope of testing is limited to alternatives that are available on a desktop computer. Alternate versions do not include mobile applications that can only be accessed on a mobile device.

If there is only one version of content, the result for the following test ID(s) is DOES NOT APPLY: 1.A to 1.E.

Check that:

Test Name Test ID Test Condition / How to Test
alt-version-conformant 1.A

An alternate version passes all applicable Test Conditions in this test process.

Applicability:

This Test Condition DOES NOT APPLY (DNA) if there is only one version of content.

How to Test:

  1. Enable accessibility with site settings if provided.

  2. If one of the versions has been identified for accessibility purposes, consider that version the alternate version and provide test results for that version only in this Test ID.

  3. Following this test process, test the identified “accessible” version for all applicable Test Conditions.

    1. It may be necessary to perform extensive testing before a result for this Test Condition can be found. If user settings are provided to enable accessibility for the entire site, the entire site is the alternate version, and the entire site must pass all Test Conditions.

    2. If a failure is found in the identified ‘accessible’ version, enter FAIL for the appropriate Test ID. It is not necessary to continue testing for this test (ID 1.A) after a failure has been found. Evaluate results.

    3. If none of the versions is identified for accessibility, test all versions until a version that passes all applicable Test Conditions is found.

      1. If a failure is found in a version, stop testing that version and test the other version(s).

Evaluate Results:

If the following is TRUE, then the content PASSES; if the following is FALSE, then this Test Condition DOES NOT APPLY (DNA):

  1. There is an accessible alternate version of content that passes all applicable Test Conditions in this test process.

Note:

  • Alternate versions may be provided to accommodate different technology environments or user groups. At least one version would need to pass all Test Conditions.

  • It may be helpful to review product documentation for information about accessible versions or enabling accessibility.

  • If this evaluation is TRUE, continue to 1.B. The accessible alternate version (that passed this 1.A. test) will be referred to as the “accessible version” in subsequent tests. Remove any test results for other (inaccessible) versions of the content from the test report.

  • If this evaluation is FALSE, mark the results for the remaining Conforming Alternate Version Test Conditions as DOES NOT APPLY (DNA). All versions of the content should be tested and included in the test report.

  • While testing for the Alternate Version Test Conditions may halt when the identified “accessible” version has a failed Test Condition, further testing of the version may be necessary to complete the other tests in this test process. When there is no conforming alternative, all versions of the content should be tested and included in the test report.

alt-version-equivalent 1.B

The accessible version is up to date and provides the same information and functionality.

Applicability:

This Test Condition DOES NOT APPLY if there is only one version of the content, or if Test ID 1.A was evaluated as DOES NOT APPLY (DNA).

How to Test:

  1. Continue from Test 1.A.

  2. Review the content of the non-conforming version.

  3. Compare the accessible version (that passed Test 1.A) with the non-conforming version for equivalence (i.e., equivalent information and functionality).

Evaluate Results:

If ALL of the following are TRUE, then the content PASSES. If any of the following is FALSE, then this Test Condition DOES NOT APPLY (DNA):

  1. The accessible version (that passed 1.A) provides all of the same information and functionality in the same human language as the non-conforming content, AND

  2. The accessible version (that passed 1.A) is as up to date as the non-conforming content.

Note:

  • The accessible version does not need to be matched page for page with the original (e.g., the accessible version may consist of more or fewer pages).

  • If this evaluation is TRUE, continue to 1.C. The accessible version (that passed this 1.B. test) will be referred to as the “accessible equivalent version” in subsequent tests. Remove any test results for other (inaccessible) versions of the content from the test report.

  • If this evaluation is FALSE, mark the results for this and the remaining Conforming Alternate Version Test Conditions as DOES NOT APPLY (DNA). All versions of the content should be tested and included in the test report.

  • While testing for the Alternate Version Test Conditions may halt when the identified “accessible” version has a failed Test Condition, further testing of the version may be necessary to complete the other tests in this test process. When there is no conforming alternative, all versions of the content should be tested and included in the test report.

Access to the Accessible Alternate Version

Identify Content

Identify any page with non-conforming content that also provides a mechanism, method, or path for the user to reach the accessible equivalent version of the content (the version that passed Test 1.B).

Various methods may be used to reach an alternate version, including:

If the test result for 1.A or 1.B is DOES NOT APPLY, the result for the following test ID(s) is DOES NOT APPLY: 1.C to 1.E.

Check that:

Test Name Test ID Test Condition / How to Test
alt-version-access 1.C

The mechanism to reach the accessible equivalent version from the non-conforming page is accessible.

Applicability:

This Test Condition DOES NOT APPLY if there is only one version of content, or if Test ID 1.A or 1.B was evaluated as DOES NOT APPLY (DNA).

How to Test:

  1. Continue from Test 1.B.

  2. Identify the mechanism used to reach the accessible equivalent version (that passed Test 1.B).

    1. If necessary, go back to the page that provided the mechanism to select or enable the accessible version.

    2. This may be a link or some function that is on a non-conforming page.

    3. This may be user settings or preferences.

  3. Following this test process, test the mechanism for all applicable Test Conditions.

Evaluate Results:

If the following is TRUE, then the content PASSES. If the following is FALSE, then this Test Condition DOES NOT APPLY (DNA):

  1. The mechanism used to reach the accessible equivalent version passes all applicable Test Conditions.

Note:

  • Applicable tests may include Links and Buttons, Headings, Forms, and/or other tests.

  • The mechanism to access the conforming version should directly or indirectly indicate that it leads to the accessible version. For example, text preceding a link to the accessible version might directly state that the link leads to the accessible version. It may also be possible to “hide” non-conforming content from AT and/or exclude it from keyboard focus, thereby limiting access only to the accessible version for users with disabilities. Such an approach, however, may not be possible, depending on the content.

  • The mechanism may be explicitly provided in the content or may be relied upon to be provided by either the platform or by user agents, including assistive technologies.

  • The mechanism needs to meet all success criteria for the conformance level claimed.

  • If the result for this Test Condition is PASS, it is not necessary to perform Test 1.D, and the result for 1.D is DOES NOT APPLY (DNA). An alternate version that has passed Tests 1.A, 1.B, and 1.C has met the requirements for conforming alternate versions. Continue to Test 1.E.
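
For illustration only, here is a minimal sketch of such a mechanism: a plain text link on the non-conforming page that identifies and leads to the accessible equivalent version. The file name and link text are assumptions, not part of this test process, and the link itself would still need to pass the applicable Test Conditions (e.g., Links and Buttons).

  <!-- Hypothetical link on the non-conforming page pointing to the accessible equivalent version -->
  <p>
    An accessible version of this report is available:
    <a href="quarterly-report-accessible.html">Accessible version of the quarterly report</a>
  </p>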

Access to the Non-Conforming Version

Identify Content

Identify all mechanisms, methods, and paths for a user to reach the non-conforming version(s) of the content.

If the test result for 1.A or 1.B is DOES NOT APPLY, the result for the following test ID(s) is DOES NOT APPLY: 1.D to 1.E.

Check that:

Test Name Test ID Test Condition / How to Test
alt-version-nc-access 1.D

The non-conforming version(s) can only be reached from conforming content.

Applicability:

This Test Condition DOES NOT APPLY if there is only one version of the content, or if Test ID 1.A or 1.B was evaluated as DOES NOT APPLY (DNA).

How to Test:

  1. Identify how the non-conforming version(s) of the content can be accessed:

    1. Are there multiple links to the non-conforming version(s) within the site?

    2. Is there also a way to access the accessible equivalent version (that passed Test 1.B) wherever there is a way to access the non-conforming version? Does the page that provides these pass all applicable Test Conditions?

Evaluate Results:

If any of the following is TRUE, then the content PASSES. If all of the following are FALSE, then this Test Condition DOES NOT APPLY (DNA):

  1. The non-conforming version can only be reached from the accessible equivalent version (that passed Test ID 1.B), OR

  2. The non-conforming version can only be reached from a page that passes all other Test Conditions and also provides a mechanism to reach the accessible equivalent version (that passed Test ID 1.B).

Note:

  • For Evaluate Results item 2 above to be TRUE, both the page and the mechanism must pass all applicable Test Conditions. Following this test process, test the page and the mechanism for all applicable Test Conditions.

  • An alternate version that has passed Test 1.C or 1.D (in addition to Tests 1.A and 1.B) has met the requirements for conforming alternate versions. The result must be PASS, not DNA. Continue to Test 1.E.

Non-Interference

Identify Content

Identify the non-conforming version(s) of the content that have a conforming alternate version.

If the test result for 1.A or 1.B is DOES NOT APPLY, the result for the following test ID is DOES NOT APPLY: 1.E.

Check that:

Test Name Test ID Test Condition / How to Test
non-interference 1.E

Content in the non-conforming version(s) meets Conformance Requirement 5.

Applicability:

This Test Condition DOES NOT APPLY if there is only one version of content, or if Test ID 1.A or Test 1.B was evaluated as DOES NOT APPLY (DNA).

How to Test:

  1. The non-conforming content that has a conforming alternate version must be tested prior to omitting the content from the rest of testing.

  2. If necessary and/or applicable, disable accessibility features within site settings or preferences.

  3. Following this test process, complete ONLY the following tests on the non-conforming version(s) of the content:

    1. Test ID 2.A (1.4.2-audio-control)

    2. Test ID 2.B (2.2.2-blinking-moving-scrolling)

    3. Test ID 2.C (2.2.2-auto-updating)

    4. Test ID 3.A (2.3.1-flashing)

    5. Test ID 4.C (2.1.2-no-keyboard-trap)

  4. Note any instances where any of the tests above produces a result of FAIL or NOT TESTED (for Test ID 3.A).

Evaluate Results:

If the following is TRUE, then the content PASSES. If the following is FALSE, then this Test Condition DOES NOT APPLY (DNA):

  1. The results for each of the following tests are PASS or DOES NOT APPLY for the non-conforming version(s) of the content.

    1. Test ID 2.A (1.4.2-audio-control)

    2. Test ID 2.B (2.2.2-blinking-moving-scrolling)

    3. Test ID 2.C (2.2.2-auto-updating)

    4. Test ID 3.A (2.3.1-flashing)

    5. Test ID 4.C (2.1.2-no-keyboard-trap)

Note: After performing this test on the non-conforming version of content that has a conforming alternate version, omit the non-conforming content from the remainder of testing.

Applicable Standards

Section 508/WCAG Success Criteria and Baseline Requirements
Conforming alternate version is not a requirement. Conformance Requirement 1 allows non-conforming pages to be included within the scope of conformance as long as they have a "conforming alternate version". This ensures that all of the information and all of the functionality on the pages inside the scope of conformance is available on conforming web pages.

Baseline Requirement: 20. Conforming Alternate Versions

WCAG Conformance Requirement 5. Non-Interference: The following success criteria apply to all content on the page, including content that is not otherwise relied upon to meet conformance, because failure to meet them could interfere with any use of the page:

  • 1.4.2 - Audio Control,

  • 2.1.2 - No Keyboard Trap,

  • 2.3.1 - Three Flashes or Below Threshold, and

  • 2.2.2 - Pause, Stop, Hide.

Baseline Requirement: 25. Non-Interference

2. Auto-Playing and Auto-Updating Content

Auto-Playing Audio

Identify Content

Identify audio content that automatically plays (without user activation) for more than 3 seconds.

If there is no such content, the result for the following test ID(s) is DOES NOT APPLY: 2.A.

Check that:

Test Name Test ID Test Condition / How to Test
1.4.2-audio-control 2.A

The user can pause, stop, or control the volume of audio content that plays automatically.

Applicability:

This Test Condition DOES NOT APPLY (DNA) if there is no audio content that plays automatically.

How to Test:

  1. Determine if there is a mechanism at the top of the page for the user to pause or stop the audio, or to control the volume of only the auto-playing audio.

    1. The browser should already have been configured to disable auto-play. (See Tool Configuration section for instructions.)

  2. Following this test process, test the mechanism for all applicable Test Conditions.

  3. Activate the mechanism.

Evaluate Results:

If ALL of the following are TRUE, then the content PASSES:

  1. There is a mechanism that can pause or stop the audio or control the volume of only the auto-playing audio, AND

  2. The mechanism passes all applicable Test Conditions in this test process.
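
As a hedged illustration (not part of the official test process), the sketch below pairs auto-playing audio with a pause control placed first on the page; the file name and element IDs are assumptions. Native browser controls (the controls attribute) would be another way to provide such a mechanism.

  <!-- Auto-playing audio with a user-operable pause mechanism provided at the top of the page -->
  <button type="button" onclick="document.getElementById('promo-audio').pause()">
    Pause background audio
  </button>
  <audio id="promo-audio" src="promo.mp3" autoplay></audio>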

Moving, Blinking, and Scrolling Content

Identify Content:

Identify visual content that:

  • Moves, blinks, or scrolls, AND

  • Starts automatically, AND

  • Lasts more than five seconds, AND

  • Is presented in parallel with other content.

Content of this type includes scrolling text, videos, and multimedia.

EXCLUDE content where the movement, blinking, or scrolling of the content is essential.

If there is no such content, the result for the following test ID(s) is DOES NOT APPLY: 2.B.

Check that:

Test Name Test ID Test Condition / How to Test
2.2.2-blinking-moving-scrolling 2.B

The user can pause, stop, or hide moving, blinking, or scrolling content.

Applicability:

This Test Condition DOES NOT APPLY (DNA) if there is no moving, blinking, or scrolling content.

How to Test:

  1. Determine if there is a mechanism for the user to pause, stop, or hide the content.

  2. Following this test process, test the mechanism for all applicable Test Conditions.

  3. Activate the mechanism.

Evaluate Results:

If ALL of the following are TRUE, then the content PASSES:

  1. There is a mechanism that can pause, stop, or hide the content, AND

  2. The mechanism PASSES all applicable Test Conditions in this test process.
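
For illustration only, a minimal sketch of a pause mechanism for scrolling content; the element IDs are assumptions, and the scrolling is assumed to be driven by a CSS animation. The button itself would also need to pass the applicable Test Conditions (for example, keyboard access).

  <div id="news-ticker" aria-label="Latest headlines">(scrolling headlines)</div>
  <button type="button" id="ticker-toggle">Pause headlines</button>
  <script>
    // Hypothetical toggle: pauses or resumes the CSS animation assumed to scroll the ticker
    const ticker = document.getElementById('news-ticker');
    const toggle = document.getElementById('ticker-toggle');
    toggle.addEventListener('click', function () {
      const paused = ticker.style.animationPlayState === 'paused';
      ticker.style.animationPlayState = paused ? 'running' : 'paused';
      toggle.textContent = paused ? 'Pause headlines' : 'Resume headlines';
    });
  </script>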

Auto-Updating Information

Identify Content

Identify content that:

  • Updates automatically, AND

  • Starts automatically, AND

  • Is presented in parallel with other content.

Content of this type includes timers, stock tickers, and counters.

Note:

If there is no such content, the result for the following test ID(s) is DOES NOT APPLY: 2.C and 2.D.

Check that:

Test Name Test ID Test Condition / How to Test
2.2.2-auto-updating 2.C

The user can pause, stop, hide, or control the frequency of automatically updating content.

Applicability:

This Test Condition DOES NOT APPLY (DNA) if there is no auto-updating content.

How to Test:

  1. Determine if there is a mechanism for the user to pause, stop, or hide the content or to control the frequency of the update.

  2. Following this test process, test the mechanism for all applicable Test Conditions.

  3. Activate the mechanism.

Evaluate Results:

If ALL of the following are TRUE, then the content PASSES:

  1. There is a mechanism that can pause, stop, or hide the content or control the frequency of the update, AND

  2. The mechanism passes all applicable Test Conditions in this test process.

Notification of Automatic Content Changes

Identify Content

Identify content that changes automatically on the page as part of auto-update.

If there is no such content, the result for the following test ID(s) is DOES NOT APPLY: 2.D.

Check that:

Test Name Test ID Test Condition / How to Test
4.1.2-change-notify-auto 2.D

The page provides notification of each automatic update/change in content.

Applicability:

This Test Condition DOES NOT APPLY (DNA) if the page content does not update or change automatically.

How to Test:

  1. Identify how the user is notified of the change in content.

    1. Identify any dialogs that alert the user to changes in content.

      1. Determine whether the dialogs provide sufficient programmatic notification of content changes.

    2. Identify content changes that result in focus moving to the content that has changed.

      1. Determine whether moving the focus to the content that has changed is sufficient to notify the user of the change event (e.g., by describing the change directly in the content to which the focus moved).

    3. Identify content changes occurring in an ARIA Live Region:

      1. Launch ANDI: structures

      2. Click the “live regions” link, then use the mouse to hover over any identified live region (alternatively, use ANDI’s previous/next element buttons to navigate to identified Live Regions).

      3. Determine whether the changing content is contained within the Live Region.

Evaluate Results:

If any of the following is TRUE, the content PASSES:

  1. The page notifies the user about a change via a keyboard-accessible dialog, OR

  2. The page moves focus to the content that has changed, AND the content that has changed provides sufficient description about the change (see Test ID 6.B), OR

  3. The content that has changed is contained in an ARIA Live Region.

Note:

  • This is a test for notification of changes to content. Testing of the content before and after the change is performed in other tests. For example, form elements that changed during this test are to be tested per Test ID 5.B.
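
As a non-authoritative sketch, the markup below shows automatically changing content contained in an ARIA live region, which ANDI: structures would identify under the "live regions" link; the ID and text are assumptions.

  <!-- Automatically updating text placed inside a live region so changes are announced -->
  <div id="score-board" aria-live="polite">
    Current score: 3 to 2
  </div>
  <!-- A script that updates the text inside #score-board needs no additional notification,
       because the change occurs within the live region (Evaluate Results option 3). -->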

Applicable Standards

Section 508/WCAG Success Criteria and Baseline Requirements

WCAG SC 2.2.2 Pause, Stop, Hide: For moving, blinking, scrolling, or auto-updating information, all of the following are true:

  • Moving, blinking, scrolling: For any moving, blinking or scrolling information that (1) starts automatically, (2) lasts more than five seconds, and (3) is presented in parallel with other content, there is a mechanism for the user to pause, stop, or hide it unless the movement, blinking, or scrolling is part of an activity where it is essential.

  • Auto-updating: For any auto-updating information that (1) starts automatically and (2) is presented in parallel with other content, there is a mechanism for the user to pause, stop, or hide it or to control the frequency of the update unless the auto-updating is part of an activity where it is essential.

  • Note 2: Since any content that does not meet this success criterion can interfere with a user's ability to use the whole page [or software application], all content [in the software or] on the Web page (whether it is used to meet other success criteria or not) must meet this success criterion. See Conformance Requirement 5: Non-Interference.

WCAG SC 1.4.2 Audio Control: If any audio on a Web page plays automatically for more than 3 seconds, either a mechanism is available to pause or stop the audio, or a mechanism is available to control audio volume independently from the overall system volume level.

  • Note: Since any content that does not meet this success criterion can interfere with a user's ability to use the whole page, all content on the Web page (whether it is used to meet other success criteria or not) must meet this success criterion. See Conformance Requirement 5: Non-Interference.

Baseline Requirement: 21. Timed Events

WCAG SC 4.1.2 Name, Role, Value: For all user interface components (including but not limited to: form elements, links and components generated by scripts), the name and role can be programmatically determined; states, properties, and values that can be set by the user can be programmatically set; and notification of changes to these items is available to user agents, including assistive technologies.

Baseline Requirement: 5. Changing Content

3. Flashing

Flashing Content

Identify Content

Visually identify any content that flashes.

When flashing content IS found, the result for Test ID 3.A (Test Name: 2.3.1-flashing) is NOT TESTED.

Note:

If there is no such content, the result for the following test ID(s) is DOES NOT APPLY: 3.A.

Applicable Standards

Section 508/WCAG Success Criteria and Baseline Requirements

WCAG SC 2.3.1 Three Flashes or Below Threshold: Web pages do not contain anything that flashes more than three times in any one second period, or the flash is below the general flash and red flash thresholds.

  • Note: Since any content that does not meet this success criterion can interfere with a user's ability to use the whole page, all content on the Web page (whether it is used to meet other success criteria or not) must meet this success criterion. See Conformance Requirement 5: Non-Interference. Until such time that this test process includes a test for flashing content, no definitive statement can be made regarding Conformance Requirement 5 if any flashing content is present.

Baseline Requirement: 9. Flashing

4. Keyboard Access and Focus

Keyboard Access

Identify Content

Use the mouse or other pointing device to determine available functions provided by interactive elements (including drop-down menus, form fields, revealing/hiding content, tooltips, AND all interactive interface components).

Note:

If there is no such content, the result for the following test ID(s) is DOES NOT APPLY: 4.A to 4.I.

Check that:

Test Name Test ID Test Condition / How to Test
2.1.1-keyboard-access 4.A

All functionality can be accessed and executed using only the keyboard.

Applicability:

This Test Condition DOES NOT APPLY (DNA) if the page has no user activated functionality.

How to Test:

  1. Identify the functionality provided by the interactive element.

  2. Use the keyboard to operate the interactive element: access (e.g., tab to) the element and execute (e.g. press Enter with focus on) the element.

  3. If an interactive element does not have keyboard access, determine if there is another keyboard accessible method available on the page which provides the same functionality, e.g. one of two print methods provided is keyboard accessible. [See Conforming Alternate Version for further details.]

Evaluate Results:

If the following is TRUE, the content PASSES:

  1. All functionality can be accessed and executed using the keyboard.

Note:

  • Any changes to functionality that occur automatically or as a result of interaction with the page should be included in this test.
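
For illustration only (the class name and handlers are assumptions), the sketch below contrasts a custom control that would likely fail this Test Condition with a native control that is keyboard operable by default.

  <!-- Likely FAIL: a click-only "button" that cannot be reached or activated with the keyboard -->
  <div class="print-button" onclick="window.print()">Print</div>

  <!-- Likely PASS: a native button receives keyboard focus and activates with Enter or Space -->
  <button type="button" onclick="window.print()">Print</button>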

2.1.1-no-keystroke-timing 4.B

Individual keystrokes do not require specific timings for activation of functionality.

Applicability:

This Test Condition DOES NOT APPLY (DNA) if the page has no user activated functionality.

How to Test:

  1. Continue from Test 4.A.

  2. Determine whether there are any instances where the timing of keystrokes is required to activate the element, e.g., the speed at which password keystrokes are typed is part of the password authentication.

  3. If there is timing-dependent functionality, determine if there is another keyboard-accessible method available on the page that does not require specific timing.

Evaluate Results:

If the following is TRUE, the content PASSES.

  1. A keyboard method is provided for functionality to be activated without requiring users to perform specific timings for activation.

2.1.2-no-keyboard-trap 4.C

There is no keyboard trap.

Applicability:

This Test Condition DOES NOT APPLY (DNA) if the page has no components that can receive keyboard focus.

How to Test:

  1. Tab through the entire page of keyboard focusable elements.

  2. Determine whether there are any instances where keyboard navigation becomes trapped:

  • Keyboard users are unable to move away from an element, e.g. using a TAB or arrow key

  • Keyboard access is restricted to a small section of the page with no way to navigate out of the “loop” to the rest of the page.

  3. If a keyboard trap is found:

    1. Inspect any help (contextual help, or application help) and documentation for notification of available alternate keyboard commands (e.g., non-standard keyboard controls, access keys, hotkeys).

    2. Determine whether the alternate command(s) work.

Evaluate Results:

If ALL of the following are TRUE, the content PASSES:

  1. Keyboard focus can be moved away from an interactive component using either

    1. Standard navigation keys

    2. Custom keystrokes (which are documented and available to users in the application).

AND

  2. Keyboard focus can be moved away from interactive elements in each section of the page (and is not trapped in a “loop” preventing access to other interactive elements on the page) by using either

    1. Standard navigation keys

    2. Custom keystrokes (which are documented and available to users in the application)


    Note:

    • In case of a keyboard trap, continue to test interactive elements after the trap by using the mouse to bypass the trap or refreshing the page and using the keyboard to navigate backwards through the page.


    Keyboard Access for Title Attribute Tooltips

    Identify Content

    1. Use ANDI: focusable elements > title attributes button to identify all title attributes.

    2. Identify instances where the title attribute provides information that is essential to understanding or operating the page content.

    If there is no such content, the result for the following test ID is DOES NOT APPLY: 4.D.

    Check that:

    Test Name Test ID Test Condition / How to Test
    2.1.1-title-keyboard-access 4.D

    Title attribute information that is essential or required to complete an activity can be accessed using only the keyboard.

    Applicability:

    This Test Condition DOES NOT APPLY (DNA) if the page has no title attributes or the title attributes do not provide information that is essential to understanding or operating the page content.

    How to Test:

    1. Identify each title attribute that provides information that is essential or required to complete an activity.

    2. Determine if the essential title attribute information is available elsewhere on the page.

    Evaluate Results:

    If the following is TRUE, the content PASSES:

    1. Essential title attribute information is also available on the page, without relying on the title attribute tooltip alone.

    Note:

    • Title attribute information is considered essential or required when the information in the title attribute is necessary to execute an action or understand information and relationships.

    • Title/tooltip information that is not essential does not require keyboard access.

    • Not all browsers display the title attribute as a tooltip when an element has keyboard focus. Therefore, developers CANNOT rely on the visual display of title attribute information to convey essential information. The title attribute information may not be apparent to a keyboard-only user who is NOT also using a screen reader.
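
    As a minimal sketch (the field names and format are assumptions), the markup below shows essential format information made available as visible label text rather than only in a title tooltip.

      <!-- Essential information conveyed only through the title attribute (at risk of failing this test): -->
      <label for="dob1">Date of birth</label>
      <input type="text" id="dob1" title="Use MM/DD/YYYY format">

      <!-- The same information also available as visible text on the page (passes this test): -->
      <label for="dob2">Date of birth (MM/DD/YYYY)</label>
      <input type="text" id="dob2">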

    Focus

    Identify Content

    Use the keyboard to navigate to keyboard-accessible interface components (including drop-down menus, form fields, revealing/hiding content, tooltips, AND all interactive interface components).

    If there is no such content, the result for the following test ID(s) is DOES NOT APPLY: 4.A to 4.I.

    Check that:

    Test Name Test ID Test Condition / How to Test
    2.4.7-focus-visible 4.E

    A visible indication of focus is provided when focus is on the interface component.

    Applicability:

    This Test Condition DOES NOT APPLY (DNA) if the page has no elements that can receive keyboard focus.

    How to Test:

    1. Continue from Test 4.C.

    2. Determine whether there is a visible indication of focus on the element that has keyboard focus.

    Evaluate Results:

    If the following is TRUE, then the content PASSES:

    • When each interface element receives focus, there is a visible indication of focus.
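
    As an illustration under stated assumptions (the selectors and colors are examples only), the CSS sketch below provides a clearly visible focus indicator for links and buttons whose default outline has been removed by page styling.

      <style>
        /* Illustrative only: show a visible indicator whenever links or buttons receive keyboard focus */
        a:focus,
        button:focus {
          outline: 3px solid #1a73e8;
          outline-offset: 2px;
        }
      </style>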

    3.2.1-on-focus 4.F

    When an interface component receives focus, it does not initiate an unexpected change of context.

    Applicability:

    This Test Condition DOES NOT APPLY (DNA) if the page has no elements that can receive keyboard focus.

    How to Test:

    1. Continue from Test 4.E.

    2. When the interface component receives focus, evaluate whether an unexpected change of context occurs, e.g., a new window is launched, or focus is moved to another interface component.

    Evaluate Results:

    If the following is TRUE, then the content PASSES:

    1. An unexpected change of context is not initiated when an interface component receives focus.

    2.4.3-focus-order-meaning 4.G

    The focus order preserves the meaning and operability of the web page.

    Applicability:

    This Test Condition DOES NOT APPLY (DNA) if the page has no elements that can receive keyboard focus.

    How to Test:

    1. Use the tab key to move focus through the page.

    2. Determine if the focus order impacts the page meaning (e.g. form fields for a mailing address are presented in the expected sequence).

      1. This is most often noticeable when focus order does not follow the logical order of operation (normally top to bottom, left to right).

      2. For modal dialog boxes, visual focus should remain within the modal dialog box until it is closed.

      3. It may be helpful to launch ANDI: focusable elements and select tab order.

    Evaluate Results:

    If ALL of the following are TRUE, then the content PASSES:

    1. The focus order preserves the meaning of the page, AND

    2. The focus order preserves the operability of the page.

    Note:

    • ANDI tab order markup may be slightly different from actual keyboard tab order in certain browsers. Always use the results from keyboard tab order.
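
    For illustration only (the field names are assumptions), a source order that produces a logical default tab order without positive tabindex values, which commonly disrupt focus order:

      <!-- Fields appear in DOM order, so the default tab order matches the expected sequence -->
      <label for="street">Street address</label> <input id="street" type="text">
      <label for="city">City</label> <input id="city" type="text">
      <label for="state">State</label> <input id="state" type="text">
      <label for="zip">ZIP code</label> <input id="zip" type="text">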

    2.4.3-focus-order-reveal 4.H

    Focus is moved to revealed content.

    Applicability:

    This Test Condition DOES NOT APPLY (DNA) if no content is revealed when triggered by other elements when navigating or interacting with page content (e.g. submenu options, drop down selections).

    How to Test:

    1. Use the keyboard to activate trigger controls that reveal hidden content (e.g., menus, dialogs, modal dialog boxes, expandable tree list).

    2. Advance the focus through the revealed content using the TAB key.

    1. Visible focus may not move to revealed content that does not contain focusable elements or if the revealed content is not itself focusable. Nevertheless, the content must be navigable by AT using the virtual cursor or system caret in logical sequence. This may not be possible to determine without the use of AT.

    Evaluate Results:

    If any of following is TRUE, then the content PASSES:

    1. Keyboard focus moves directly to revealed content, OR

    2. One additional keystroke moves the focus to revealed content

    Note:

    • For modal dialog boxes, visual focus should remain within the modal dialog box until it is closed. (This is covered by Test 4.G)

    2.4.3-focus-order-return 4.I

    Focus is returned to the logical sequence.

    Applicability:

    This Test Condition DOES NOT APPLY (DNA) if there are no interactive elements that initiate an action.

    How to Test:

    1. Continue from Test 4.H

    2. If it is possible to close, hide, or dismiss the revealed content, use the keyboard to close/hide it and/or move focus out of the revealed content.

    3. Identify the element that has keyboard focus.

      1. It may be necessary to press SHIFT + TAB or an arrow key to move focus backwards.

    Evaluate Results:

    If any of following is TRUE, then the content PASSES:

    1. Keyboard focus automatically returns to the logical sequence of focus order before the content was revealed, OR

    2. One additional keystroke or keystroke combination returns focus to the logical sequence of focus order before the content was revealed.

    Applicable Standards

    Section 508/WCAG Success Criteria and Baseline Requirements

    WCAG SC 2.1.1 Keyboard: All functionality of the content is operable through a keyboard interface without requiring specific timings for individual keystrokes, except where the underlying function requires input that depends on the path of the user's movement and not just the endpoints.

    WCAG SC 2.1.2 No Keyboard Trap: If keyboard focus can be moved to a component of the [content] using a keyboard interface, then focus can be moved away from that component using only a keyboard interface, and, if it requires more than unmodified arrow or tab keys or other standard exit methods, the user is advised of the method for moving focus away.

    Baseline Requirement: 1. Keyboard Access

    WCAG SC 2.4.7 Focus Visible: Any keyboard operable user interface has a mode of operation where the keyboard focus indicator is visible.

    Baseline Requirement: 2. Focus Visible

    WCAG SC 2.4.3 Focus Order: If a Web page can be navigated sequentially and the navigation sequences affect meaning or operation, focusable components receive focus in an order that preserves meaning and operability.

    WCAG 3.2.1 - On Focus: When any component receives focus, it does not initiate a change of context.

    Baseline Requirement: 3. Focus Order

    5. Forms

    Form Components

    Identify Content

    1. Use ANDI to identify any form elements on the page, e.g., text fields, radio buttons, checkboxes, read-only fields, and multi-select lists.

    2. Find all instructions and cues (textual and graphical) that are related to form components/controls, e.g., groupings, order of completion, special conditions, qualifiers, format instructions.

    EXCLUDE disabled input elements. These do not receive keyboard focus, cannot be selected and cannot be modified.

    If there is no such content, the result for the following test ID(s) is DOES NOT APPLY: 5.A to 5.G.

    Check that:

    Test Name Test ID Test Condition / How to Test
    3.3.2-input-instructions 5.A

    Labels and instructions for each form input inform users what input data is expected and, if applicable, what format is required.

    Applicability:

    This Test Condition DOES NOT APPLY (DNA) if the page does not have form elements.

    How to Test:

    1. Review the labels and instructions for each form field identified.

    2. Determine whether labels and instructions for input fields provide purpose and applicable data requirements (date formats, required fields, data type, etc.).

    Evaluate Results:

    If the following is TRUE, then the content PASSES:

    1. Instructions are provided so users know what input data is expected.

    Note:

    • The association of the form instructions (text label) to the form field is tested in 5.B for 1.3.1-form-labels-cues.

    • An error message is not sufficient to communicate the expected format to pass this test.

    • Any changes to form labels that occur automatically or as a result of interaction with the page should be included in this test.
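
    A hedged sketch (the field name and format are assumptions) of a label that tells the user what input is expected and in what format, before any error occurs:

      <label for="start-date">Start date (required, format: MM/DD/YYYY)</label>
      <input type="text" id="start-date" name="start-date" required>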

    1.3.1-form-labels-cues 5.B

    The combination of the accessible name, accessible description, and other programmatic associations (e.g., table column and/or row associations) describes each input field and includes all relevant instructions and cues (textual and graphical).

    Applicability:

    This Test Condition DOES NOT APPLY (DNA) if the page does not have form elements.

    How to Test:

    1. Launch ANDI: focusable elements (this is the default selection).

    2. Use the mouse or ANDI’s next/previous element buttons to highlight each focusable form element and review the ANDI output.

    3. Review the ANDI output for each focusable form field.

    4. Review other programmatic associations, such as table headings or location in a hierarchical list structure, to determine whether they provide or contribute to the form field’s description, cues, or instructions.

    Evaluate Results:

    If any of the following is TRUE, then the content PASSES:

    1. The ANDI Output matches the visible instructions and cues on the page for the form element, including when fields are required, OR

    2. Labels and cues are provided by other programmatic associations (e.g., table column and/or row associations).

    Note:

    • This test also covers the requirement for WCAG SC 4.1.2 Name, Role, Value.

    • Any changes to form elements that occur automatically or as a result of interaction with the page should be included in this test.

    • To evaluate labels and cues provided by other programmatic associations, it may be necessary to perform other tests, including but not limited to 14. Tables and 10. Content Structure.

    • At minimum, radio buttons and checkboxes should be programmatically associated with their questions.

    • Form fields are not required to have programmatic associations with form section headings unless there is significant risk of confusion.
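
    As a non-authoritative example (the question and field names are assumptions), radio buttons programmatically associated with their question through fieldset and legend, so the ANDI Output would reflect both the question and the option text:

      <fieldset>
        <legend>Are you a current federal employee? (required)</legend>
        <label><input type="radio" name="fed-employee" value="yes"> Yes</label>
        <label><input type="radio" name="fed-employee" value="no"> No</label>
      </fieldset>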

    3.2.2-on-input 5.C

    Changing field values/selections (e.g., entering data in a text field, changing a radio button selection) does NOT initiate an unexpected change of context.

    Applicability:

    This Test Condition DOES NOT APPLY (DNA) if the page does not have form elements.

    How to Test:

    1. Use the keyboard to navigate to form elements, e.g. text fields, radio buttons, checkboxes, buttons.

    2. Complete the form element, e.g., select the radio button or check box, type information into the text box, select an item from the drop down.

    3. Exit (tab away from) the completed form element and determine whether there are any instances of an unexpected change of context.

    4. Changes in context include changes of user agent, viewport, focus, or content that changes the meaning of the page; e.g., a form is automatically submitted when exiting a field, or a new window is launched when a radio button is selected.

      1. Note: A change is not considered unexpected if:

        1. The user is notified that a change of context is about to occur.

        2. The control is clearly intended to initiate a change in context when activated.

    Evaluate Results:

    If the following is TRUE, then the Test Condition is TRUE and the content PASSES:

    1. Changing the value of a form element does not initiate an unexpected context change.

      Note:

    • For some types of form fields, such as text input fields, it may be necessary to move focus away from the field to trigger an input event.
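
    For illustration only (the destinations are hypothetical), a select that navigates as soon as its value changes would typically be an unexpected change of context, while an explicit button keeps the change under the user's control:

      <!-- Likely FAIL: changing the selection immediately navigates to a new page -->
      <select onchange="location.href = this.value">
        <option value="/home">Home</option>
        <option value="/contact">Contact</option>
      </select>

      <!-- Likely PASS: the change of context occurs only when the user activates the button -->
      <form action="/navigate" method="get">
        <label for="dest">Go to</label>
        <select id="dest" name="dest">
          <option value="/home">Home</option>
          <option value="/contact">Contact</option>
        </select>
        <button type="submit">Go</button>
      </form>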

    4.1.2-change-notify-form 5.D

    The page provides notification of each form-related change in content.

    Applicability:

    This Test Condition DOES NOT APPLY (DNA) if the page does not have form elements or if the page content does not change due to form interaction.

    How to Test:

    1. Continue from Test 5.C.

    2. If necessary, repeat the interactions that trigger changes to content of the page (instructions changed, error notification, content removed, content is added, etc.).

    3. Identify how the user is notified of the change in content.

      1. Determine whether the form element that triggers the change has an accessible name, accessible description and/or context that provides sufficient description of the interface component’s purpose.

        1. If content changes are the direct result of a user's action while interacting with content AND the interface component that triggers the change provides sufficient description of the change, then no additional programmatic event notification is necessary.

      2. Identify any dialogs that alert the user to changes in content.

        1. Determine whether the dialogs provide sufficient programmatic notification of content changes.

      3. Identify content changes that result in focus moving to the content that has changed.

        1. Determine whether moving the focus to the content that has changed is sufficient to notify the user of the change event (e.g., by describing the change directly in the content to which the focus moved).

      4. Identify content changes occurring in an ARIA Live Region:

        1. Launch ANDI: structures

        2. Click the “live regions” link, then use the mouse to hover over any identified live region (alternatively, use ANDI’s previous/next element buttons to navigate to identified Live Regions).

        3. Determine whether the changing content is contained within the Live Region.

    Evaluate Results:

    If any of the following is TRUE, then the Test Condition is TRUE and the content PASSES:

    1. The user’s action directly results in the change in content, AND the interface component that triggered the change provided sufficient description about the change event, OR

    2. The page notifies the user about a change via a keyboard-accessible dialog, OR

    3. The page moves focus to the content that has changed, AND the content that has changed provides sufficient description about the change, OR

    4. The content that has changed is contained in an ARIA Live Region.

    Note:

    • All form elements that changed during this test are to be tested per Test ID 5.B.

    • All revealed content must be tested per Test 4.H, 2.4.3-focus-order-reveal.

    • It may be necessary to use the mouse to determine whether state changes occur on hover or on click.

    • Depending on the component, a change of state may be triggered by various actions, such as changing values or states of other components, toggling a function, entering data in the component, mouseover, etc.

    Input Error Identification and Suggestions

    Identify Content

    Identify all automatic input error detection, error notifications, error suggestions, and related instructions:

    1. Use ANDI to identify any form elements on the page.

    2. Find all instructions and cues (textual and graphical) that are related to form components/controls, e.g., groupings, order of completion, special conditions, qualifiers, format instructions.

    3. Intentionally enter values and/or make selections that violate format and/or other form instructions.

    If there is no automatic input error detection, the result for the following test ID(s) is DOES NOT APPLY: 5.E and 5.F.

    Check that:

    Test Name Test ID Test Condition / How to Test
    3.3.1-error-identification 5.E

    The item in error is identified and the error is described to the user in text.

    Applicability:

    This Test Condition DOES NOT APPLY (DNA) if the page does not have automatic error detection.

    How to Test:

    1. Intentionally violate formatting and other form instructions, e.g., leave a required form field empty, use a different date format than is required, and/or create a password that does not meet the password strength requirements.

    2. Attempt to submit the form and/or move to the next page.

    3. Determine whether the error is identified and described in text.

      1. The form field with the error is visually identified, e.g. the text field outline becomes bold red.

      2. Text appears identifying the form field with the error, e.g., in a dialog message, on the page.

    Evaluate Results:

    If ALL of the following are TRUE, then the Test Condition is TRUE and the content PASSES:

    1. The item that is in error is identified, AND

    2. The error is described to the user in text.

    Note:

    • The error message may be tested as part of Changing Content 4.1.2-change-notify (Test ID 6.A)
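
    A minimal sketch (the field and message text are assumptions) of an automatically detected error that is identified and described in text and programmatically associated with the field; the format hint in the message would also serve as a suggestion under Test 5.F:

      <label for="email">Email address (required)</label>
      <input type="text" id="email" aria-describedby="email-error" aria-invalid="true">
      <p id="email-error">Error: Email address is required. Enter an address such as name@example.gov.</p>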

    3.3.3-error-suggestion 5.F

    Additional guidance (e.g., suggestion for corrected input) is provided about how to correct errors for form fields.

    Applicability:

    This Test Condition DOES NOT APPLY (DNA) if any of the following conditions apply to the page:

    1. There is no automatic input error detection.

    2. Based on the type of input required, suggestions for correction cannot be provided because they are not knowable.

    3. Providing information about how to correct the error would jeopardize the security or purpose of the content, e.g. details about an incorrect password.

    How to Test:

    1. Continue from Test 5.E.

    2. Determine whether additional guidance provides sufficient details for how to correct the error and/or offers suggestions of corrected input.

    Evaluate Results:

    If ALL of the following are TRUE, then the Test Condition is TRUE and the content PASSES:

    1. Suggestions for corrected input are provided, AND

    2. The description contains adequate information for the user to know what is required to fix the error.

    Input Error Prevention

    Identify Content

    Identify content that:

    • Submits user form entries that result in or cause legal commitments or financial transactions

    • Submits user form entries that modify or delete user-controllable data in a data storage system

    • Submits user test responses

    If there is no such content, the result for the following test ID(s) is DOES NOT APPLY: 5.G.

    Check that:

    Test Name Test ID Test Condition / How to Test
    3.3.4-error-prevention 5.G

    The web page allows the user to check, reverse, and/or confirm submission.

    Applicability:

    This Test Condition DOES NOT APPLY (DNA) if the page does not do any of the following upon submission:

    • Cause legal or financial obligations.

    • Modify or delete user-controlled data in data storage systems.

    • Submit test responses.

    How to Test:

    1. Complete the required form fields with intentional errors and submit the content.

    Evaluate Results:

    If any of the following is TRUE, the content PASSES:

    1. The user can reverse the submission, OR

    2. The user is presented with an option to review, confirm, and correct information before finalizing the submission, OR

    3. The page checks data for input errors and allows the user an opportunity to correct any errors.

    Applicable Standards

    Section 508/WCAG Success Criteria Baseline Requirements

    WCAG SC 1.3.1 Info and Relationships: Information, structure, and relationships conveyed through presentation can be programmatically determined or are available in text.

    WCAG SC 3.2.2 On Input: Changing the setting of any user interface component does not automatically cause a change of context unless the user has been advised of the behavior before using the component.

    WCAG SC 3.3.1 Error Identification: If an input error is automatically detected, the item that is in error is identified and the error is described to the user in text.

    WCAG SC 3.3.2 Labels or Instructions: Labels or instructions are provided when content requires user input.

    WCAG SC 3.3.3 Error Suggestion: If an input error is automatically detected and suggestions for correction are known, then the suggestions are provided to the user, unless it would jeopardize the security or purpose of the content.

    WCAG SC 3.3.4 Error Prevention (Legal, Financial, Data): For Web pages [or software] that cause legal commitments or financial transactions for the user to occur, that modify or delete user-controllable data in data storage systems, or that submit user test responses, at least one of the following is true:

    1. Reversible: Submissions are reversible.

    2. Checked: Data entered by the user is checked for input errors and the user is provided an opportunity to correct them.

    3. Confirmed: A mechanism is available for reviewing, confirming, and correcting information before finalizing the submission.

    10. Forms

    WCAG SC 4.1.2 Name, Role, Value: For all user interface components (including but not limited to: form elements, links and components generated by scripts), the name and role can be programmatically determined; states, properties, and values that can be set by the user can be programmatically set; and notification of changes to these items is available to user agents, including assistive technologies.

    5. Changing Content

    6. Links and Buttons

    Identify Content

    Use ANDI: links/buttons to identify all links and buttons.

    If there is no such content, the result for the following test ID(s) is DOES NOT APPLY: 6.A and 6.B.

    Check that:

    Test Name Test ID Test Condition / How to Test
    2.4.4-link-purpose 6.A

    The purpose of each link or button can be determined from any combination of the link/button text, accessible name, accessible description, and/or programmatically determined link/button context.

    Applicability:

    This Test Condition DOES NOT APPLY (DNA) if the page does not have links or buttons.

    How to Test:

    1. Evaluate the ANDI Output for link/button purpose.

    2. Determine whether the ANDI Output, in combination with the programmatically determined link/button context (text that is in the same paragraph, list, or table cell as the link/button or in a table header cell that is associated with the table cell that contains the link/button) adequately describes the link/button’s purpose or function.

      1. In cases where the purpose of the link/button is intentionally vague or ambiguous (e.g., the content to be revealed after selecting a link to “Door 1,” “Door 2,” or “Door 3” is intended to be a surprise), it may be sufficient for the combination of link/button text, accessible name, accessible description, and/or link/button context to refer to the link/button purpose vaguely or ambiguously.

    Evaluate Results: if the following is TRUE, then the content PASSES:

    1. The combination of the programmatically determined link/button context and the ANDI Output provide adequate description of the link/button’s purpose.

    Note:

    • This test also covers the requirement for WCAG SC 4.1.2 Name, Role, Value.

    • Any changes to links or buttons that occur automatically or as a result of interaction with the page should be included in this test.
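
    Example (a minimal illustrative sketch; names and file names are hypothetical): vague link text such as “Download” may still convey its purpose through programmatically determined context, here an associated table header cell:

      <table>
        <tr>
          <th scope="row">FY 2023 Annual Report</th>
          <td><a href="annual-report-2023.pdf">Download</a></td>
        </tr>
      </table>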

    4.1.2-change-notify-links 6.B

    The page provides notification of each change in content that is the result of interaction with a link or button.

    Applicability:

    This Test Condition DOES NOT APPLY (DNA) if the page does not have links or buttons or if the page content does not change due to link or button interaction.

    How to Test:

    1. Activate the link or button to trigger changes to page content (e.g., content is removed, content is added, etc.).

    2. Identify how the user is notified of the change in content.

      1. Determine whether the link or button that triggers the change has an accessible name, accessible description and/or context that provides sufficient description of the interface component’s purpose.

        1. If content changes are the direct result of a user's action while interacting with content AND the interface component that triggers the change provides sufficient description of the change, then no additional programmatic event notification is necessary.

      2. Identify any dialogs that alert the user to changes in content.

        1. Determine whether the dialogs provide sufficient programmatic notification of content changes.

      3. Identify content changes that result in focus moving to the content that has changed.

        1. Determine whether moving the focus to the content that has changed is sufficient to notify the user of the change event (e.g., by describing the change directly in the content to which the focus moved).

      4. Identify content changes occurring in an ARIA live region:

        1. Launch ANDI: structures

        2. Click the “live regions” link, then use the mouse to hover over any identified live region (alternatively, use ANDI’s previous/next element buttons to navigate to identified live regions).

        3. Determine whether the changing content is contained within the live region.

    Evaluate Results: if any of the following is TRUE, then the Test Condition is TRUE and the content PASSES:

    1. The user’s action directly results in the change in content, AND the interface component that triggered the change provided sufficient description about the change event, OR

    2. The page notifies the user about a change via a keyboard-accessible dialog, OR

    3. The page moves focus to the content that has changed, AND the content that has changed provides sufficient description about the change, OR

    4. The content that has changed is contained in an ARIA live region.
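
    Example (a minimal illustrative sketch of the ARIA live region case; IDs and text are hypothetical):

      <button type="button" id="add-to-cart">Add to cart</button>
      <div id="cart-status" aria-live="polite">
        <!-- Script inserts text such as "Item added. Cart now contains 3 items." here,
             so assistive technology is notified of the change without a focus move. -->
      </div>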

    Applicable Standards

    Section 508/WCAG Success Criteria Baseline Requirements
    WCAG SC 2.4.4 Link Purpose (In Context): The purpose of each link can be determined from the link text alone or from the link text together with its programmatically determined link context, except where the purpose of the link would be ambiguous to users in general.

    14. Links

    WCAG SC 4.1.2 Name, Role, Value: For all user interface components (including but not limited to: form elements, links and components generated by scripts), the name and role can be programmatically determined; states, properties, and values that can be set by the user can be programmatically set; and notification of changes to these items is available to user agents, including assistive technologies.

    5. Changing Content

    7. Images

    Images

    Identify Content

    Use the ANDI: graphics/images module to find the images. Start with the first image outlined by ANDI: graphics/images.

    Use the “Focus on next element” button to find all images on the page. For each image found by ANDI: graphics/images, determine if the image is meaningful or decorative.

    If there is no such content, the result for the following test ID(s) is DOES NOT APPLY: 7.A and 7.B.

    For Meaningful Images – Check that:

    Test Name Test ID Test Condition / How to Test
    1.1.1-meaningful-image-name 7.A

    The accessible name and accessible description for a meaningful image provides an equivalent description of the image.

    Applicability:

    This Test Condition DOES NOT APPLY (DNA) if there are no meaningful images on the page.

    How to Test:

    1. Review the ANDI Output for the meaningful image.

      1. If the image is used as a CAPTCHA, ANDI Output describes the purpose of the CAPTCHA.

      2. If the image is of meaningful text, ANDI Output contains the same text.

      3. If the ANDI Output points to page content for the image’s description, determine whether the description is provided.

    Evaluate Results:

    If the following is TRUE, then content PASSES:

    1. The ANDI Output contains the equivalent description for the meaningful image and/or refers to a description in the page content.

    Note:

    • Any changes to meaningful images that occur automatically or as a result of interaction with the page should be included in this test.

    • Notification of automatic changes is tested in Test 2.D.

    • Notification of changes as a result of interaction with other content is tested either in Test 5.D or 6.B.

    • An image that is on the page but not detected by ANDI as described in Identify Content should not be included in this test.

    • The ANDI Output is empty for an image that has role=”presentation” or aria-hidden=”true”. These cannot be set for meaningful images, which must have an equivalent description in ANDI Output.
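
    Example (a minimal illustrative sketch; file names and text are hypothetical): two ways a meaningful image might carry an equivalent description:

      <!-- Equivalent description in the alt attribute (reported as the ANDI Output) -->
      <img src="enrollment-chart.png" alt="Line chart: enrollment grew from 1,200 students in 2016 to 3,400 in 2020">

      <!-- Alternatively, a short name plus a reference to a fuller description on the page -->
      <img src="enrollment-chart.png" alt="Enrollment trend chart" aria-describedby="chart-summary">
      <p id="chart-summary">Enrollment grew steadily from 1,200 students in 2016 to 3,400 in 2020.</p>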

    For Decorative Images – Check that:

    Test Name Test ID Test Condition / How to Test
    1.1.1-decorative-image 7.B

    There is no accessible name and accessible description for a decorative image.

    Applicability:

    This Test Condition DOES NOT APPLY (DNA) if there are no decorative images on the page.

    How to Test:

    1. Review the ANDI Output for the decorative image.

    Evaluate Results:

    If the following is TRUE, then the content PASSES:

    1. The ANDI Output for a decorative image is blank.

    Note:

    • An image that is on the page but not detected by ANDI as described in Identify Content should not be included in this test. Some decorative images are background images and should be tested and reported in 7.C.

    • Any changes to decorative images that occur automatically or as a result of interaction with the page should be included in this test.

    • The ANDI Output is empty for an image that has role=”presentation” or aria-hidden=”true”. These are two techniques to hide an image from assistive technology and are appropriate for decorative content.
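
    Example (a minimal illustrative sketch; file names are hypothetical): decorative images hidden from assistive technology, so the ANDI Output is empty:

      <img src="divider.png" alt="" role="presentation">
      <img src="flourish.png" alt="" aria-hidden="true">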

    CSS Background Images

    Identify content

    1. Use the ANDI: graphics/images module to find CSS background images. If the “find background” and “hide background” buttons are available, background images are on the page.

    If there is no such content, the result for the following test ID(s) is DOES NOT APPLY: 7.C.

    Check that:

    Test Name Test ID Test Condition / How to Test
    1.1.1-decorative-background-image 7.C

    The background image is not the only means used to convey important information.

    Applicability:

    This Test Condition DOES NOT APPLY (DNA) if there are no decorative background images on the page.

    How to Test:

    1. Select the “find background” button in ANDI: graphics/images to outline all background images (in green).

    2. Find the outlined background images within the page. (ANDI will not display information for background images.)

    3. Determine whether important information provided by the background image is available without the background image.

      1. Select the “hide background” function in ANDI: graphics/images to hide background images and help determine if the image’s information is also available on the page without the background image.

      2. Review the sequence or positioning of the image to determine whether equivalent information is presented in the same logical order.

    Evaluate Results:

    If any of the following is TRUE, then the content PASSES:

    1. The background image is decorative, OR

    2. The meaning of the background image is also available without the background image.

    Note:

    • Any changes to meaningful background images that occur automatically or as a result of interaction with the page should be included in this test.
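
    Example (a minimal illustrative sketch; class and file names are hypothetical): a meaningful icon delivered as a CSS background image, with the same information also available as on-screen text:

      <style>
        .status-approved {
          background-image: url("check-icon.png");
          background-repeat: no-repeat;
          padding-left: 24px;
        }
      </style>
      <p class="status-approved">Application status: Approved</p>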

    CAPTCHA Images

    Identify content

    Identify all CAPTCHA images

    If there is no such content, the result for the following test ID(s) is DOES NOT APPLY: 7.D.

    Check that:

    Test Name Test ID Test Condition / How to Test
    1.1.1-captcha-alternative 7.D

    Alternative forms of CAPTCHA are provided.

    Applicability:

    This Test Condition DOES NOT APPLY (DNA) if there are no CAPTCHA images on the page.

    How to Test:

    1. Determine whether alternative forms of CAPTCHA with output modes for different types of sensory perception are provided.

    Evaluate Results:

    If ALL of the following are TRUE, then the content PASSES:

    1. The CAPTCHA has a format for users without vision, AND

    2. The CAPTCHA has a format for users without hearing.

    Images of Text

    Identify content

    Identify all images of text

    EXCLUDE text that is part of a picture containing significant other visual content, such as graphs, screenshots, and diagrams, which convey important visual information beyond the text itself.

    If there is no such content, the result for the following test ID(s) is DOES NOT APPLY: 7.E.

    Check that:

    Test Name Test ID Test Condition / How to Test
    1.4.5-image-of-text 7.E

    The image of text cannot be replaced by text or is customizable.

    Applicability:

    This Test Condition DOES NOT APPLY (DNA) if there are no images of text on the page.

    How to Test:

    1. Determine if the image of text can be visually customized: adjust the font, size, color and background with controls provided by the web page.

      1. Customizing font size for an image of text also implies the ability to adjust the size without pixelation (which is typically evident when simply using the browser resize functionality to resize images).

    2. Determine if text can be used instead of the image of text to present the same effect and information.

      1. Logotypes (text that is part of a logo or brand name) cannot be replaced by text.

      2. Type samples, branding, and images of specific fonts that are not widely supported are additional examples of images of text that cannot be replaced by text.

    Evaluate Results:

    If any of the following is TRUE, then the content PASSES:

    1. The image of text cannot be replaced with text, OR

    2. The image of text can be visually customized.

    Applicable Standards

    Section 508/WCAG Success Criteria Baseline Requirements

    WCAG SC: 1.1.1. Non-Text: All non-text content that is presented to the user has a text alternative that serves the equivalent purpose, except for [specific] situations listed.

    WCAG SC: 1.4.5 Images of Text: If the technologies being used can achieve the visual presentation, text is used to convey information rather than images of text except for [specific situation listed].

    4. Images
    WCAG SC: 1.1.1. Non-Text: All non-text content that is presented to the user has a text alternative that serves the equivalent purpose, except for [specific] situations listed.

    18. CSS Content and Positioning

    WCAG SC 4.1.2 Name, Role, Value: For all user interface components (including but not limited to: form elements, links and components generated by scripts), the name and role can be programmatically determined; states, properties, and values that can be set by the user can be programmatically set; and notification of changes to these items is available to user agents, including assistive technologies.

    5. Changing Content

    8. Adjustable Time Limits

    Timing Adjustable

    Identify Content

    Identify any instances of content time limits.

    Time limits could be identified by:

    • Inspecting system or site documentation

    • Text description somewhere on the page where the time limit occurs

    • Pop-ups or other messages or warning indicators on the page

    • Allowing the page to be idle for an extended period of time to prompt a time-out notification or other indication that a time limit has occurred.

    EXCLUDE:

    • Real-time Exception: The time limit is a required part of a real-time event (for example, an auction), and no alternative to the time limit is possible; or

    • Essential Exception: The time limit is essential and extending it would invalidate the activity; or

    • 20 Hour Exception: The time limit is longer than 20 hours.

    If there is no such content, the result for the following test ID(s) is DOES NOT APPLY: 8.A

    Check that:

    Test Name Test ID Test Condition / How to Test
    2.2.1-timing-adjustable 8.A

    The user can turn off, adjust, or extend the time limit.

    Applicability:

    This Test Condition DOES NOT APPLY (DNA) if there is no time limit for content or if the time limit meets one of the exceptions listed in the Identify Content section above.

    How to Test:

    1. Determine whether the web page provides a way to turn off, adjust, or extend the time limit

    Evaluate Results:

    If any of the following is TRUE, then the content PASSES.

    1. The user can turn off the time limit before time expires, OR

    2. The user can adjust the time limit to at least ten times the length of the default setting before time expires, OR

    3. The page provides a warning before time expires AND:

      1. For a period of at least 20 seconds, the user can extend the time limit with a simple action (e.g., pressing the spacebar), AND

      2. The user can extend the time limit at least ten times.

    Applicable Standards

    Section 508/WCAG Success Criteria Baseline Requirements

    WCAG SC 2.2.1 Timing Adjustable: For each time limit that is set by the content, at least one of the following is true:

    • Turn off: The user is allowed to turn off the time limit before encountering it.

    • Adjust: The user is allowed to adjust the time limit before encountering it over a wide range that is at least ten times the length of the default setting.

    • Extend: The user is warned before time expires and given at least 20 seconds to extend the time limit with a simple action (for example, “press the space bar”), and the user is allowed to extend the time limit at least ten times.

    21. Timed Events

    9. Repetitive Content

    Repetitive Content – Bypass

    Identify Content

    Identify block(s) of content that are repeated on other pages within the site.

    • Blocks of content that are repeated on other pages may include navigation links, page headers, tabs, and banners.

    • Blocks of content do not have to be exactly the same to be considered repetitive; blocks of content could be considered to repeat if they contain the same type of information and/or serve the same purpose.

    EXCLUDE small sections, such as repeated individual words, phrases, or single links. They are not considered repetitive blocks of content.

    Note:

    • Most web browsers provide keyboard shortcuts to move the user focus to the top of the page or browser; providing a "skip" link may be unnecessary if a set of navigation links is provided at the bottom of a web page.

    If there is no such content, the result for the following test ID(s) is DOES NOT APPLY: 9.A

    Check that:

    Test Name Test ID Test Condition / How to Test
    2.4.1-bypass-function 9.A

    A keyboard-accessible method is provided to bypass repetitive content.

    Applicability:

    This Test Condition DOES NOT APPLY (DNA) to a web page that does not contain blocks of content that are repeated on other web pages.

    How to Test:

    1. Starting at the top of the page, use standard keyboard commands to navigate forward to repetitive blocks of content. (Note that some bypass functions may not be visible until they receive focus.)

    2. Launch ANDI: focusable elements to check for skip links, hide options, collapse menu and other elements with similar bypass functionality.

      1. Note: ANDI’s “tab order” feature under the focusable elements module may help evaluate the order in which bypass methods occur relative to other content.

      2. Alternatively, launch ANDI: links/buttons and click the “show links list” button.

    3. Determine whether a keyboard-accessible method is provided to bypass repetitive content (e.g., skip links, hotkeys, scripted elements; frames may work as a bypass method in some browsers but not others).

    4. Use standard keyboard commands to activate the bypass function.

      1. Multiple blocks of repeated content may require multiple methods to bypass the blocks; it may not be possible to bypass all blocks of repeated content with a single method.

      2. Moving focus past blocks of repeated content may not always be visibly evident if there are no focusable elements directly after the bypassed block.

    Evaluate Results:

    If ALL of the following are TRUE, then the content PASSES:

    1. There is a keyboard-accessible method provided to bypass repetitive content, AND

    2. When activated, the method works, and the block of content is bypassed.

    Note:

    • If there is no interactive component to receive the shift of focus, it may not be evident via visual indication of focus that a focus shift occurred. Reducing the browser height may make a focus shift more obvious.
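
    Example (a minimal illustrative sketch; IDs and class names are hypothetical): a skip link placed before the repeated navigation block:

      <body>
        <a href="#main-content" class="skip-link">Skip to main content</a>
        <nav>
          <!-- repeated navigation links -->
        </nav>
        <main id="main-content" tabindex="-1">
          <h1>Page heading</h1>
        </main>
      </body>

    The tabindex="-1" on the target is one technique that can help ensure focus actually moves to the main content in some browsers.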

    Repetitive Content – Navigation

    Identify Content

    Identify all navigational elements that are repeated on multiple pages within the website.

    Note: Navigational elements are any components that provide a user the ability to locate specific information or functionality across the website. These can be static or interactive elements, and groupings of components can also meet this definition.

    If there is no such content, the result for the following test ID(s) is DOES NOT APPLY: 9.B

    Check that:

    Test Name Test ID Test Condition / How to Test
    3.2.3-consistent-navigation 9.B

    Each navigational element occurs in the same relative order with regard to other repeated components on each web page where it appears.

    Applicability:

    This Test Condition DOES NOT APPLY (DNA) to a web page that does not contain components that are repeated on other web pages.

    How to Test:

    1. Review multiple web pages of the web site to identify navigational components that are repeated on multiple pages. Do not initiate changes to the content.

    2. Review the order of the navigational elements and compare it to the order on the other pages where they appear

      1. Note: ANDI’s “tab order” feature under the focusable elements module may help evaluate the focus order of interactive interface components. ANDI’s “reading order” feature under the structure module may also help evaluate the content order of both focusable and non-focusable components.

    Evaluate Results:

    If the following is TRUE, then the Test Condition is TRUE and the content PASSES:

    1. Each repeated component occurs in the same relative order with regard to other repeated components on each web page where it appears.

    Note:

    • Same relative order is defined as same position relative to other items. Items are considered to be in the same relative order even if other items are inserted or removed from the original order. For example, expanding navigation menus may insert an additional level of detail or a secondary navigation section may be inserted into the reading order.

    Repetitive Content – Identification

    Identify Content

    Identify components that have the same functionality within the page and/or within a set of web pages.

    If there is no such content, the result for the following test ID(s) is DOES NOT APPLY: 9.C.

    Check that:

    Test Name Test ID Test Condition / How to Test
    3.2.4-consistent-identification 9.C

    The accessible name and description are consistent for components that perform the same function.

    Applicability:

    This Test Condition only applies to components that have the same functionality within a web page or within a set of web pages.

    How to Test:

    1. Launch ANDI: focusable elements.

    2. Examine the ANDI Output for each identified element.

    Evaluate Results:

    If the following is TRUE, then the Test Condition is TRUE and the content PASSES:

    1. Components with identical functionality are identified consistently.

    Note:

    • Consistent text alternatives for interface elements that perform the same function are not always truly “identical.” This is acceptable if they follow a consistent format. For instance, in the use of a graphical arrow at the bottom of a web page that links to the next web page, the text alternative may be: “Go to page 4.” However, the same arrow image on the next page should then state "Go to page 5."

    • A single non-text-content-item may be used to serve different functions. In such cases, different text alternatives are necessary and should be used. Examples can be commonly found with the use of icons such as check marks, cross marks, and traffic signs. Their functions can be different depending on the context of the web page. A check mark icon may function as “approved”, “completed”, or “included”, to name a few, depending on the situation. Using “check mark” as text alternative across all web pages does not help users understand the function of the icon. Different text alternatives can be used when the same non-text content serves multiple functions. (Understanding SC 3.2.4)

    Applicable Standards

    Section 508/WCAG Success Criteria Baseline Requirements

    WCAG SC 2.4.1 Bypass Blocks: A mechanism is available to bypass blocks of content that are repeated on multiple Web pages.

    WCAG SC 3.2.3 Consistent Navigation: Navigational mechanisms that are repeated on multiple Web pages within a set of Web pages occur in the same relative order each time they are repeated, unless a change is initiated by the user.

    WCAG 3.2.4 Consistent Identification: Components that have the same functionality within a set of Web pages are identified consistently.

    4. Repetitive Content

    10. Content Structure

    Headings

    Identify Content

    1. Identify all visually apparent headings, which denote sections of content.

      1. Headings are often in a larger, bolded font separated from paragraphs by extra spacing (though not always). Note the hierarchy and structure of each heading with respect to other headings on the page.

    2. Use ANDI to identify all programmatically defined headings: <h1> to <h6> or ARIA role=”heading”.

      1. Launch ANDI: structures.

      2. Select the "headings" button within ANDI: structures.

      3. ANDI will add dotted outlines around each identified heading that is visible on the page.

    If there is no such content, the result for the following test ID(s) is DOES NOT APPLY: 10.A to 10.C.

    Check that:

    Test Name Test ID Test Condition / How to Test
    2.4.6-heading-purpose 10.A

    Each heading describes the topic or purpose of its content.

    Applicability:

    This Test Condition DOES NOT APPLY if there are no visual headings on the page.

    How to Test:

    1. For each visually identified heading, compare the heading text to the content beneath the heading.

    Evaluate Results:

    If the following is TRUE, then the Test Condition is TRUE and the content PASSES:

    1. The heading describes the topic or purpose of its content.

    1.3.1-heading-determinable 10.B

    Each programmatically determinable heading is a visual heading and each visual heading is programmatically determinable.

    Applicability:

    This Test Condition DOES NOT APPLY (DNA) to web pages where programmatic headings are not identified by ANDI and/or where there are no visual headings on the page.

    How to Test:

    1. Select ANDI: structures and review the ANDI Output for each visually apparent heading. ANDI outlines all headings with a dotted purple line.

      1. If ANDI does not identify a visually apparent heading, then the heading is not defined programmatically.

    2. Review each heading identified by ANDI to determine if it is also a visually apparent heading.

    Evaluate Results:

    If ALL of the following are TRUE, then the content PASSES:

    1. Each programmatically determinable heading is serving as a visual heading on the page, AND

    2. Each visual heading is programmatically defined.

    Note:

    • Content that is not a visual heading should not have a role of heading (for example, heading markup should not be used for emphasis on an element that is not a heading for content after it). Conversely, content that is styled to look like and function like a heading should be programmatically defined as heading.

    1.3.1-heading-level 10.C

    Programmatic heading levels logically match the visual heading presentation within the heading structure.

    Applicability:

    This Test Condition DOES NOT APPLY (DNA) to web pages where programmatic headings are not identified by ANDI.

    How to Test:

    1. Launch ANDI: structures and select the “view outline” button to display the Structure Outline.

    2. Mouse over or tab through each of the headings in ANDI’s Structure Outline to review the ANDI Output for each heading.

      1. ANDI will identify heading level conflicts if found. Ex: “Heading element level <h1> conflicts with [aria-level=”2”].”

    3. Compare the heading levels listed in the Structure Outline to the page content. Determine whether the heading levels logically match the visual heading presentation within the heading structure.

      1. On pages that have only one heading, that heading can have any heading level, as the page’s heading level structure is defined by that one heading.

      2. The most important heading(s) should have the highest priority level. For example, heading level 1 is a higher level than heading level 2, which is higher than heading level 3.

      3. Headings with an equal or higher level start a new section; headings with a lower level start new subsections that are part of the higher leveled section.

      4. A heading level 1:

        • Is not required

        • Can be used more than once on a page

        • Is not required to match the page title

      5. The level of headings may not always be in sequence but may be valid as it relates to the visual structure/importance communicated by visible headings on the page. For example, an <h2> heading may be used for a navigation structure that precedes an <h1> title on a page. It is also acceptable to have <h3> then <h5> without an <h4> in between.

    Evaluate Results:

    If ALL of the following are TRUE, then the content PASSES:

    1. Every programmatically identified heading level logically matches the visual heading structure on the page, AND

    2. There is no heading level conflict.
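
    Example (a minimal illustrative sketch; headings and text are hypothetical): programmatic levels that logically match the visual hierarchy:

      <h1>Benefits Overview</h1>
        <h2>Eligibility</h2>
          <h3>Full-time employees</h3>
          <h3>Part-time employees</h3>
        <h2>How to Apply</h2>

      <!-- An ARIA equivalent for a visual heading that is not an <h1>-<h6> element -->
      <div role="heading" aria-level="2">How to Apply</div>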

    Lists

    Identify Content

    Identify all visually apparent lists on the page.

    Note:

    • Developers may use list elements to help present grouped items, such as menus and submenus, while styling them to remove bullets/numbering and sometimes orient the lists horizontally instead of vertically. Such use of list elements is acceptable and consistent with the Test Condition defined below.

    If there is no such content, the result for the following test ID(s) is DOES NOT APPLY: 10.D.

    Check that:

    Test Name Test ID Test Condition / How to Test
    1.3.1-list-type 10.D

    All visually apparent lists are programmatically identified according to their type.

    Applicability:

    This Test Condition DOES NOT APPLY (DNA) if there are no visually apparent lists.

    How to Test:

    1. Launch ANDI: structures and select the “lists” button.

    2. Review the information under “List Elements,” noting the number of lists identified and their types.

    3. For each list, visually determine the type of list; determine if it appears to be ordered, unordered, or a description list.

      • Ordered - numbered sequentially and, if necessary, hierarchically (e.g., 1, 2, 2.a, 2.a.i, etc.) and are used where sequence or the ability to reference specific items by number/letter are important.

      • Unordered - not numbered and are used where a specific sequence or the ability to reference specific items by number/letter are not important.

      • Description list (dl) - used to group terms with their descriptions.

    4. Review the visual representation of list relationships, including order, hierarchy, and nesting compared to the programmatic list definitions presented via the ANDI output.

      1. It is possible to provide any number of nested list combinations using ordered, unordered, and description lists. ANDI identifies each nested list separately. Review each list to determine whether the visual nesting and relationships match the programmatic nesting and relationships.

    Evaluate Results:

    If ALL of the following are TRUE, then the content PASSES:

    1. All content that has the visual appearance of a list is defined programmatically as a list, according to the type of list.

      1. An unordered list (with or without bullets) is marked as an unordered list (ul).

      2. An ordered list is marked as an ordered list (ol).

      3. Terms and their descriptions that are presented in the form of a list are marked as a description list (dl)

        AND

    2. All programmatic list relationships, including nesting and hierarchies, are consistent with the list relationships presented visually.

    Note:

    • Not all lists require markup. For instance, a list of items within a sentence, separated by commas, does not have to be marked up as a bulleted or numbered list.
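
    Example (a minimal illustrative sketch; list items are hypothetical): each visual list type using the matching programmatic structure:

      <ul>  <!-- unordered (bulleted) list -->
        <li>Paper</li>
        <li>Pencils</li>
      </ul>

      <ol>  <!-- ordered (numbered) list with a nested sub-list -->
        <li>Open the form
          <ol>
            <li>Enter your name</li>
          </ol>
        </li>
        <li>Submit the form</li>
      </ol>

      <dl>  <!-- description list of terms and descriptions -->
        <dt>ANDI</dt>
        <dd>Accessible Name and Description Inspector</dd>
      </dl>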

    Applicable Standards

    Section 508/WCAG Success Criteria Baseline Requirements

    WCAG SC 2.4.6 Headings and Labels: Headings and labels describe topic or purpose.

    WCAG SC 1.3.1 Info and Relationships: Information, structure, and relationships conveyed through presentation can be programmatically determined or are available in text.

    13. Content Structure

    11. Language

    Language of Page

    Identify Content

    1. Identify all pages with text.

    2. Review the page content to identify the default human language of the page (the language in which most of the page is presented).

    Note:

    • “Human language” refers to the language that people use to communicate with one another, as opposed to coding languages used to produce software and web content.

    Test ID 11.A always applies.

    Check that:

    Test Name Test ID Test Condition / How to Test
    3.1.1-page-language-defined 11.A

    The default human language of each web page can be programmatically determined.

    Applicability:

    This Test Condition always applies – you may NOT evaluate the condition as DOES NOT APPLY (DNA). All pages should have some textual content, even if that content is included programmatically as alternative text for non-text content.

    How to Test:

    1. Launch ANDI: structures.

    2. Click the "more details" link, then “page language.”

      1. ANDI will display a dialog listing the value of the lang attribute assigned to the <html> element of the page.

      2. If no lang attribute is defined or if the attribute is empty, ANDI will provide a warning in the same dialog.

    3. Consult the Internet Assigned Numbers Authority's (IANA) Language subtag registry to determine whether the language is properly defined and matches the default human language for the page.

    Evaluate Results:

    If ALL of the following are TRUE, then the content PASSES:

    1. The default primary language is correctly specified per IANA, AND

    2. The identified language in the lang attribute correctly matches the default human language for the page.
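
    Example (a minimal illustrative sketch; the page content is hypothetical): an English-language page declaring its default human language on the <html> element:

      <!DOCTYPE html>
      <html lang="en">
        <head>
          <title>Example page</title>
        </head>
        <body>
          <p>Page content written in English.</p>
        </body>
      </html>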


    Language of Parts

    Identify Content

    1. Identify any text content that differs from the default human language of the page including alternative text.

    2. Identify the human language of the text content that differs from the default human language of the page.

    Note:

    • Proper names, technical terms, words of indeterminate language, and words or phrases that have become part of the vernacular of the immediately surrounding text do not require a lang attribute different from the default language of the page and are not covered by this test.

    If there is no such content, the result for the following test ID(s) is DOES NOT APPLY: 11.B.

    Check that:

    Test Name Test ID Test Condition / How to Test
    3.1.2-part-language-defined 11.B

    The human language for any content segment that differs from the default human language of the page can be programmatically determined.

    Applicability:

    This Test Condition DOES NOT APPLY (DNA) if all of the content in the page is in the same human language.

    How to Test:

    1. Launch ANDI: structures; click the “more details” link, then “[#] lang attributes”. If the [#] is zero, no content was marked with a lang attribute.

    2. Locate the markup added to the web page that identifies the element to which the attribute is applied and the language defined in the language attribute value (e.g., "en" for English).

      1. Mouseover or tab to the markup to reveal the beginning and end of the element.

      2. Determine whether the entire passage is enclosed within the element.

    3. Consult the Internet Assigned Numbers Authority's (IANA) Language subtag registry to determine whether the language is properly defined and matches the human language for the content segment.

    Evaluate Results:

    If ALL of the following are TRUE, then the content PASSES:

    1. The language for the content segment that differs from the primary default language of the page is correctly specified per IANA, AND

    2. The identified language in the lang attribute correctly matches the human language for the content segment
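
    Example (a minimal illustrative sketch; the phrase is hypothetical): a passage in another language wholly enclosed in an element with its own lang attribute:

      <p>The receptionist greeted us with <span lang="es">buenos días</span> before switching to English.</p>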


    Applicable Standards

    Section 508/WCAG Success Criteria Baseline Requirements

    WCAG SC 3.1.1 Language of Page: The default human language of each Web page can be programmatically determined.

    WCAG SC 3.1.2 Language of Parts: The human language of each passage or phrase in the content can be programmatically determined except for proper names, technical terms, words of indeterminate language, and words or phrases that have become part of the vernacular of the immediately surrounding text.

    15. Language

    12. Page Titles, Frames, and iFrames

    Page Titles

    Identify Content

    All web pages.

    Test Conditions 12.A and 12.B always apply.

    Check that:

    Test Name Test ID Test Condition / How to Test
    2.4.2-one-page-title-defined 12.A

    One <title> element is defined for the web page.

    Applicability:

    This Test Condition always applies – you may NOT evaluate the condition as DOES NOT APPLY (DNA).

    How to Test:

    1. Launch ANDI: structures. Review the alerts in ANDI’s “Accessibility Alerts” section to determine whether ANDI displays any of the following Invalid HTML Alerts:

      1. “Page has more than one <title> tag”

      2. “Page has no <title>”

      3. “Page <title> cannot be empty”

    Evaluate Results:

    If the following is TRUE, then the Test Condition is TRUE and the content PASSES:

    1. One Page Title is defined for the web page.

    2.4.2-page-title-purpose 12.B

    The <title> element identifies the contents or purpose of the web page.

    Applicability:

    This Test Condition always applies – you may NOT evaluate the condition as DOES NOT APPLY (DNA).

    How to Test:

    1. Launch ANDI: structures, then select “more details”, then "page title."

      1. A modal dialog box will appear with the identified page title listed.

    2. Evaluate the purpose and content of the web page.

    3. Determine whether the Page Title is a meaningful representation or indication of page content.

      1. If the web page is part of a set of web pages, determine whether the Page Title is sufficient to distinguish the web page from other pages.

      2. For documents or web applications, the name of the document or web application would be sufficient to describe the purpose of the page.

    Evaluate Results:

    If ALL of the following are TRUE, then the Test Condition is TRUE and the content PASSES:

    1. The Page Title accurately identifies the contents or purpose of the web page, AND

    2. If the web page is part of a set of web pages, the Page Title accurately distinguishes the web page from other pages in the web site.

    Note:

    • A web application is an application that runs in a web browser (such as webmail) and may not have a URL that changes as content on the web page changes.
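
    Example (a minimal illustrative sketch; the site and page names are hypothetical): a title that identifies the page and distinguishes it within the site:

      <head>
        <title>Contact Us - Example Agency</title>
      </head>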

    Frames

    Identify Content

    Use ANDI to identify all Frames.

    1. Launch ANDI. If a Frame is used, ANDI will provide a notification that Frames have been detected.

      1. If there are no Frames, ANDI will provide no notification.

    If there is no such content, the result for the following test ID(s) is DOES NOT APPLY: 12.C.

    Check that:

    Test Name Test ID Test Condition / How to Test
    4.1.2-frame-title 12.C

    Each <frame> has a title attribute that describes its content.

    Applicability:

    This Test Condition DOES NOT APPLY (DNA) if no Frames are identified.

    How to Test:

    1. Launch ANDI. If a Frame is used, ANDI will provide a notification that Frames have been detected. Select ‘Cancel’ to test an individual frame.

    2. ANDI will provide a list of each Frame, and if available, the associated title attribute.

      1. If there is no title attribute, ANDI will reveal an “alert”.

    3. Select each link to access each frame. Review the frame to understand its content. Use the browser’s Back button to return to the page being tested and launch ANDI again.

    4. In ANDI, review each frame’s corresponding title attribute for a meaningful description of content.

    Evaluate Results:

    If the following is TRUE, then the Test Condition is TRUE and the content PASSES:

    1. Each frame has a title attribute that describes its content.

    Note:

    • The <frame> element is obsolete in HTML5; however, testers may still encounter web pages and/or web applications with code that, while outdated, can and should still be accessible.

    • Frame content must be tested for conformance with all other applicable tests.

      • To open each Frame to test: Launch ANDI and select “Cancel”. A list of Frames will show; select the link to test an individual frame.
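
    Example (a minimal illustrative sketch; file names and titles are hypothetical): frames whose title attributes describe their content:

      <frameset cols="25%,75%">
        <frame src="navigation.html" title="Site navigation menu">
        <frame src="content.html" title="Main document content">
      </frameset>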

    iFrames

    Identify Content

    Use ANDI to identify all iframes in the tab order.

    1. Launch ANDI: iframes to navigate to and highlight iframes on the page. If there is not an option for the iframes module, there are no iframes on the web page.

    2. Use the “Next Element” button to find all iframes that have the following listed in the Accessibility Component:

    • a tabindex value that is not negative (e.g. 0, 1) meaning the iframe is in the tab order (a negative tabindex such as -1 means that the iframe is not in the tab order)

      OR

    • no tabindex shown.

      Test only these <iframe> elements.

    If there is no such content, the result for the following test ID(s) is DOES NOT APPLY: 12.D.

    Check that:

    Test Name Test ID Test Condition / How to Test
    4.1.2-iframe-name 12.D

    The combination of accessible name and description for each <iframe> describes its content.

    Applicability:

    This Test Condition DOES NOT APPLY (DNA) if no iframes are identified or if none of the iframes are in the tab order (i.e., the tabindex is a negative number).

    How to Test:

    1. Launch ANDI: iframes

    2. Review the ANDI Output for each iframe that has a non-negative tabindex value to determine whether the accessible name and description accurately describe the content of each <iframe>.

    Evaluate Results:

    If the following is TRUE, then the content PASSES:

    1. The ANDI Output for each <iframe> in the tab order sufficiently describes its content.

    Note:

    • All iframe content must be tested for conformance with all other applicable tests. To open each iframe to test: Within ANDI: iframe, select the button to “test in new tab”
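
    Example (a minimal illustrative sketch; the URL and title are hypothetical): an iframe in the tab order whose accessible name describes its content:

      <iframe src="https://www.example.com/holiday-schedule" title="Facility hours and holiday schedule"></iframe>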

    Applicable Standards

    Section 508/WCAG Success Criteria Baseline Requirements
    WCAG SC 2.4.2 Page Titled: Web pages have titles that describe topic or purpose.

    11. Page Titles

    WCAG SC 4.1.2 Name, Role, Value: For all user interface components (including but not limited to: form elements, links and components generated by scripts), the name and role can be programmatically determined; states, properties, and values that can be set by the user can be programmatically set; and notification of changes to these items is available to user agents, including assistive technologies.

    19. Frames and iFrames

    13. Sensory Characteristics and Contrast

    Use of Color

    Identify Content

    Identify content that relies on color to convey meaning, such as to indicate an action, prompt a response, or distinguish a visual element.

    If there is no such content, the result for the following test ID(s) is DOES NOT APPLY: 13.A.

    Check that:

    Test Name Test ID Test Condition / How to Test
    1.4.1-color-meaning 13.A

    Color is not used as the only visual means of conveying information, indicating an action, prompting a response, or distinguishing a visual element.

    Applicability:

    This Test Condition DOES NOT APPLY (DNA) if no content relies on color to convey meaning.

    How to Test:

    1. Determine whether color is the only method used to convey information (e.g., review the onscreen text for a full description and/or look for other visual cues).

    Evaluate Results:

    If the following is TRUE, then the content PASSES:

    1. When color is used to convey information, indicate an action, prompt a response, or distinguish a visual element, another visual, onscreen method that does not use color is also used to convey the information.

    Note:

    • Alternate text that appears on mouse-over of a visual element is not considered to be “onscreen text.”

    • An error indicator cannot use color alone as an indicator.

    • A visited link changing color is controlled by a browser setting and is not failed under 13.A.
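
    Example (a minimal illustrative sketch; field names, classes, and wording are hypothetical): an error state conveyed by color plus an icon and on-screen text, not by color alone:

      <label for="email">Email address</label>
      <input type="email" id="email" name="email" class="field-error" aria-describedby="email-error">
      <p id="email-error" class="error-text">&#9888; Error: Enter an email address in the format name@example.com.</p>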

    Use of Sensory Characteristics

    Identify Content

    Identify instructions for understanding and operating content that use sensory information to convey information, e.g., references to shape, size, visual location, orientation, or sound.

    If there is no such content, the result for the following test ID(s) is DOES NOT APPLY: 13.B.

    Check that:

    Test Name Test ID Test Condition / How to Test
    1.3.3-sensory-info 13.B

    Instructions provided for understanding and operating content do not rely solely on sensory characteristics of components, such as shape, size, visual location, orientation, or sound.

    Applicability:

    This Test Condition DOES NOT APPLY (DNA) if the page’s instructions do not rely on sensory information.

    How to Test:

    1. Determine if the instructions using sensory characteristics provide details that allow content to be located, identified, understood, and operated without any knowledge of its shape, size, orientation, or relative position.

    2. Check for any auditory cues that are provided for instructions or operating content.

      1. Ensure sound is not muted while testing for the presence of instructions and operating content.

      2. Determine whether other visual and/or textual cues are also provided.

    Evaluate Results:

    If the following is TRUE, then the Test Condition is TRUE and the content PASSES:

    1. When instructions use shape, size, location, orientation, or sound to convey meaning, another method that does not rely on sensory characteristics is provided.

    Note:

    • Part of testing 13.A for 1.4.1-color-meaning is to ensure color alone is not used to convey information, including when color alone is used to provide instructions or operating procedures. Likewise, other sensory characteristics cannot be used alone to provide instructions or operating procedures. Color may be used in combination with shape, size, visual location, orientation, or sound to meet this requirement. If instructions or operating procedures:

      • rely solely on color, then the content FAILS for 13.A for 1.4.1-color-meaning.

      • rely solely on other sensory information (such as shape, size, visual location, orientation, or sound), then the content FAILS for 13.B for 1.3.3-sensory-info.

    Color Contrast

    Identify Content

    Identify ALL text AND images of text.

    EXCLUDE text that is:

    • In logotypes: logo or brand name

    • For inactive (disabled) user interface components

    • For purely decorative purposes and not meaningful, i.e., having no functionality

    • Contained within a picture that contains significant other visual content

    Note:

    • Some text may not initially be visible on the page, including text that becomes visible on mouseover or when an element receives focus. Nevertheless, the text must still conform to the color contrast requirement, wherever it occurs.

    • Visited text links must also provide sufficient contrast.

    If there is no such content, the result for the following test ID(s) is DOES NOT APPLY: 13.C.

    Check that:

    Test Name Test ID Test Condition / How to Test
    1.4.3-contrast 13.C

    The visual presentation of text and images of text have sufficient contrast.

    Applicability:

    This Test Condition DOES NOT APPLY (DNA) if the page has no visible text or images of text.

    How to Test:

    1. Launch ANDI: color contrast.

    2. Review any “Contrast Alerts” in ANDI’s “Accessibility Alerts” section to identify any text that fails to meet the minimum contrast ratio.

    3. In ANDI’s “Accessibility Alerts” section, identify any “Manual Contrast Tests Needed.”

      1. If the text is not selectable or appears on a background image, determine the contrast using the Colour Contrast Analyser.

      2. Open the Colour Contrast Analyser tool, select the Foreground color-dropper button, and click a pixel in the text font.

      3. Select the Background color-dropper button, and click a pixel in the background close to the text. If the background is varied in appearance or color, choose a pixel that provides the least contrast.

      4. Identify the Contrast Ratio

      5. Compare the contrast ratio against the minimum required contrast ratio identified in the ANDI Contrast Ratio output.

    4. If the page contains an image of text alone, or an image with text and no other significant content, test the image of the text with the CCA to determine the contrast ratio between the foreground (text) and background.
      Note: This is for instances where it is impossible for the ANDI: color contrast module to detect the presence of the text.

      1. Select the graphics/images module in ANDI.

      2. Use the ANDI arrow buttons to identify any images of text or images with text and no other significant content.

      3. Open the CCA and test the contrast of the text in the image.

        1. Select the Foreground color-dropper button and click a pixel in the text font.

        2. Select the Background color-dropper button and click a pixel in the background close to the text. If the background is varied in appearance or color, choose a pixel that provides the least contrast.

        3. Determine whether the resulting contrast ratio is at least 4.5:1.

    Evaluate Results:

    If any of the following is TRUE, then the Test Condition is TRUE and the content PASSES:

    1. The contrast between the text and its background is equal to or greater than the minimum required contrast ratio identified in the ANDI Contrast Ratio output, OR

    2. If the text is an image of text, the contrast between the image of text and its background is equal to or greater than 4.5:1 as identified using the Colour Contrast Analyser.

    Applicable Standards

    Section 508/WCAG Success Criteria Baseline Requirements

    WCAG SC 1.4.1 Use of Color: Color is not used as the only visual means of conveying information, indicating an action, prompting a response, or distinguishing a visual element.

    WCAG SC 1.3.3 Sensory Characteristics: Instructions provided for understanding and operating content do not rely solely on sensory characteristics of components such as shape, size, visual location, orientation, or sound.

    WCAG SC 1.4.3 Contrast (minimum): The visual presentation of text and images of text has a contrast ratio of at least 4.5:1, except for the following:

    • Large Text: Large-scale text and images of large-scale text have a contrast ratio of at least 3:1;

    • Incidental: Text or images of text that are part of an inactive user interface component, that are pure decoration, that are not visible to anyone, or that are part of a picture that contains significant other visual content, have no contrast requirement.

    • Logotypes: Text that is part of a logo or brand name has no minimum contrast requirement.

    7. Sensory Characteristics

    14. Tables

    Data Tables

    Identify Content

    Identify all data tables (including images of data tables) where data cell(s) require header(s) for understanding.

    If there is no such content, the result for the following test ID(s) is DOES NOT APPLY: 14.A and 14.B.

    Note:

    • To assist with identification of data tables, use ANDI:

      1. Launch ANDI: structures, then select the “reading order” link.

      2. The markup added to the page indicates the order that assistive technology will read the content

    • Data tables are those tables where information in a cell requires a row or column header to adequately describe the cell's contents. The reading order of a data table will not be sensible when read without the row and/or column headers. The content in a data table would be best understood when it is NOT read in the order that ANDI marks on the page.

    • EXCLUDE content that does not require a row or column header for understanding:

      • Layout tables are used for placement of components on the page for visual aesthetics without an informational relationship between headers and the information in the data cells. Content is understandable when read in the marked reading order.

      • Where content is visually presented in a table but is understandable when read in the marked reading order, the content does not require a data table structure. This may occur when a CSS technique has been used to visually present information.

    Check that:

    Test Name Test ID Test Condition / How to Test
    1.3.1-table-identification 14.A

    Each data table has programmatic markup to identify it as a table.

    Applicability:

    This Test Condition DOES NOT APPLY (DNA) if there are no data tables on the page.

    How to Test:

    1. Launch ANDI: tables.

      1. Determine whether ANDI detects and identifies the data table(s).

        1. If the tables module does not display as an option in the modules selection list, then ANDI has not detected any table programmatically on the page (meaning any content presented visually in a table does not have programmatic markup to identify it as a table).

        2. If the ANDI: tables module is available, use ANDI’s “Analyze Next Table” button to sequentially highlight the detected tables on the page. If the table in question is not outlined by ANDI and/or it is not possible to navigate to the table using ANDI, then ANDI has not detected that table programmatically (meaning the content presented visually in that table does not have programmatic markup to identify it as a table).

    2. Review any data tables that use role=”presentation”.

      1. ANDI will display role=”presentation” in the Element information and/or under Accessibility Alerts.

      2. A data table that includes role=”presentation” will not convey the table semantics to a screen reader and would fail this test.

    3. Review the ANDI Output for ARIA table markup alerts; ANDI will display an alert whenever ARIA role=”table” is not coded correctly.

    4. Use the Analyze Previous/Next Table buttons in ANDI to navigate to each of the identified tables on the page.

    Evaluate Results:

    If ALL of the following are TRUE, then the content PASSES:

    1. It is possible to navigate in ANDI: tables to each data table using the ANDI Analyze Previous/Next Table buttons, AND

    2. The data table DOES NOT have an ARIA role=”presentation” assigned, AND

    3. The data table DOES NOT have any ANDI Table alerts for incorrect use of ARIA table attributes.
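    For illustration only (not part of the test steps), the sketch below contrasts hypothetical markup that ANDI would detect as a table with markup that would fail 14.A because role="presentation" removes the table semantics.

        <!-- Would pass 14.A: programmatic <table> markup, detectable in ANDI: tables -->
        <table>
          <tr><th>Month</th><th>Total</th></tr>
          <tr><td>January</td><td>120</td></tr>
        </table>

        <!-- Would fail 14.A: role="presentation" on a data table strips the table
             semantics, so a screen reader no longer announces it as a table -->
        <table role="presentation">
          <tr><th>Month</th><th>Total</th></tr>
          <tr><td>January</td><td>120</td></tr>
        </table>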

    1.3.1-cell-header-association 14.B

    All data cells are programmatically associated with relevant headers.

    Applicability:

    This Test Condition DOES NOT APPLY (DNA) if there are no data tables on the page.

    How to Test:

    1. Continue from Test 14.A.

    2. Navigate to each data cell with ANDI: tables.

    3. Inspect the ANDI Output for each data cell and/or inspect the visual highlighting of the data table to determine whether the table identifies all relevant headers for each data cell.

    Evaluate Results:

    If the following is TRUE, then the Test Condition is TRUE and the content PASSES:

    1. The data table appropriately identifies header relationships for each data cell.

    Note:

    • Any changes to data tables that occur automatically or as a result of interaction with the page should be included in this test.
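    For illustration only (not part of the test steps), the following hypothetical markup shows two common ways data cells are programmatically associated with their headers; the table contents are assumptions.

        <!-- Simple data table: each data cell is associated with its column header
             via scope="col" on the header cells -->
        <table>
          <tr>
            <th scope="col">Employee</th>
            <th scope="col">Office</th>
          </tr>
          <tr>
            <td>Lee</td>
            <td>Denver</td>
          </tr>
        </table>

        <!-- More complex tables can make the associations explicit with headers/id -->
        <table>
          <tr>
            <th id="emp">Employee</th>
            <th id="off">Office</th>
          </tr>
          <tr>
            <td headers="emp">Lee</td>
            <td headers="off">Denver</td>
          </tr>
        </table>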

    Layout Tables

    Identify Content

    Identify any programmatic tables where the table structure is used purely for layout purposes.

    EXCLUDE data tables.

    Note:

    • To find programmatic tables on the page, use ANDI: tables (as in the previous test).

      1. If the ANDI: tables module does not display as an option in the modules selection list, then there is no programmatic table on the page.

    • To assist with identifying layout tables:

      1. Launch ANDI: structures, then select the “reading order” link.

      2. The markup added to the page indicates the order that assistive technology will read the content.

    • Content that is within a layout table does not require row or column headers for understanding. This content should be sensible when read in the reading order identified by ANDI.

    If there is no such content, the result for the following test ID(s) is DOES NOT APPLY: 14.C.

    Check that:

    Test Name Test ID Test Condition / How to Test
    1.3.1-layout-table-structure 14.C

    The layout table DOES NOT designate itself as a table using ARIA role=”table” AND DOES NOT include table header structure and relationship elements and/or associated attributes.

    Applicability:

    This Test Condition DOES NOT APPLY (DNA) if there are no layout tables on the page.

    How to Test:

    1. Continue from Test 14.A.

      1. If the ANDI: tables module does not display as an option in the modules selection list, then ANDI has not detected a table programmatically on the page, and there is no content on the page that has programmatic markup to identify it as a table.

      2. If the ANDI: tables module is available, use ANDI’s “Analyze Next Table” button to outline the detected tables on the page. If the table in question is not outlined by ANDI and/or it is not possible to navigate to the table using ANDI, then ANDI has not detected that table programmatically (meaning the content presented visually in that table does not have programmatic markup to identify it as a table).

    2. Inspect the “Element” output in ANDI to determine whether the layout table uses role=”table”.

    3. Inspect the ANDI output and any associated alerts to determine whether a <table> includes header structure elements and/or attributes (e.g., <th>, scope=”row”).

      1. If a table has an ARIA role=”presentation” assigned and the table also denotes header relationships (e.g., using <th>, scope=”row”) ANDI will provide a corresponding alert; ignore this alert on a layout table. A table that includes role="presentation" will not convey the table semantics to a screen reader. Therefore, the table structure semantics (e.g., <th>, scope=”row”) can be ignored if the table is indeed a layout table.

    Evaluate Results:

    If any of the following is TRUE, then the Test Condition is TRUE and the content PASSES:

    1. ANDI DOES NOT detect the layout as a table (i.e., the layout is not programmatically identified as a table), OR

    2. The <table> element includes the attribute role=”presentation,” OR

    3. BOTH of the following are TRUE:

      1. The layout DOES NOT use role=”table” or any associated ARIA table attributes (e.g., role=”row”, role=”columnheader”), AND

      2. The layout DOES NOT include table structure and relationship elements or associated attributes (e.g., <th>, scope=”row”)
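    For illustration only (not part of the test steps), the hypothetical markup below shows a layout table that would satisfy the 14.C pass conditions and one that would not; the cell contents and file names are assumptions.

        <!-- Would pass 14.C: role="presentation" and no header structure markup -->
        <table role="presentation">
          <tr>
            <td><img src="logo.png" alt="Agency logo"></td>
            <td>Welcome text positioned beside the logo.</td>
          </tr>
        </table>

        <!-- Would fail 14.C: a layout table that uses <th> and scope implies
             header relationships that do not actually exist -->
        <table>
          <tr>
            <th scope="col">Navigation</th>
            <th scope="col">Content</th>
          </tr>
        </table>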

    Applicable Standards

    Section 508/WCAG Success Criteria Baseline Requirements

    WCAG SC 1.3.1 Info and Relationships: Information, structure, and relationships conveyed through presentation can be programmatically determined or are available in text.

    WCAG SC 1.3.2 Meaningful Sequence: When the sequence in which content is presented affects its meaning, a correct reading sequence can be programmatically determined.

    WCAG SC 4.1.2 Name, Role, Value: For all user interface components (including but not limited to: form elements, links and components generated by scripts), the name and role can be programmatically determined; states, properties, and values that can be set by the user can be programmatically set; and notification of changes to these items is available to user agents, including assistive technologies.

    12. Tables

    15. CSS Content and Positioning

    CSS Content

    Identify Content

    Use ANDI to identify all meaningful page content inserted by CSS using ::before or ::after. Include inline styles.

    1. Launch ANDI: hidden content.

    2. Determine whether “content ::before ::after” is revealed as an option.

      1. If there is no page content inserted using ::before or ::after, this option will not show.

    If there is no such content, the result for the following test ID(s) is DOES NOT APPLY: 15.A.

    Check that:

    Test Name Test ID Test Condition / How to Test
    1.3.1-meaningful-content-css-before-after 15.A

    For the meaningful content provided via CSS pseudo-elements ::before and ::after, equivalent information is available in another way.

    Applicability:

    This Test Condition DOES NOT APPLY (DNA) if there is no content OR no meaningful content inserted using ::before and ::after.

    How to Test:

    1. Launch ANDI: hidden content and select content ::before ::after to reveal all content.

      1. Content inserted using ::before ::after will be outlined in red.

    2. Review all content inserted using ::before and ::after and find all meaningful content.

      1. If there is no meaningful content, mark 15.A DNA.

      2. If meaningful content is found, continue to the next step.

    3. Review the web page and determine whether for every instance of CSS content highlighted as using ::before ::after, the information conveyed by the CSS content is available another way.

    Evaluate Results:

    If the following is TRUE, then the Test Condition is TRUE and the content PASSES:

    1. All meaningful content inserted using ::before and ::after has equivalent information presented in another way.
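    For illustration only (not part of the test steps), the hypothetical example below shows meaningful CSS-inserted content that would pass 15.A because the same information is available another way; the class names and field are assumptions.

        <!-- Hypothetical example: a required-field marker inserted with CSS ::after -->
        <style>
          .required > label::after { content: " *"; }
        </style>

        <!-- Passes 15.A only because the same information is also available another
             way: the visible label text says "(required)" and the field is marked
             required programmatically -->
        <p class="required">
          <label for="email">Email address (required)</label>
          <input type="email" id="email" required>
        </p>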

    CSS Positioning

    Identify Content

    Use ANDI to identify all content positioned with CSS and inline styles.

    1. Launch ANDI, then select the Advanced Settings button; then select “linearize page.”

    2. If content is positioned with CSS, the information will be displayed with blue highlighting around the elements and those elements will be placed in the page in the same order in which they appear in the page code.

    If there is no such content, the result for the following test ID(s) is DOES NOT APPLY: 15.B. and 15.C.

    Check that:

    Test Name Test ID Test Condition / How to Test
    1.3.2-content-order-CSS-position 15.B

    The reading order of the content (in context) is correct without CSS positioning.

    Applicability:

    This Test Condition DOES NOT APPLY (DNA) if there is no content positioned using CSS.

    How to Test:

    1. Review all highlighted, linearized content.

    2. Determine whether the reading order of content is still understandable after linearization.

    Evaluate Results:

    If the following is TRUE, then the content PASSES:

    1. The reading order of the content (in context) is correct without CSS positioning.

    Note:

    • ANDI: structures and the “reading order” link can also be used to reveal the reading order prior to linearizing the content, but this will not identify which content has been positioned with CSS.

    1.3.2-content-meaning-CSS-position 15.C

    The meaning of the content (in context) is preserved without CSS positioning.

    Applicability:

    This Test Condition DOES NOT APPLY (DNA) if there is no content positioned using CSS.

    How to Test:

    1. Continue from Test 15.B.

    2. Review the content highlighted with linearization markup.

    3. Determine whether the linearized position of the content preserves the meaning of the content (in context) when in its original position on the page. If necessary, toggle the linearization button to view the original position of the content.

    Evaluate Results:

    If the following is TRUE, then the Test Condition is TRUE and the content PASSES:

    1. The meaning of content (in context) does not rely on CSS positioning of that content.
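    For illustration only (not part of the test steps), the hypothetical example below shows CSS-positioned content whose linearized reading order would be incorrect; the class names and text are assumptions.

        <!-- Would FAIL 15.B: the source order puts Step 2 before Step 1, and CSS
             absolute positioning makes Step 1 appear first visually. When ANDI
             linearizes the page, the content reads "Step 2 ... Step 1", which is
             not a correct reading order. -->
        <style>
          .steps    { position: relative; }
          .step-one { position: absolute; top: 0; }  /* visually moved above Step 2 */
          .step-two { margin-top: 3em; }
        </style>
        <div class="steps">
          <p class="step-two">Step 2: Enter your PIN.</p>
          <p class="step-one">Step 1: Enter your account number.</p>
        </div>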

    Applicable Standards

    Section 508/WCAG Success Criteria Baseline Requirements

    WCAG SC 1.3.1 Info and Relationships: Information, structure, and relationships conveyed through presentation can be programmatically determined or are available in text.

    WCAG SC 1.3.2 Meaningful Sequence: When the sequence in which content is presented affects its meaning, a correct reading sequence can be programmatically determined.

    18. CSS Content and Positioning

    16. Pre-Recorded Audio-Only, Video-Only, and Animations

    Pre-Recorded Audio-Only

    Identify Content

    Identify all pre-recorded audio-only content (including audio-only content that plays when activated).

    EXCLUDE any audio-only content that:

    • Is clearly labeled as a media alternative for text, OR

    • Consists of short sounds used to notify the user, such as confirmation beeps and error notifications.

    If there is no such content, the result for the following test ID(s) is DOES NOT APPLY: 16.A.

    Note:

    • If audio is synchronized with video, slides, animations, or other time-based visual media, use the synchronized media tests in Section 17.

    Check that:

    Test Name Test ID Test Condition / How to Test
    1.2.1-audio-transcript-text 16.A

    A text-based alternative is provided for audio-only content that provides an accurate and complete representation of the audio-only content.

    Applicability:

    This Test Condition DOES NOT APPLY (DNA) if there is no pre-recorded audio-only content.

    How to Test:

    1. Determine if, for each audio-only content, a transcript is provided.

      1. Determine whether each transcript is text, (i.e., an image of a transcript would not be sufficient to pass this test).

    2. Play the audio-only content entirely while reviewing the transcript.

    3. Determine whether the information in the transcript is an accurate, correctly sequenced, and complete representation of the audio-only content.

      1. To be a complete representation of the content, the transcript must also describe relevant sounds in addition to dialogue, such as doors banging, sirens wailing, identification of speakers in dialogue, etc.

    Evaluate Results:

    If ALL of the following are TRUE, then the content PASSES:

    1. A text-based transcript is provided for all audio-only content, AND

    2. The transcript is an accurate and complete representation of the audio-only content.
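    For illustration only (not part of the test steps), the hypothetical snippet below shows audio-only content with a text-based transcript on the same page; the file name, speaker, and dialogue are assumptions. Note that the transcript identifies the speaker and describes relevant non-speech sounds.

        <!-- Hypothetical example: audio-only content with a text transcript available
             on the same page (an image of a transcript would not pass) -->
        <audio controls src="director-welcome.mp3"></audio>
        <h2>Transcript</h2>
        <p>[Door opens; footsteps]</p>
        <p>Director: Welcome to the new employee orientation.</p>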

    Pre-Recorded Video-Only

    Identify Content

    Identify all pre-recorded video-only content.

    • In a video-only presentation, information is presented in a variety of ways including animation, text or graphics, the setting and background, the actions and expressions of people, animals, etc.

    EXCLUDE any video-only intended as a media alternative for text if it is clearly labeled as such.

    If there is no such content, the result for the following test ID(s) is DOES NOT APPLY: 16.B.

    Note:

    • If the video is accompanied by timed sounds or meaningful dialog, it is not video-only; use the synchronized media tests in section 17.

    Check that:

    Test Name Test ID Test Condition / How to Test
    1.2.1-video-alternative-equivalent 16.B

    The video-only content information is also available through an equivalent text or audio alternative.

    Applicability:

    This Test Condition DOES NOT APPLY (DNA) if there is no pre-recorded video-only content.

    How to Test:

    1. Determine whether a text or audio alternative is provided for all video-only content (such as a transcript in text that provides a description of video content and actions, or an audio track).

    Note: An image of a transcript does not meet this requirement.

    2. Play the video-only content entirely while reviewing the text or audio alternative.

    3. Determine whether the information in the text or audio alternative includes the same information that the video-only presentation displays (e.g., if the video includes multiple characters, the alternative must identify which character is associated with each depicted action).

    Evaluate Results:

    If ALL of the following are TRUE, then the content PASSES:

    1. A text or audio alternative is provided for all video-only content, AND

    2. The text or audio alternative is an accurate and complete representation of the video-only content.

    Applicable Standards

    Section 508/WCAG Success Criteria Baseline Requirements

    WCAG SC 1.2.1 Audio-Only and Video-Only (Prerecorded): For prerecorded audio-only and prerecorded video-only media, the following are true, except when the audio or video is a media alternative for text and is clearly labeled as such:

    • Prerecorded Audio-only: An alternative for time-based media is provided that presents equivalent information for prerecorded audio-only content.

    • Prerecorded Video-only: Either an alternative for time-based media or an audio track is provided that presents equivalent information for prerecorded video-only content.

    16. Audio-Only and Video-Only

    17. Synchronized Media

    Pre-Recorded Synchronized Media

    Identify Content

    Identify any pre-recorded synchronized multimedia content.

    EXCLUDE media that is clearly identified as a media alternative for text.

    If there is no such content, the result for the following test ID(s) is DOES NOT APPLY: 17.A and 17.B.

    Check that:

    Test Name Test ID Test Condition / How to Test
    1.2.2-captions-equivalent 17.A

    The multimedia provides accurate captions for the audio content.

    Applicability

    This Test Condition DOES NOT APPLY (DNA) if there is no pre-recorded synchronized media.

    How to Test:

    1. Enable captions through the multimedia player functions and play the media.

      1. A separate media file with captions may be provided to meet this requirement (i.e., the captioned media version is a different file). If provided, test that one.

    2. Listen to the audio of the entire synchronized media. Compare the audio to the captions for accuracy, time-synchronization, and equivalence.

      1. Captions should include all dialogue and equivalents for non-dialogue audio information needed to understand the program content, including sound effects, music, laughter, speaker identification and location.

      2. The definition of captions includes synchronization. If they are not synchronized, they are not considered captions.

    Evaluate Results:

    If ALL of the following are TRUE, then the Test Condition is TRUE and the content PASSES:

    1. Captions are provided for all multimedia content, AND

    2. Captions are accurate and include all dialogue and equivalents for non-dialogue audio information needed to understand the program content, including sound effects, music, laughter, speaker identification, and location, AND

    3. All other relevant information in the video is clearly visible (not obstructed by captions) when captions are enabled.

    Note:

    • Transcripts and non-synchronized alternatives alone will not meet this requirement.
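    For illustration only (not part of the test steps), the hypothetical snippet below shows one common way synchronized captions are delivered through a player's caption control, using a WebVTT track; the file names and cue text are assumptions.

        <!-- Hypothetical example: synchronized captions exposed through the player's
             caption control via a WebVTT track -->
        <video controls>
          <source src="orientation.mp4" type="video/mp4">
          <track kind="captions" src="orientation-captions.vtt" srclang="en"
                 label="English captions" default>
        </video>

        <!-- Contents of the hypothetical orientation-captions.vtt file: cues are
             time-synchronized and include non-dialogue audio information -->
        WEBVTT

        00:00:01.000 --> 00:00:04.000
        [upbeat music]

        00:00:05.000 --> 00:00:08.000
        Narrator: Welcome to the agency orientation video.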

    1.2.5-audio-description-equivalent 17.B

    The multimedia provides an equivalent soundtrack (combination of narration and audio descriptions) for the video content.

    Applicability

    This Test Condition DOES NOT APPLY (DNA) if there is no pre-recorded synchronized media.

    How to Test:

    1. Enable audio descriptions through multimedia player and play the media.

      1. Audio descriptions are narration added to or combined with the soundtrack to describe important visual details that cannot be understood from the main soundtrack alone.

      2. A separate media file with audio description may be provided to meet this requirement (i.e., the audio-described media version is a different file). If provided, test that one.

    2. Identify visual content that requires narrative descriptions.

    3. Determine whether the main soundtrack combined with audio descriptions adequately describe important visual details (actions, characters, scene changes, onscreen text, etc.) for a viewer who is unable to see the content.

      1. If the primary audio adequately describes important visual content in the media, including information about actions, characters, scene changes, onscreen text, speaker identification and location, and other visual content, additional audio description is not necessary.

    4. Compare the video to the combined soundtrack and review the soundtrack for accuracy, time-synchronization, and equivalence.

      1. Audio descriptions are inserted during pauses in dialogue. Exact synchronization may not be possible, but the description should be provided as close in time as possible so that meaning is preserved.

    Evaluate Results:

    If the following is TRUE, then the content PASSES:

    1. The soundtrack (combination of audio descriptions and narration) adequately describes important visual content in the media, including information about actions, characters, scene changes, onscreen text, and other visual content.

    Note:

    • Transcripts and non-synchronized alternatives alone will not meet this requirement.
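    For illustration only (not part of the test steps), the hypothetical snippet below shows audio description provided as a separate, clearly labeled media file, which step 1.2 above notes may be used to meet this requirement; the file names and titles are assumptions.

        <!-- Hypothetical example: audio description provided as a separate,
             clearly labeled version of the same video -->
        <h2>New Employee Orientation</h2>
        <video controls src="orientation.mp4"></video>
        <p>
          <a href="orientation-described.mp4">Play the audio-described version of this video</a>
        </p>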

    Live Synchronized Media

    Identify Content

    Identify any live synchronized multimedia content. These requirements are only intended for broadcast of synchronized media.

    EXCLUDE two-way multimedia calls between two or more individuals through web apps.

    If there is no such content, the result for the following test ID(s) is DOES NOT APPLY: 17.C.

    Check that:

    Test Name Test ID Test Condition / How to Test
    1.2.4-captions-live-equivalent 17.C

    The live multimedia provides accurate captions for the audio content.

    Applicability

    This Test Condition DOES NOT APPLY (DNA) if there is no live synchronized media.

    How to Test:

    1. Enable captions through multimedia player functions.

    2. Listen to the audio of the synchronized media. Compare the audio to the captions for accuracy, time-synchronization, and equivalence.

      1. Lower accuracy of captions for live broadcasts may be acceptable due to limitations of real-time caption capabilities.

      2. The definition of captions includes synchronization. If they are not synchronized, they are not considered captions.

    Evaluate Results:

    If ALL of the following are TRUE, then the content PASSES:

    1. Captions are provided for all live multimedia, AND

    2. All captions are accurate, AND

    3. Any discrepancies between the captions and the audio output are minor in nature and do not significantly impact understanding (applicable to live captioning only).

    Media Player Controls

    Identify Content

    Identify any media player used to display synchronized video and audio content.

    If there is no such content, the result for the following test ID(s) is DOES NOT APPLY: 17.D to 17.F.

    Check that:

    Test Name Test ID Test Condition / How to Test
    503.4-caption-description-controls 17.D

    The media player provides user controls for closed captions and audio descriptions.

    Applicability:

    This Test Condition DOES NOT APPLY (DNA) if there is no media player.

    How to Test:

    1. Locate the controls for selection of closed captions and audio descriptions.

    Evaluate Results:

    If the following is TRUE, then the content PASSES:

    1. The media player provides user controls for closed captions AND audio descriptions.

    Media Player – Caption Controls at Volume Menu Level

    Identify Content

    Identify any media player with volume adjustment control(s).

    If there is no such content, the result for the following test ID(s) is DOES NOT APPLY: 17.E.

    Check that:

    Test Name Test ID Test Condition / How to Test
    503.4.1-caption-control 17.E

    User controls for captions are provided at the same menu level as the user controls for volume or program selection.

    Applicability:

    This Test Condition DOES NOT APPLY (DNA) if there is no media player or the media player does not have a volume adjustment control.

    How to Test:

    1. Continue from Test 17.D.

    2. Locate the user controls for volume selection.

    Evaluate Results:

    If the following is TRUE, then the content PASSES:

    1. The user controls for captions are provided at the same menu level as the volume controls or program selection controls.

    Media Player – AD Controls at Program Menu Level

    Identify Content

    Identify any media player with program selection control(s).

    If there is no such content, the result for the following test ID(s) is DOES NOT APPLY: 17.F.

    Check that:

    Test Name Test ID Test Condition / How to Test
    503.4.2-description-control 17.F

    User controls for audio descriptions are provided at the same menu level as the user controls for volume or program selection.

    Applicability:

    This Test Condition DOES NOT APPLY (DNA) if there is no media player or the media player does not have a program selection control.

    How to Test:

    1. Continue from Test 17.D.

    2. Locate the user controls for program selection.

    Evaluate Results:

    If the following is TRUE, then the content PASSES:

    1. The user controls for audio descriptions are at the same menu level as program selection controls or volume controls.

    Applicable Standards

    Section 508/WCAG Success Criteria Baseline Requirements

    WCAG SC 1.2.2 Captions (Prerecorded): Captions are provided for all prerecorded audio content in synchronized media, except when the media is a media alternative for text and is clearly labeled as such.

    WCAG SC 1.2.3 Audio Description or Media Alternative (Prerecorded): An alternative for time-based media or audio description of the prerecorded video content is provided for synchronized media, except when the media is a media alternative for text and is clearly labeled as such.

    WCAG SC 1.2.4 Captions (Live): Captions are provided for all live audio content in synchronized media.

    WCAG SC 1.2.5 Audio Description (Prerecorded): Audio description is provided for all prerecorded video content in synchronized media.

    Section 508 503.4.1 Caption Controls: Where user controls are provided for volume adjustment, ICT shall provide user controls for the selection of captions at the same menu level as the user controls for volume or program selection.

    Section 508 503.4.2 Audio Description Controls: Where user controls are provided for program selection, ICT shall provide user controls for the selection of audio descriptions at the same menu level as the user controls for volume or program selection.

    17. Synchronized Media

    18. Resize Text

    Textual Content

    Identify Content

    Identify all text on the page.

    EXCLUDE captions for synchronized media and images of text.

    If there is no such content, the result for the following test ID(s) is DOES NOT APPLY: 18.A.

    Check that:

    Test Name Test ID Test Condition / How to Test
    1.4.4-resize-text 18.A

    There is a mechanism to resize, scale, or zoom in on the text to at least 200% of its original size without loss of content or functionality.

    Applicability

    This Test Condition DOES NOT APPLY (DNA) if there is no text on the page.

    How to Test:

    1. Use built-in browser zoom functions to resize the text to at least 200%.

    2. If any of the content did not zoom using the built-in browser functions, determine whether there is a non-AT mechanism to resize page content to 200% of its original size, e.g., Operating System, platform, or other mechanism provided directly by the web page/application.

    Evaluate Results:

    If ALL of the following are TRUE, then the content PASSES:

    1. There is a non-AT-reliant mechanism that allows the user to resize text to at least 200% of its original size, AND

    2. Text is not clipped, truncated or obscured, AND

    3. All functionality is available, AND

    4. All content is available.
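    For illustration only (not part of the test steps), the CSS sketch below shows a pattern that tends to survive 200% zoom without clipping, and a pattern that commonly causes text to be clipped or truncated; the class names are assumptions.

        /* Hypothetical styles that tend to survive 200% zoom: relative units and
           no fixed height, so enlarged text can reflow and remain visible */
        .notice {
          font-size: 1rem;
          max-width: 40em;
          overflow: visible;
        }

        /* Pattern that often causes 18.A failures: a fixed pixel height combined
           with hidden overflow can clip text once it is enlarged to 200% */
        .notice-clipped {
          height: 20px;
          overflow: hidden;
        }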

    Applicable Standards

    Section 508/WCAG Success Criteria Baseline Requirements
    WCAG SC 1.4.4 Resize Text: Except for captions and images of text, text can be resized without assistive technology up to 200 percent without loss of content or functionality. 22. Resize Text

    19. Multiple Ways

    Web Page Access

    Identify Content

    Identify all web pages within a set of related web pages.

    EXCLUDE web pages that are the result of, or a step in, a process, such as an order confirmation form.

    If there is no such content, the result for the following test ID(s) is DOES NOT APPLY: 19.A.

    Check that:

    Test Name Test ID Test Condition / How to Test
    2.4.5-multiple-ways 19.A

    There are two or more ways to locate a web page within a set of web pages.

    Applicability:

    This Test Condition DOES NOT APPLY (DNA) if the web page is not within a set of related web pages OR the web page is a result of, or a step in, a process.

    How to Test:

    1. Determine whether there are two or more ways to locate the specific web page within a set of web pages; these may include (but are not limited to) techniques such as:

    1. site maps

    2. site search

    3. tables of contents

    4. navigation menus or dropdowns

    5. navigation trees

    6. links between pages

    Note: Additional techniques for locating a web page may be available beyond those listed in the test instructions.

    2. Verify that the identified techniques correctly function and lead to the web page within the site, for example:

    1. Links/menus lead to the corresponding pages of the site.

    2. The search form leads to the page(s) which contains the search term.

    Evaluate Results:

    If ALL of the following are TRUE, then the content PASSES:

    1. At least two techniques exist to locate the web page within the site, AND

    2. The techniques function correctly such that they lead to the correct web page.

    Applicable Standards

    Section 508/WCAG Success Criteria Baseline Requirements
    WCAG SC 2.4.5 Multiple Ways: More than one way is available to locate a Web page within a set of Web pages except where the Web Page is the result of, or a step in, a process. 23. Multiple Ways

    20. Parsing

    The result for test 20.A (4.1.1-parsing) should be recorded as NOT TESTED.

    Note:

    • Multiple requirements are specified for the Parsing requirement. To determine if requirements are met, a testing tool would be very helpful but is not available at this time. The test process will be updated when a testing tool is identified. Until then, the test result should be “Not Tested”.
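    Although the result is recorded as Not Tested, the hypothetical markup below illustrates what SC 4.1.1 addresses (complete start and end tags, proper nesting, no duplicate attributes, unique IDs); the element names and values are assumptions.

        <!-- Markup consistent with SC 4.1.1: complete start/end tags, proper
             nesting, no duplicate attributes, unique id values -->
        <ul id="main-nav">
          <li><a href="/home">Home</a></li>
          <li><a href="/contact">Contact</a></li>
        </ul>

        <!-- Markup that would conflict with SC 4.1.1: improperly nested end tags,
             an unclosed <li>, a duplicate class attribute, and a repeated id value -->
        <ul id="footer-nav">
          <li id="nav-item"><a href="/home">Home</li></a>
          <li id="nav-item" class="item" class="active"><a href="/contact">Contact</a>
        </ul>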

    Applicable Standards

    Section 508/WCAG Success Criteria Baseline Requirements

    WCAG SC 4.1.1 Parsing: In content implemented using markup languages, elements have complete start and end tags, elements are nested according to their specifications, elements do not contain duplicate attributes, and any IDs are unique, except where the specifications allow these features. 24. Parsing

    Appendix A: Test Process Mapping

    Test to Section 508/WCAG Requirement and Baseline Test (cross-reference table)

    Test ID / Test Name Section 508 / WCAG Requirement Baseline Test
    1.A / Alt-version-conformant Con.1 Conformance Requirement 1. Conformance Level 20. Alternate Versions
    1.B / Alt-version-equivalent Con.1 Conformance Requirement 1. Conformance Level 20. Alternate Versions
    1.C / Alt-version-access Con.1 Conformance Requirement 1. Conformance Level 20. Alternate Versions
    1.D / Alt-version-nc-access Con.1 Conformance Requirement 1. Conformance Level 20. Alternate Versions
    1.E / non-interference Con.5 Conformance Requirement 5. Non-Interference 25. Non-Interference
    2.A / 1.4.2-audio-control 1.4.2 Audio Control 21. Timed Events
    2.B / 2.2.2-blinking-moving-scrolling 2.2.2 Pause, Stop, Hide 21. Timed Events
    2.C / 2.2.2-auto-updating 2.2.2 Pause, Stop, Hide 21. Timed Events
    2.D / 4.1.2-change-notify-auto 4.1.2 Name, Role, Value 21. Timed Events
    20.A / 4.1.1-parsing 4.1.1 Parsing 24. Parsing
    3.A / 2.3.1-flashing 2.3.1 Three Flashes or Below Threshold 9. Flashing
    4.A / 2.1.1-keyboard-access 2.1.1 Keyboard 1. Keyboard Access
    4.B / 2.1.1-no-keystroke-timing 2.1.1 Keyboard 1. Keyboard Access
    4.C / 2.1.2-no-keyboard-trap 2.1.2 No Keyboard Trap 1. Keyboard Access
    4.D / 2.1.1-title-keyboard-access 2.1.1 Keyboard 2. Focus Visible
    4.E / 2.4.7-focus-visible 2.4.7 Focus Visible 2. Focus Visible
    4.F / 3.2.1-on-focus 3.2.1 On Focus 3. Focus Order
    4.G / 2.4.3-focus-order-meaning 2.4.3 Focus Order 3. Focus Order
    4.H / 2.4.3-focus-order-reveal 2.4.3 Focus Order 3. Focus Order
    4.I / 2.4.3-focus-order-return 2.4.3 Focus Order 3. Focus Order
    5.A / 3.3.2-input-instructions 3.3.2 Labels or Instructions 10. Forms
    5.B / 1.3.1-form-labels-cues

    1.3.1 Info and Relationships

    4.1.2 Name, Role, Value

    10. Forms
    5.C / 3.2.2-on-input 3.2.2 On Input 10. Forms
    5.D / 4.1.2-change-notify-form 4.1.2 Name, Role, Value 10. Forms
    5.E / 3.3.1-error-identification 3.3.1 Error Identification 10. Forms
    5.F / 3.3.3-error-suggestion 3.3.3 Error Suggestion 10. Forms
    5.G / 3.3.4-error-prevention 3.3.4 Error Prevention (Legal, Financial, Data) 10. Forms
    6.A / 2.4.4-link-purpose

    2.4.4 Link Purpose (In Context)

    4.1.2 Name, Role, Value

    14. Links
    6.B / 4.1.2-change-notify-links 4.1.2 Name, Role, Value 14. Links
    7.A / 1.1.1-meaningful-image-name

    1.1.1 Non-text Content

    4.1.2 Name, Role, Value

    6. Images
    7.B / 1.1.1-decorative-image

    1.1.1 Non-text Content

    4.1.2 Name, Role, Value

    6. Images
    7.C / 1.1.1-decorative-background-image

    1.1.1 Non-text Content

    4.1.2 Name, Role, Value

    6. Images
    7.D / 1.1.1-captcha-alternative

    1.1.1 Non-text Content

    4.1.2 Name, Role, Value

    6. Images
    7.E / 1.4.5-image-of-text 1.4.5 Images of Text 6. Images
    8.A / 2.2.1-timing-adjustable 2.2.1 Timing Adjustable 21. Timed Events
    9.A / 2.4.1-bypass-function 2.4.1 Bypass Blocks 4. Repetitive Content
    9.B / 3.2.3-consistent-navigation 3.2.3 Consistent Navigation 4. Repetitive Content
    9.C / 3.2.4-consistent-identification 3.2.4 Consistent Identification 4. Repetitive Content
    10.A / 2.4.6-heading-purpose 2.4.6 Headings and Labels 13. Content Structure
    10.B / 1.3.1-heading-determinable 1.3.1 Info and Relationships 13. Content Structure
    10.C / 1.3.1-heading-level 1.3.1 Info and Relationships 13. Content Structure
    10.D / 1.3.1-list-type 1.3.1 Info and Relationships 13. Content Structure
    11.A / 3.1.1-page-language-defined 3.1.1 Language of Page 15. Language
    11.B / 3.1.2-part-language-defined 3.1.2 Language of Parts 15. Language
    12.A / 2.4.2-one-page-title-defined 2.4.2 Page Titled 11. Page Titles
    12.B / 2.4.2-page-title-purpose 2.4.2 Page Titled 11. Page Titles
    12.C / 4.1.2-frame-title 4.1.2 Name, Role, Value 19. Frames and iFrames
    12.D / 4.1.2-iframe-name 4.1.2 Name, Role, Value 19. Frames and iFrames
    13.A / 1.4.1-color-meaning 1.4.1 Use of Color 7. Sensory Characteristics
    13.B / 1.3.3-sensory-info 1.3.3 Sensory Characteristics 7. Sensory Characteristics
    13.C / 1.4.3-contrast 1.4.3 Contrast (Minimum) 8. Contrast
    14.A / 1.3.1-table-identification 1.3.1 Info and Relationships 12. Tables
    14.B / 1.3.1-cell-header-association 1.3.1 Info and Relationships 12. Tables
    14.C / 1.3.1-layout-table-structure 1.3.1 Info and Relationships 12. Tables
    15.A / 1.3.1-meaningful-content-css-before-after 1.3.1 Info and Relationships 18. Stylesheet Non-dependence
    15.B / 1.3.2-content-order-CSS-position 1.3.2 Meaningful Sequence 18. Stylesheet Non-dependence
    15.C / 1.3.2-content-meaning-CSS-position 1.3.2 Meaningful Sequence 18. Stylesheet Non-dependence
    16.A / 1.2.1-audio-transcript-text 1.2.1 Audio-only and Video-only 16. Audio-Only and Video-Only
    16.B / 1.2.1-video-alternative-equivalent 1.2.1 Audio-only and Video-only 16. Audio-Only and Video-Only
    17.A / 1.2.2-captions-equivalent 1.2.2 Captions (Prerecorded) 17. Synchronized Media
    17.B / 1.2.5-audio-description-equivalent 1.2.5 Audio Description (Prerecorded) 17. Synchronized Media
    17.C / 1.2.4-captions-live-equivalent 1.2.4 Captions (Live) 17. Synchronized Media
    17.D / 503.4-caption-description-controls 503.4 User Controls for Captions and Audio Description 17. Synchronized Media
    17.E / 503.4.1-caption-control 503.4.1 Caption Controls 17. Synchronized Media
    17.F / 503.4.2-description-control 503.4.2 Audio Description Controls 17. Synchronized Media
    18.A / 1.4.4-resize-text 1.4.4 Resize text 22. Resize Text
    19.A / 2.4.5-multiple-ways 2.4.5 Multiple Ways 23. Multiple Ways

    Section 508/WCAG Requirement to Trusted Tester Test and Baseline Test (cross-reference table)

    Section 508 / WCAG Requirement Test ID / Test Name Baseline Test
    1.1.1 Non-text Content

    7.A / 1.1.1-meaningful-image-name

    7.B / 1.1.1-decorative-image

    7.C / 1.1.1-decorative-background-image

    7.D / 1.1.1-captcha-alternative

    6. Images
    1.2.1 Audio-only and Video-only

    16.A / 1.2.1-audio-transcript-text

    16.B / 1.2.1-video-alternative-equivalent

    16. Audio-Only and Video-Only
    1.2.2 Captions (Prerecorded) 17.A / 1.2.2-captions-equivalent 17. Synchronized Media
    1.2.4 Captions (Live) 17.C / 1.2.4-captions-live-equivalent 17. Synchronized Media
    1.2.5 Audio Description (Prerecorded) 17.B / 1.2.5-audio-description-equivalent 17. Synchronized Media
    1.3.1 Info and Relationships

    10.B / 1.3.1-heading-determinable

    10.C / 1.3.1-heading-level

    10.D / 1.3.1-list-type

    14.A / 1.3.1-table-identification

    14.B / 1.3.1-cell-header-association

    14.C / 1.3.1-layout-table-structure

    15.A / 1.3.1-meaningful-content-css-before-after

    5.B / 1.3.1-form-labels-cues

    13. Content Structure

    12. Tables

    18. Stylesheet Non-dependence

    1.3.2 Meaningful Sequence

    15.B / 1.3.2-content-order-CSS-position

    15.C / 1.3.2-content-meaning-CSS-position

    18. Stylesheet Non-dependence
    1.3.3 Sensory Characteristics 13.B / 1.3.3-sensory-info 7. Sensory Characteristics
    1.4.1 Use of Color 13.A / 1.4.1-color-meaning 7. Sensory Characteristics
    1.4.2 Audio Control 2.A / 1.4.2-audio-control 21. Timed Events
    1.4.3 Contrast (Minimum) 13.C / 1.4.3-contrast 8. Contrast
    1.4.4 Resize text 18.A / 1.4.4-resize-text 22. Resize Text
    1.4.5 Images of Text 7.E / 1.4.5-image-of-text 6. Images
    2.1.1 Keyboard

    4.A / 2.1.1-keyboard-access

    4.B / 2.1.1-no-keystroke-timing

    4.D / 2.1.1-title-keyboard-access

    1. Keyboard Access

    2. Focus Visible

    2.1.2 No Keyboard Trap 4.C / 2.1.2-no-keyboard-trap 1. Keyboard Access
    2.2.1 Timing Adjustable 8.A / 2.2.1-timing-adjustable 21. Timed Events
    2.2.2 Pause, Stop, Hide 2.B / 2.2.2-blinking-moving-scrolling 21. Timed Events
    2.C / 2.2.2-auto-updating 21. Timed Events
    2.3.1 Three Flashes or Below Threshold 3.A / 2.3.1-flashing 9. Flashing
    2.4.1 Bypass Blocks 9.A / 2.4.1-bypass-function 4. Repetitive Content
    2.4.2 Page Titled

    12.A / 2.4.2-one-page-title-defined

    12.B / 2.4.2-page-title-purpose

    11. Page Titles
    2.4.3 Focus Order

    4.G / 2.4.3-focus-order-meaning

    4.H / 2.4.3-focus-order-reveal

    4.I / 2.4.3-focus-order-return

    3. Focus Order
    2.4.4 Link Purpose (In Context) 6.A / 2.4.4-link-purpose 14. Links
    2.4.5 Multiple Ways 19.A / 2.4.5-multiple-ways 23. Multiple Ways
    2.4.6 Headings and Labels 10.A / 2.4.6-heading-purpose 13. Content Structure
    2.4.7 Focus Visible 4.E / 2.4.7-focus-visible 2. Focus Visible
    3.1.1 Language of Page 11.A / 3.1.1-page-language-defined 15. Language
    3.1.2 Language of Parts 11.B / 3.1.2-part-language-defined 15. Language
    3.2.1 On Focus 4.F / 3.2.1-on-focus 3. Focus Order
    3.2.2 On Input 5.C / 3.2.2-on-input 10. Forms
    3.2.3 Consistent Navigation 9.B / 3.2.3-consistent-navigation 4. Repetitive Content
    3.2.4 Consistent Identification 9.C / 3.2.4-consistent-identification 4. Repetitive Content
    3.3.1 Error Identification 5.E / 3.3.1-error-identification 10. Forms
    3.3.2 Labels or Instructions 5.A / 3.3.2-input-instructions 10. Forms
    3.3.3 Error Suggestion 5.F / 3.3.3-error-suggestion 10. Forms
    3.3.4 Error Prevention (Legal, Financial, Data) 5.G / 3.3.4-error-prevention 10. Forms
    4.1.1 Parsing 20.A / 4.1.1-parsing 24. Parsing
    4.1.2 Name, Role, Value

    12.C / 4.1.2-frame-title

    12.D / 4.1.2-iframe-name

    2.D / 4.1.2-change-notify-auto

    5.B / 1.3.1-form-labels-cues

    5.D / 4.1.2-change-notify-form

    6.A / 2.4.4-link-purpose

    6.B / 4.1.2-change-notify-links

    7.A / 1.1.1-meaningful-image-name

    7.B / 1.1.1-decorative-image

    7.C / 1.1.1-decorative-background-image

    7.D / 1.1.1-captcha-alternative

    19. Frames and iFrames

    21. Timed Events

    10. Forms

    14. Links

    6. Images

    36 CFR 1194 503.4 User Controls for Captions and Audio Description 17.D / 503.4-caption-description-controls 17. Synchronized Media
    36 CFR 1194 503.4.1 Caption Controls 17.E / 503.4.1-caption-control 17. Synchronized Media
    36 CFR 1194 503.4.2 Audio Description Controls 17.F / 503.4.2-description-control 17. Synchronized Media
    WCAG Conformance Requirement 1. Conformance Level

    1.A / Alt-version-conformant

    1.B / Alt-version-equivalent

    1.C / Alt-version-access

    1.D / Alt-version-nc-access

    20. Alternate Versions
    WCAG Conformance Requirement 5. Non-Interference 1.E / non-interference 25. Non-Interference

    Appendix B: Document Change Log

    Note: Minor punctuation, formatting and spelling changes not included.

    Version 3.06, April 2013

    Original published version.

    Version 3.07, April 2013

    Location Change
    Section 2, Test Environment. Added “If an application automatically enables compatibility view, do not disable it. Make sure compatibility view is disabled before testing another application.”
    Section 2, Platform, browser, testing tools Added Named Anchors Bookmarklet
    Test 1.2.1. Step 4 Added “(Do not include radio buttons and check boxes in this test.)”
    Test Process 8. Step 2c Added “(To avoid accidentally disabling Sticky Keys during testing: Open Ease of Access Center > Set up Sticky Keys > Make the keyboard easier to use > Set up Sticky Keys > uncheck 'Turn off Sticky Keys when two keys are pressed at once.')”
    Test Process 8, Step notes Added to XP instructions “(Settings > uncheck 'Turn off Sticky Keys when two keys are pressed at once)”
    Test Process 13, Step 3a Deleted “and their targets”, replaced with “and #targets”
    Test Process 13, Step 3b Added “For Windows 7, WAT will identify the #targets on the page. For Windows XP, use the Named Anchors Bookmarklet to reveal the #targets on the page.”

    Version 3.08, May 2013

    Location Change
    Test 1.2.2. Step 4 Added “There are many ways to indicate that a field is required. Usually this is visually indicated by a star (*). This information needs to be directly associated with the input component via one of the above methods (adding a title attribute, including the '*' in the label, or through ARIA ‘required=true’).”
    Test 1.2, Failures Added “If there are no scripted elements, mark [22(l)] as Not Applicable.”

    Version 3.1, July 2013

    Location Change
    Section 2, Test Environment Added IE9
    Section 2, Web Accessibility Toolbar Settings

    Added “Some functions do not work on pages within Frames. Open each page in its own window before running WAT tool functions.

    Confirm all WAT popup messages by reviewing WAT markup on page.”

    Section 2, Platform, browser, testing tools Added Frames favelet
    Section 2: Compliance Tests Bullets changed to alphabetized items (Test ID) for each Failure
    Test 1.2.1, Failure C Added “interactive”
    Test 9.1, Failure A Added “correct default”
    Test 9.1, Failure B Added “correctly”
    Test Process 12, Step 1 Added “If WAT does not work correctly, use the Frames favelet.”

    Version 3.1.1, July 2013

    Location Change
    Test 1.2.2, Step 7 & Failure F Added test step and associated failure for a web form control not identifying its purpose.

    Version 3.2, August 2013

    Location Change
    Test 1.1, Step 2c Corrected cross-reference to TITLE (from 1d to 1e)
    Test 1.2.2, Step 2b. Moved “Fieldset and Legend may be used for grouping and associating two instructions to one input field.” (from 2e to 2b)
    Test 1.2.3, Step 3 Changed “Title information is displayed in the third column.” to “The third column displays the TITLE attribute of the <a> link. (To check for additional TITLE attributes, use WAT (Doc Info – Show Titles).”
    Test 1.2.3., Failure B Added “unique and”
    Test 2.1, Step 4 Added Sorted A to Z “or similar”
    Test 2.5, Note on Steps Corrected cross reference 2.3 to 2.2
    Test Process 5, Step 1 Added “The minimum requirement is the Web site and page description or Software application name and screen description.”
    Test Process 6, Step 1 Deleted “Use the WAT (Doc Info - List of Multimedia files) to find multimedia files.”
    Test Process 8, Added “Note: The navigation to access these features may differ depending on OS updates.”
    Test Process 8 Added more details to Steps 2c, 2d, 2e, 5c, and 5e test instructions.
    Test Process 8, Failure B Added “OS”
    Test Process 11, Failure D Added “or other confusing elements are”
    Test Process 12, Step 1 Added “Descriptions should be in plain language.”
    Test Process 13, Step 2 and Failure D Deleted “software screens”
    Section 3: All Tests Deleted all instructions for entering Compliant and Not Applicable results. Replaced with tables detailing how to enter results.

    Version 3.2.1, December 2013

    Location Change
    Section 3 and multiple

    Added naming convention for Section 3: Test Process

    Test Process 1 – Interactive Interface Elements (main category)

    Test 1.1 Keyboard Access (sub-category)

    Step 1a (instructions)

    Test ID 1.1.A (failure condition)

    Test 2.2, Step 1c Added “If ARIA is used to describe an image, determine if the text description accurately describes the image’s purpose and/or function. The ARIA attribute may contain the text description (aria-label) or reference text on the page.” Deleted use of Inspect from this test.
    Test 2.3 Corrected reference to Sections 2.1, 2.2 in Related Requirements
    Test 3.2, Step 1 Test is for meaningful text and images of meaningful text only. Deleted “Visually examine meaningful text and images of meaningful text on the page for areas that may have low background to foreground contrast”
    Test 4, Step 1 Corrected reference to “Section 2.3 (Video-only and animation)”

    Version 3.3, February 2014

    Location Change
    Web Accessibility Toolbar: Installation advice Changed URL to point to September 2012 version.
    Web Accessibility Toolbar: Current version Added “Note: This is the current version for use in this Test Process. Later versions of WAT available from the official site (http://wat-c.org/tools/) may not function properly on DHS workstations.”
    Which Testing Tool should I use? Added “If it opens in a browser and WAT does not provide markup on form fields or images, check if the ARIA favelet marks these elements.”
    Test 1.1, Step 1d Added “Hidden and disabled form fields are not interactive and do not require keyboard access.”
    Test 1.1, Step 2a Added “Visual focus is an indication of the keyboard focus location. Only keyboard accessible elements need to be analyzed for visual focus.”
    Test 1.1, Step 2d Added “Evaluate the existing focus order only; do not consider elements that do not receive keyboard focus.”
    Test 1.1, Step 2e Added “In IE8, visual focus is lost on frames. This is a failure.”
    Test 1.2.1, Step 1 Separated user controls (buttons and menus) from form field elements
    Test 1.2.2, Step 3a Added “ARIA (Accessible Rich Internet Applications) defines a method that specifies how to increase the accessibility of web pages, in particular, dynamic content and user interface components.”
    Test IDs 2.2.B, 2.2.C, 2.2.D Added to guidance “with ALT”
    Test 2.4 Corrected reference to Sections 2.1, 2.2 in Related Requirements
    Test 3.2, Step 1 Added “Include all appearances of text including changes due to mouse hover and status.”
    Test 9.2, Steps 2a and 2b Evaluate programmatic headings only.
    Test ID 9.2.B Removed “[NC] if 9.2.A is NC”
    Test Process 10, step 6 Added “with the cell data (through TITLE for example)”
    Test IDs 10.A and 10.C Added to DNA guidance “or only images of [complex] data tables.”

    Version 3.4, December 2014

    Location Change
    Section 2: Test Environment Added F12 developer tools instructions for standard view (“Another method to configure…”)
    Section 2: Test Environment, all tools Changed URL to Download from URL.
    Test 1.2.2, Failure B Added “[DNA] for a form field that fails 1.2.2.A.”
    Test 1.2.3, Step 1 Added “Framed content may need to be tested outside of the frame for this and other WAT tools. Use WAT (Frames – Navigate to framed documents) and then run the link test.”
    Test 2.2, Results Added to note: “(Where ALT is indicated, TITLE is also accepted.)”
    Test 2.2, Failure A Added to NC “(Do not evaluate an NC image for B, C, or D.)”
    Test 2.2, Failure B,C, & D Added to DNA: “with ALT or ARIA”
    Test 3.2, Failure A Deleted “Test ID always applies. [DNA] is not an acceptable result”. Replaced with “[DNA] if there is no text.”
    Test 5, Step 1 Added “when the page initially loads or after a page refresh (F5).”
    Test 13, Failure A Added “/link” to clarify “if there is no method/link…”
    Test 13, Failure B

    Deleted “[NC] if the target of a skip function is not located after the repetitive content”. Replaced with “[NC] if there is no skip target.”

    Deleted “[C] if the target of all skip functions is after the repetitive content”. Replaced with “[C] if there is a target for all skip links”

    Test 13, Failure C Added to DNA text “[DNA] if 13.A or 13.B is NC.”
    Test 13, Failure C Deleted “[NC] if the skip function does not work properly.” Replaced with “[NC] if the skip link and target exist but the function does not work properly.”

    Version 3.4.1, May 2015

    Location Change
    Section 2: Test Environment Added IE11 as an acceptable browser
    Section 2: Test Environment Updated Inspect installation instructions

    Version 4.0, April 2017

    Location Change
    Title page Changed name of document from “DHS Section 508 Compliance Test Process …” to “Trusted Tester Section 508 Compliance Test Process …”
    Throughout

    Added notes regarding testing of Java, Inspect, and ARIA:

    • Web applications with Flash or embedded Java content should be tested in IE11 to determine the accessibility of the coded content.

    • Removed “DHS” where applicable.

    Section 2: Test Environment Added Windows 8.1 and 10 as supported OSes for testing. Added Firefox and Chrome as supported browsers for testing. Removed support for Windows XP and IE versions 8, 9. Added note that IE11 is most accessible test environment.
    Section 2: Test Environment Updated configuration instructions for Inspect and Ferret. Added clarification regarding categorization of “software”. Added sections for WAF and Colour Contrast Analyzer to Tools section. Moved installation instructions to tools installation document. Removed Named Anchor Bookmarklet, since XP is no longer supported.
    Section 1, Issues that are Not Covered in this Test Process Removed two examples from “Issues that are not covered in this test process”
    Section 2, Platforms, Browsers, Testing Tools Rearranged content.
    Section 2, Testing Tools Moved tools installation and settings information to Trusted Tester Test Environment Installation and Configuration Guide
    Section 2, Testing Tools Added WAT version 2015. Added Thatcher’s Skip Links favelet.
    Various Removed support for Windows XP and IE versions 8, 9
    Test 1.2.1 (SW interactive elements) Added instructions to use mouse for keyboard-inaccessible elements. Added instructions to check for required field indicators.
    Test 1.2.2 (Forms) Added information about checking HTML version and ID naming in HTML5. Changed test for ARIA elements to use ARIA favelet instead of Inspect. Added check for “required=true”.
    Test 3.2 (Color Contrast)

    Added clarification that contrast ratio of 4.5:1 is the minimum.

    Added exemption for incidental text.

    Results A: Specified that tests apply to non-incidental text only.

    Applicable Baseline Requirements: Clarification of minimum ratio.

    Test 7 (Timeouts) Added reminder that web applications with Flash or embedded Java content must be tested in IE11.
    Test 8 (Built-in Accessibility Features)

    Updated instructions to use [Windows key + U] keyboard shortcut for opening Ease of Access center for more consistency. Updated instructions to include testing in Windows 8.1 and Windows 10.

    Added reminder that web applications with Flash or embedded Java content must be tested in IE11.

    Updated instructions for triggering sound. Removed Windows XP information.

    Test 12 (Web Frames)

    Added note for testing content within a frame and possible use of the Navigate to Frames function applies to WAT only (not WAF).

    Added notes for WAF testing of frames.

    Test 13 (Repetitive Content)

    Intro paragraph: added “methods such as” to clarify that internal links are just one example of how skip functions may be provided.

    Note added: F6 does not navigate frames in Chrome, which does not have a standard frame navigation keystroke.

    Removed instructions for Windows XP (use of Named Anchors Bookmarklet).

    Added: If the Skip Link tool does not show skip targets in the expected locations, determine the skip target location with the subsequent steps in the testing process (refresh the page, tab to the skip function, activate it using Enter key).

    Added instructions for if no skip links or targets are revealed.

    Added instructions for checking where the focus moves after activating the skip function.

    Version 5.0, November 2018

    Version 5.0 is a wholesale revision of the test process and supporting content. The primary change is a significant reworking of the test processes to address the Section 508 Refresh and the incorporation of the WCAG 2.0 Level A and Level AA Success Criteria. The document also includes a number of other changes to improve its organization and flow, the test condition logic, and the readability of the test process instructions. These changes include:

    • General alignment of Test Condition construction with the draft W3C Accessibility Conformance Testing (ACT) Task Force Test Rules Format.

    • Removed references to Failure Conditions and reconstructed as Test Conditions with positive pass/fail construction to eliminate double negatives.

    • More straightforward Test Conditions and explanation for when tests do not apply. The revised test process includes conditions for identifying content to test. If those conditions are not met, the test does not apply.

    • Test sections re-grouped and/or renamed to use terms that are not mutually exclusive.

    • Clearer alignment of “How to Test” instructions to each Test Condition.

    • Addition of unique and descriptive test names to help identify the Test Condition without having to look up the Test Condition by Test ID.

    Finally, the test process has adopted a new testing tool: the Accessible Name and Description Inspector (ANDI). ANDI replaces all of the necessary functionalities previously provided by the Web Accessibility Favelets (WAF). ANDI also adds a significant number of new functions and features to facilitate the Trusted Tester process.

    Appendix C: Test Process Quick Reference

    Quick Reference with Test Conditions

    Test ID Test Name Test Condition
    1.A alt-version-conformant An alternate version passes all applicable Test Conditions in this test process.
    1.B alt-version-equivalent The accessible version is up to date with the same information and functionality.
    1.C alt-version-access The mechanism to reach the accessible equivalent version from the non-conforming page is accessible.
    1.D alt-version-nc-access The non-conforming version(s) can only be reached from conforming content.
    1.E non-interference Content in the non-conforming version(s) does not interfere with the user’s ability to access or use the conforming content.
    2.A 1.4.2-audio-control The user can pause, stop, or control the volume of audio content that plays automatically.
    2.B 2.2.2-blinking-moving-scrolling The user can pause, stop, or hide moving, blinking, or scrolling content.
    2.C 2.2.2-auto-updating The user can pause, stop, hide, or control the frequency of automatically updating content.
    2.D 4.1.2-change-notify-auto The page provides notification of each automatic update/change in content.
    3.A 2.3.1-flashing If NO flashing content is found, then this Test Condition DOES NOT APPLY (DNA). If flashing content IS found, then this test should be recorded as NOT TESTED.
    4.A 2.1.1-keyboard-access All functionality can be accessed and executed using only the keyboard.
    4.B 2.1.1-no-keystroke-timing Individual keystrokes do not require specific timings for activation of functionality.
    4.C 2.1.2-no-keyboard-trap There is no keyboard trap.
    4.D 2.1.1-title-keyboard-access Title attribute information that is essential or required to complete an activity can be accessed using only the keyboard.
    4.E 2.4.7-focus-visible A visible indication of focus is provided when focus is on the interface component.
    4.F 3.2.1-on-focus When an interface component receives focus, it does not initiate an unexpected change of context.
    4.G 2.4.3-focus-order-meaning The focus order preserves the meaning and operability of the web page.
    4.H 2.4.3-focus-order-reveal Focus is moved to revealed content.
    4.I 2.4.3-focus-order-return Focus is returned to the logical sequence.
    5.A 3.3.2-input-instructions Labels and instructions for each form input inform users what input data is expected and, if applicable, what format is required.
    5.B 1.3.1-form-labels-cues The combination of the accessible name, accessible description, and other programmatic associations (e.g., table column and/or row associations) describes each input field and includes all relevant instructions and cues (textual and graphical).
    5.C 3.2.2-on-input Changing field values/selections (e.g., entering data in a text field, changing a radio button selection) does NOT initiate an unexpected change of context.
    5.D 4.1.2-change-notify-form The page provides notification of each form-related change in content.
    5.E 3.3.1-error-identification The item in error is identified and the error is described to the user in text.
    5.F 3.3.3-error-suggestion Additional guidance (e.g., suggestion for corrected input) is provided about how to correct errors for form fields.
    5.G 3.3.4-error-prevention The web page allows the user to check, reverse, and/or confirm submission.
    6.A 2.4.4-link-purpose The purpose of each link or button can be determined from any combination of the link/button text, accessible name, accessible description, and/or programmatically determined link/button context.
    6.B 4.1.2-change-notify-links The page provides notification of each change in content that is the result of interaction with a link or button.
    7.A 1.1.1-meaningful-image-name The accessible name and accessible description for a meaningful image provide an equivalent description of the image.
    7.B 1.1.1-decorative-image There is no accessible name or accessible description for a decorative image.
    7.C 1.1.1-decorative-background-image The background image is not the only means used to convey important information.
    7.D 1.1.1-captcha-alternative Alternative forms of CAPTCHA are provided.
    7.E 1.4.5-image-of-text The image of text cannot be replaced by text or is customizable.
    8.A 2.2.1-timing-adjustable The user can turn off, adjust, or extend the time limit.
    9.A 2.4.1-bypass-function A keyboard-accessible method is provided to bypass repetitive content.
    9.B 3.2.3-consistent-navigation Each navigational element occurs in the same relative order with regard to other repeated components on each web page where it appears.
    9.C 3.2.4-consistent-identification The accessible name and description are consistent for components that perform the same function.
    10.A 2.4.6-heading-purpose Each heading describes the topic or purpose of its content.
    10.B 1.3.1-heading-determinable Each programmatically determinable heading is a visual heading and each visual heading is programmatically determinable.
    10.C 1.3.1-heading-level Programmatic heading levels logically match the visual heading presentation within the heading structure.
    10.D 1.3.1-list-type All visually apparent lists are programmatically identified according to their type.
    11.A 3.1.1-page-language-defined The default human language of each web page can be programmatically determined.
    11.B 3.1.2-part-language-defined The human language for any content segment that differs from the default human language of the page can be programmatically determined.
    12.A 2.4.2-one-page-title-defined One <title> element is defined for the web page.
    12.B 2.4.2-page-title-purpose The <title> element identifies the contents or purpose of the web page.
    12.C 4.1.2-frame-title Each <frame> has a title attribute that describes its content.
    12.D 4.1.2-iframe-name The combination of accessible name and description for each <iframe> describes its content.
    13.A 1.4.1-color-meaning Color is not used as the only visual means of conveying information, indicating an action, prompting a response, or distinguishing a visual element.
    13.B 1.3.3-sensory-info Instructions provided for understanding and operating content do not rely solely on sensory characteristics of components, such as shape, size, visual location, orientation, or sound.
    13.C 1.4.3-contrast The visual presentation of text and images of text has sufficient contrast.
    14.A 1.3.1-table-identification Each data table has programmatic markup to identify it as a table.
    14.B 1.3.1-cell-header-association All data cells are programmatically associated with relevant headers.
    14.C 1.3.1-layout-table-structure The layout table is NOT designated as a table using ARIA role="table" AND DOES NOT include table header structure and relationship elements and/or associated attributes.
    15.A 1.3.1-meaningful-content-css-before-after For the meaningful content provided via CSS pseudo-elements ::before and ::after, equivalent information is available in another way.
    15.B 1.3.2-content-order-CSS-position The reading order of the content (in context) is correct without CSS positioning.
    15.C 1.3.2-content-meaning-CSS-position The meaning of the content (in context) is preserved without CSS positioning.
    16.A 1.2.1-audio-transcript-text A text-based alternative is provided for audio-only content that provides an accurate and complete representation of the audio-only content.
    16.B 1.2.1-video-alternative-equivalent The video-only content information is also available through an equivalent text or audio alternative.
    17.A 1.2.2-captions-equivalent The multimedia provides accurate captions for the audio content.
    17.B 1.2.5-audio-description-equivalent The multimedia provides an equivalent soundtrack (combination of narration and audio descriptions) for the video content.
    17.C 1.2.4-captions-live-equivalent The live multimedia provides accurate captions for the audio content.
    17.D 503.4-caption-description-controls The media player provides user controls for closed captions and audio descriptions.
    17.E 503.4.1-caption-control User controls for captions are provided at the same menu level as the user controls for volume or program selection.
    17.F 503.4.2-description-control User controls for audio descriptions are provided at the same menu level as the user controls for volume or program selection.
    18.A 1.4.4-resize-text There is a mechanism to resize, scale, or zoom in on the text to at least 200% of its original size without loss of content or functionality.
    19.A 2.4.5-multiple-ways There are two or more ways to locate a web page within a set of web pages.
    20.A 4.1.1-parsing This test should be recorded as NOT TESTED.
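
    For illustration only (this fragment is not part of the test process), the hypothetical markup below shows the kinds of programmatic features that several of the Test Conditions above evaluate: a defined default page language (11.A), a single descriptive <title> (12.A, 12.B), an <iframe> whose accessible name describes its content (12.D), a meaningful image with an equivalent accessible name and a decorative image with none (7.A, 7.B), and a form input whose label states the expected data and format (5.A). All element content and attribute values are assumptions.

        <!-- Hypothetical page fragment; names, text, and URLs are illustrative assumptions. -->
        <html lang="en">                                            <!-- 11.A: default page language defined -->
        <head>
          <title>Benefits Application - Contact Details</title>    <!-- 12.A, 12.B: one descriptive title -->
        </head>
        <body>
          <iframe src="status.html" title="Application status summary"></iframe> <!-- 12.D: iframe name describes content -->
          <img src="alert.png" alt="Warning: unsaved changes">     <!-- 7.A: meaningful image has an equivalent name -->
          <img src="divider.png" alt="">                           <!-- 7.B: decorative image has no accessible name -->
          <label for="dob">Date of birth (MM/DD/YYYY)</label>      <!-- 5.A: label states expected input and format -->
          <input type="text" id="dob" name="dob">
        </body>
        </html>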

    One-Page Quick Reference – Test Names Only

    Test ID Test Name
    1.A alt-version-conformant
    1.B alt-version-equivalent
    1.C alt-version-access
    1.D alt-version-nc-access
    1.E non-interference
    2.A 1.4.2-audio-control
    2.B 2.2.2-blinking-moving-scrolling
    2.C 2.2.2-auto-updating
    2.D 4.1.2-change-notify-auto
    3.A 2.3.1-flashing
    4.A 2.1.1-keyboard-access
    4.B 2.1.1-no-keystroke-timing
    4.C 2.1.2-no-keyboard-trap
    4.D 2.1.1-title-keyboard-access
    4.E 2.4.7-focus-visible
    4.F 3.2.1-on-focus
    4.G 2.4.3-focus-order-meaning
    4.H 2.4.3-focus-order-reveal
    4.I 2.4.3-focus-order-return
    5.A 3.3.2-input-instructions
    5.B 1.3.1-form-labels-cues
    5.C 3.2.2-on-input
    5.D 4.1.2-change-notify-form
    5.E 3.3.1-error-identification
    5.F 3.3.3-error-suggestion
    5.G 3.3.4-error-prevention
    6.A 2.4.4-link-purpose
    6.B 4.1.2-change-notify-links
    7.A 1.1.1-meaningful-image-name
    7.B 1.1.1-decorative-image
    7.C 1.1.1-decorative-background-image
    7.D 1.1.1-captcha-alternative
    7.E 1.4.5-image-of-text
    8.A 2.2.1-timing-adjustable
    9.A 2.4.1-bypass-function
    9.B 3.2.3-consistent-navigation
    9.C 3.2.4-consistent-identification
    10.A 2.4.6-heading-purpose
    10.B 1.3.1-heading-determinable
    10.C 1.3.1-heading-level
    10.D 1.3.1-list-type
    11.A 3.1.1-page-language-defined
    11.B 3.1.2-part-language-defined
    12.A 2.4.2-one-page-title-defined
    12.B 2.4.2-page-title-purpose
    12.C 4.1.2-frame-title
    12.D 4.1.2-iframe-name
    13.A 1.4.1-color-meaning
    13.B 1.3.3-sensory-info
    13.C 1.4.3-contrast
    14.A 1.3.1-table-identification
    14.B 1.3.1-cell-header-association
    14.C 1.3.1-layout-table-structure
    15.A 1.3.1-meaningful-content-css-before-after
    15.B 1.3.2-content-order-CSS-position
    15.C 1.3.2-content-meaning-CSS-position
    16.A 1.2.1-audio-transcript-text
    16.B 1.2.1-video-alternative-equivalent
    17.A 1.2.2-captions-equivalent
    17.B 1.2.5-audio-description-equivalent
    17.C 1.2.4-captions-live-equivalent
    17.D 503.4-caption-description-controls
    17.E 503.4.1-caption-control
    17.F 503.4.2-description-control
    18.A 1.4.4-resize-text
    19.A 2.4.5-multiple-ways
    20.A 4.1.1-parsing