Contents

DOCUMENT OVERVIEW

I. PRIORITIZATION

Identify Systems

Score Systems

Build Inventory

Rank & Prioritize

II. PLANNING

Define Scope

Understand Standards & Law

Identify Resources

Participants & Skill Sets

Sample Set for Assessment

Additional Funding or Resources

Follow Best Practices

Level of Effort

Third party accessibility assessment services

Logging Defects & Deficiencies

Communication Plan

III. ASSESSMENT

Methods & Procedures

Automated Accessibility Scans

Manual Assessment

Assessment Procedures

Assessment Setup and Support

Facilities and Equipment

Consent Form

Introduction to a Testing Session

Questionnaire on Computer Use, System and Internet Experience

Debriefing Questions

Metrics & Reporting

Assessment Results

IV. REMEDIATION

Evaluating Incidents or Defects

Prioritization

Mitigation

V. Resources

Access IT California: Online Resources

Laws and Policies

Accessibility Standards

Useful Links

Useful References

DOCUMENT OVERVIEW

The mission of the Access Information Technology (IT) California project is to provide effective education materials that will allow state departments to easily adopt key accessibility concepts and maximize opportunities for everyone by providing the best possible service and experience to the people of California.

Making a department’s inventory of existing systems usable and accessible can be challenging. To simplify the assessment and remediation process for California State departments, the Access IT California project team created the Accessibility Compliance and Remediation Methodology (ACRM). This document details a strategy and plan of action to address accessibility in existing systems using a four-phased approach: Prioritize, Plan, Assess, and Remediate.

ACRM aims to provide a repeatable process that will help departments:

  • Review their inventory of systems
  • Prioritize systems for usability and accessibility assessments
  • Plan and perform usability and accessibility assessments
  • Outline a remediation plan for systems
  • Incorporate accessibility into ongoing IT operations and maintenance
  • Comply with State of California accessibility statutes and policies (State Administrative Manual (SAM) 4833; California Government Code Section 7405)

I. PRIORITIZATION

Addressing accessibility for a system inventory can be intimidating. With limited resources and a large inventory, it can be hard to determine which system(s) to evaluate first. To help departments identify which systems should be evaluated first, the Access IT team developed the Accessibility Exposure & Impact Score (AEIS).

AEIS is a technology-agnostic evaluation that focuses on the purpose of a system, the services that it provides to citizens, and the impact on work performed by State staff. AEIS also takes modernization efforts and State/Federal mandates into consideration. By focusing on these factors, AEIS highlights systems that have the largest impact on the population.

To provide departments with a quick and easy way to prioritize their existing systems for accessibility and usability testing, the Access IT team created the AEIS scoresheet. Using the scoresheet, department staff answer a short set of “Yes/No” questions. After all the questions are answered, a score (out of 100) signifying the impact significance of the system is provided. A higher score means that the system has a greater risk for accessibility issues. A sorted listing of the scores will identify which systems should be evaluated first.
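The scoring flow can be sketched as a weighted sum of “Yes” answers. The questions and point values below are hypothetical placeholders for illustration only; the real AEIS scoresheet defines its own ten questions and weights.

```python
# Illustrative sketch of the AEIS scoring flow. The questions and
# weights are hypothetical placeholders, not the actual AEIS content.

ILLUSTRATIVE_WEIGHTS = {
    "serves_public": 20,          # Does the system serve the public directly?
    "used_by_staff_daily": 15,    # Do State staff rely on it daily?
    "mandated": 15,               # Is it covered by a State/Federal mandate?
    "modernization_planned": 10,  # Is a modernization effort underway?
    "never_assessed": 20,         # Has it never been tested for accessibility?
    "high_traffic": 20,           # Is it a high-traffic system?
}  # weights sum to 100

def aeis_score(answers):
    """Sum the weights of every 'Yes' answer into a score out of 100."""
    return sum(ILLUSTRATIVE_WEIGHTS[q] for q, yes in answers.items() if yes)

def rank_systems(inventory):
    """Sort (name, answers) pairs by score, highest first."""
    scored = [(name, aeis_score(ans)) for name, ans in inventory.items()]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)
```

Sorting the scored inventory in descending order reproduces the prioritized listing described above: the systems at the top of the list are evaluated first.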

In addition to providing departments with an easy form for evaluating a systems inventory, the AEIS scoresheet provides tools that can be used to compare systems and justify an assessment effort. After all questions are answered, a “justification” form is generated alongside the score. The completed form gives stakeholders and executive staff a quick explanation of the system, its services, and why it should be prioritized for an assessment.

AEIS was developed using Microsoft Excel, and two versions are available. The manual version is a simple, formula-based spreadsheet that provides departments with the Scorecard and Justification form. The automated version utilizes macros to allow users to score multiple applications in the same file and performs additional text- and graph-based analysis on the calculated data.

To simplify the prioritization process, departments are recommended to use the automated version of AEIS, as it contains a System Inventory and a graph-based Analysis worksheet. To gain access to all the functionality that AEIS offers, users will need to enable macros. If that option is not available, users will need to use the manual version of AEIS. The manual version does not require macros, but it only provides users with the Scorecard, Text Analysis, and Justification sheets.

Identify Systems

The first step in evaluating the department’s inventory is to identify which systems will be scored using AEIS. Departments should start by compiling a list of systems that are actively used. For each of these systems, the team that will be scoring the systems should know basic information about the system, including: service(s) the system provides, what it is used for, who the users are, whether it has been tested for accessibility in the past, and if there is a plan for the system to be retired.

Score Systems

The Scorecard is an interface where users will answer ten questions about the system being evaluated. After all questions are answered, the system will be assessed a score out of 100 points.

If using the Automated version of AEIS, additional buttons will be available that will allow users to clear/reset the form (“Clear Form”) or save the questionnaire results and add them to the internal system inventory (“Add to Inventory”).

Scorecard is included in both the automated and manual versions of AEIS.

Build Inventory

The Inventory is where information on each of the scored systems is captured. In addition to tracking the categorical scoring breakdown for each of the systems evaluated, users can also adjust the output for the “Analysis” and “Text Analysis” tabs by changing the number in the green cell (F3) then clicking the “Refresh Data” button.

Entries in the inventory may also be manually added or removed. However, once changes are made, the “Refresh Data” button needs to be clicked to reset the form.

Inventory is only included in the automated version of AEIS.

Rank & Prioritize

The inventory Analysis worksheet provides a bar chart and table listing the highest scored systems and their categorical scoring breakdown. Systems are sorted by their overall AEIS score in descending order. The number of systems displayed is based on the count defined on the Inventory tab. For information on how to adjust the output, refer to the Build Inventory section of the ACRM.

Analysis is only included in the automated version of AEIS.

Like the output displayed in the Analysis worksheet, the Text Analysis sheet displays a written description of the AEIS scoring breakdown (including total and categorical scores) but only uses text. This functionality is included in both the automated and manual versions of AEIS, but differs slightly depending on the version being used.

Text Analysis is included in both the automated and manual versions of AEIS.

In the automated version, AEIS will output information about all of the highest-scored systems. The number displayed is defined on the Inventory worksheet. For information on how to adjust the output, refer to the Build Inventory section of the ACRM. The manual version of AEIS will only include information on the single system that is scored.

The Justification form was created so that staff evaluating their systems can build a justification for executive management explaining why an accessibility assessment should be performed on the system. This allows management and executives to review a one-page document that summarizes the services the system provides, why an assessment should be prioritized, and what the impact would be if the system is not accessible.

By default, the Justification form will include the system name and AEIS scoring breakdown. The form also provides blank fields for information not captured in AEIS that will be useful in determining whether an accessibility assessment of the system should be performed.

Justification is included in both the automated and manual versions of AEIS.

A form is generated for each system evaluated. In the automated version, a separate worksheet is created for each system scored; in the manual version, there is a dedicated tab.

II. PLANNING

It’s important to plan effectively prior to starting an accessibility assessment. The assessment planning documents will help determine the tasks and work needed, how to approach and implement the work practically, what standards to use, the number of resources required, and how to communicate the results.

Determine the overall purpose before structuring the assessment. The purpose influences every other aspect of the assessment. For example, the purpose may be to address accessibility concerns with existing systems proactively or it could be in response to and prepare for an upcoming audit.

Follow these steps to effectively plan for the accessibility assessment:

  • Define Scope – Indicate what will be included and excluded in the assessment.
  • Choose Standard – Identify the accessibility standards to use for the assessment.
  • Identify Resources – Determine the resources needed for assessment.
  • Participants & Skill Sets – Define the project team, identify skill sets and roles.
  • Sample – Determine the sample size and how to obtain the sample.
  • Follow Best Practices
  • Determine Level of effort – Assess the level of effort for the assessment.
  • Define Error Logging – Define information needed to effectively document defects & deficiencies identified during assessment.
  • Communication Plan – Document communication strategy to effectively communicate the results of assessment.
  • Request Additional Resources – Determine whether additional funds and/or resources are needed for the assessment effort.

Define Scope

Document what to include and/or exclude from the assessment. To better focus the assessment, it is important to have a thorough understanding of the business process that the system supports. Define the boundaries of the assessment and consider:

  • Stakeholder needs
  • Essential system functions
  • Critical paths and/or frequent use cases

Understand Standards & Law

Section 508 of the Rehabilitation Act of 1973 (29 U.S.C. § 794(d)) – Requires Federal agencies to make their electronic and information technology (EIT) accessible to people with disabilities. This law applies when they develop, procure, maintain, or use electronic and information technology.

  • The standards for this law were refreshed in 2017 (Standards Refresh and Final Rule); the rule references the Level A and Level AA Success Criteria and Conformance Requirements of WCAG 2.0.

California Government Code 7405 – The code content was previously included in Section 11135. It requires all state agencies and departments to comply with Section 508, and it additionally requires any “entity that contracts with a state or local entity” to respond to and resolve any complaint raised because of the implementation of products or services.

California State Administrative Manual Section 4833 – Information Technology Accessibility Policy also requires state agencies and departments to comply with Section 508.

Statewide Information Management Manual (SIMM) – Section 25, IT Accessibility Resource Guide – Was updated in July of 2016 through a joint effort of the California Department of Technology, Government Operations Agency, Health and Human Services Agency, and the Department of Rehabilitation. Section 25 aligns with Web Content Accessibility Guidelines (WCAG) 2.0 Level AA conformance in addition to the requirements of Section 508. This resource provides information to state entities on meeting requirements for accessible web, information technology (IT) projects, and digital content creation.

Web Content Accessibility Guidelines 2.0 – Additionally, it is recommended that departments follow the Web Content Accessibility Guidelines (WCAG) 2.0 with Level AA conformance. WCAG 2.0 is the internationally recognized standard and set of guidelines for web accessibility. By following the guidelines in WCAG 2.0, departments will also meet all accessibility requirements listed in Section 508.

Identify Resources

Participants & Skill Sets

Define the project team and determine roles. Identify participants: Developers, Enterprise Architects, Application Managers, Business Analysts, Testers and Subject Matter Experts (SME).

Assessing IT systems requires a deep understanding of accessible design principles and real-world accessibility issues. For departments with limited expertise, it is highly recommended to grow competencies within the organization to address accessibility. This can be accomplished by working with end users of assistive technology. Other alternatives include training and familiarization with standards and laws. If necessary, assessment services may be outsourced.

For more information, visit the websites for the Department of Rehabilitation (DOR) and the California State University, Northridge Center on Disabilities.

Sample Set for Assessment

Determine the sample needed, the sample size and approach. The samples should include different types of content and functions available on the system. The following should be considered:

  • Frequently accessed pages, based on data collected by your analytics
  • Regularly used pages, which should have specific use cases to ensure thorough testing of system functionality
  • Common and essential documentation and reports
  • Tables, structured content (content with headings and/or sections), forms, images, different user interface controls such as tab interfaces and expanding menus, frames, multimedia (audio or video), and content that moves or changes with time or by user action
  • How different content is tested and evaluated to ensure accessibility. For example, all images should have their “alternative text” descriptions validated; the alternative text should be accurate, succinct, and not redundant.
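One way to seed the alternative-text review described above is a simple automated pass over the sampled pages that flags images for human follow-up. The sketch below uses Python's standard-library HTML parser; note that it flags empty alt attributes too, even though an empty alt is legitimate for purely decorative images, so every flagged item still needs manual judgment.

```python
from html.parser import HTMLParser

class AltTextAuditor(HTMLParser):
    """Collect <img> tags whose alt attribute is missing or empty.

    Flagged images are candidates for manual review, not confirmed
    defects: decorative images may intentionally carry alt="".
    """
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attr_map = dict(attrs)
            alt = attr_map.get("alt")
            if alt is None or not alt.strip():
                self.missing.append(attr_map.get("src", "<no src>"))

def images_missing_alt(html):
    """Return the src of every image lacking a usable alt attribute."""
    auditor = AltTextAuditor()
    auditor.feed(html)
    return auditor.missing
```

A reviewer then checks each flagged image in context and, for images that do have alt text, verifies that the description is accurate, succinct, and not redundant, which no automated check can do.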

Additional Funding or Resources

If it is determined that available resources are not enough to support the assessment and remediation process, departments may want to consider hiring or contracting additional resources. Some departments have found it beneficial to have a dedicated usability and accessibility team to ensure both aspects are continuously addressed for critical systems. If that route is taken, start by assessing the funding needs and determine whether a Budget Change Proposal (BCP) is required. Follow the BCP process to request funding to perform the assessment. Review the guidelines set by the governing agency and/or Department of Finance (DOF) regarding the BCP process.

Follow Best Practices

Level of Effort

To determine the level of effort for a system assessment, consider the accessibility testing standard's requirements, system complexity and size, testing methodology, and knowledge and skill set requirements.

Additional factors to consider when estimating level of effort:

Page counts

Determine the number of pages within the system, then classify the pages as simple, medium, or complex. The use cases can assist in classifying pages. This will determine the number of pages or screens that need to be tested and quantify the scope per system.

Software and Tools

Determine the assistive technologies that will be used during the assessment. The department should use tools that address visual, audio, kinetic, speech, and cognitive impairments. For example: screen readers, speech recognition, captioning, and magnification software.

Departments may also employ automated accessibility testing tools. Automated tools evaluate compliance with accessibility standards but will not guarantee that the system is usable. It is strongly recommended to evaluate conformance levels manually to ensure that the system has a high level of usability.
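To make the limits of automated scanning concrete, the sketch below applies a few machine-checkable rules (a declared page language, a page title, and labels associated with form controls) to a single page. This is a minimal illustration, not a real scanner: production tools implement hundreds of such rules, and even this sketch misses valid labeling techniques such as aria-label or wrapping labels, which is exactly why manual review remains necessary.

```python
from html.parser import HTMLParser

class PageScanner(HTMLParser):
    """Apply a few illustrative machine-checkable rules to one page."""
    def __init__(self):
        super().__init__()
        self.has_lang = False   # <html> declares a language
        self.has_title = False  # page has a <title> element
        self.label_for = set()  # ids referenced by <label for=...>
        self.input_ids = []     # ids of form controls encountered

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "html" and a.get("lang"):
            self.has_lang = True
        elif tag == "title":
            self.has_title = True
        elif tag == "label" and a.get("for"):
            self.label_for.add(a["for"])
        elif tag in ("input", "select", "textarea"):
            self.input_ids.append(a.get("id"))

def scan(html):
    """Return a list of rule violations found on the page."""
    s = PageScanner()
    s.feed(html)
    findings = []
    if not s.has_lang:
        findings.append("html element is missing a lang attribute")
    if not s.has_title:
        findings.append("page is missing a title element")
    for input_id in s.input_ids:
        if input_id not in s.label_for:
            findings.append(f"form control {input_id!r} has no associated label")
    return findings
```

A page can pass every rule in a scanner like this and still be unusable with a screen reader, which is the point of the recommendation above to pair automated scans with manual assessment.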

Assessment software and tools:


Third party accessibility assessment services

Many vendors provide accessibility assessment services. The California Department of Rehabilitation (DOR) provides a State Price Schedule (SPS) for Assistive Technologies (AT) to make AT and related services available to California State departments.

Check the market frequently, as several vendors are growing their accessibility services in response to demand from the public.

Logging Defects & Deficiencies

To help with the recording and logging of system defects and deficiencies that will need to be addressed, it is important that the error logging effort be defined in advance. Consider what information the technical team will need to know and include that information in the error reporting forms that will be provided to the users performing the assessment.

The technical staff who will be assisting with the assessment and remediation of the issues should consider what information they would like to receive from the users performing the assessment and what types of reports they want to utilize after receiving the information.

The assessment planning team should plan their input forms based on the desired output deliverables that they will need to create. Consider including surveys and questionnaires that also inquire about the user experience.

Outputs from the assessment may include:

  • Session Logs
  • Issue Log
  • Usability Questionnaire

Communication Plan

Define the communication strategy and communicate the accessibility assessment results to all stakeholders in a timely manner, including, but not limited to:

  • Executives
  • Technical staff
  • Accessibility Committee (if available)

Summarize and format the results so they are meaningful for the intended purpose and audience. For example, provide detailed reports for technical staff and a high-level executive summary for executives.