BUILDING ORGANIZATIONAL AWARENESS AND EXPERTISE IN WEB ACCESSIBILITY

Eric G. Hansen
Educational Testing Service
ETS 12-R
Princeton, NJ 08541
Voice: 609-734-5615
Internet: ehansen@ets.org
FAX: 609-734-1090

Douglas C. Forer
Educational Testing Service
ETS 20-R
Princeton, NJ 08541
Voice: 609-734-5713
Internet: dforer@ets.org

Web Posted on: November 30, 1997


INTRODUCTION

Most people involved in developing or maintaining corporate web sites lack specialized training or information for making sites accessible to people with disabilities. For example, graphics may lack alternate textual descriptions, or screen layouts may be cumbersome to navigate using screen readers. Fortunately, there are concrete steps that can be taken to increase organizational awareness and expertise.
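
To make the first of these problems concrete, the short Python sketch below (an illustration only, not a tool used in the work described in this paper) shows one way an automated check might flag graphics that lack alternate textual descriptions. The file name "page.html" is a placeholder, and the parser-based approach is simply one plausible way to perform such a check.

    # Illustrative sketch: report IMG tags that lack an ALT attribute,
    # one common symptom of pages that are hard to use with screen readers.
    from html.parser import HTMLParser

    class MissingAltChecker(HTMLParser):
        def __init__(self):
            super().__init__()
            self.missing = []  # (line, column) positions of IMG tags lacking ALT

        def handle_starttag(self, tag, attrs):
            # The parser lower-cases tag and attribute names; attrs is a list of pairs.
            if tag == "img" and "alt" not in dict(attrs):
                self.missing.append(self.getpos())

    checker = MissingAltChecker()
    with open("page.html", encoding="utf-8") as f:  # placeholder file name
        checker.feed(f.read())

    for line, col in checker.missing:
        print(f"IMG without ALT text at line {line}, column {col}")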

This paper describes how ETS has undertaken to improve staff expertise and awareness regarding web accessibility. We believe that some features of our approach will be helpful to others.

Educational Testing Service (ETS) has a strong commitment to making its products and services accessible to people with disabilities. For example, ETS and its programs (GRE, TOEFL, SAT, etc.) provide human readers and amanuenses, braille and large print versions of tests, extra time, adaptive technology, and other accommodations to eligible takers of computer-based or paper-based tests. Yet, delivering test information, including practice tests, over the web presents new challenges and opportunities with regard to accessibility.


THE PROJECT

In order to achieve the goal of raising the level of expertise and awareness at ETS about web accessibility for users with visual disabilities, the authors proposed an internal ETS project. The major strategy was to involve project team members in preparations for the evaluation and, in certain cases, as evaluators. The project was intended to provide practical guidance to those charged with the design, development, and maintenance of the ETS web site.

We limited the scope of the project in order to better achieve concrete and practical outcomes. Our focus was on visual disabilities rather than all disabilities. The project would address certain frequently visited portions of the ETS web site (e.g., the home page and GRE sample questions). It would use a selected subset of access tools (a few widely used screen readers and screen-access programs, combined with popular browsers and the DecTalk speech synthesizer). The project was internally funded by ETS.

Features of the Web Access project that seem important to its success include the following.

  • A. Diverse Backgrounds and Representation from Team Members, Consultants, and Other Participants. The Web Access team, which currently has eight regular members, draws on different areas within ETS and includes expertise in publishing (paper and electronic), systems analysis, user interface design, instructional design, disability access, research design, and other domains. One Web Access team member also belongs to, and serves as a liaison to, the ETS Net team, which develops and maintains the ETS web site. Individuals who have print-related disabilities (including blindness) serve as team members and consultants on the project, and their personal knowledge and experience play a central role in it. Team meetings occur approximately twice per month.
  • B. A Systematic Evaluation Methodology. The project is employing a systematic evaluation methodology to provide constructive feedback on the ETS web site. We began by reviewing existing guidelines for web accessibility (DO-IT, 1996; Fontaine, 1995; CPB/WGBH NCAM, 1996; Vanderheiden, 1996) as well as an automatic verifier of certain web site accessibility features (CAST, 1996). We also selected the screen readers and access software, screen magnification software, web browsers, and speech hardware to be used in the evaluation. We chose a particular kind of evaluation methodology, "heuristic evaluation," which has been shown to be cost-effective in a variety of software user-interface evaluation situations (Nielsen, 1994a; Nielsen, 1994b; Levi and Conrad, 1996). Based on a review of the "heuristics" (i.e., design guidelines) used by Levi and Conrad (1996), as well as on our own intuitions and the advice of consultants, team members have developed a draft set of five heuristics tailored to this evaluation. Examples of the draft heuristics are: "Enable all essential information to be accessed auditorially," "Use explicit labeling of content and structural elements," and "Minimize visual strain." Under each major heuristic there are usually several specific subpoints. Three or more usability experts will then individually identify potential usability problems, categorizing each problem under the heuristic that it violates. The three usability experts are individuals who have visual disabilities and also have the requisite familiarity with the web, screen readers, and related technology. After each evaluator has generated a list of problems, they will meet to assemble a composite list and to discuss possible improvements or solutions. The evaluators will then independently assign severity ratings to each problem (0: I don't agree that this is a usability problem; 1: Cosmetic problem only; 2: Minor usability problem; 3: Major usability problem; 4: Usability catastrophe); a simple sketch of how such ratings might be tallied appears after this list. The process may require only a few hours from each evaluator. The results will then be analyzed and reported to the ETS Net team and others.
  • C. A Variety of Ways for Achieving Impact. The project is using a variety of means to achieve a positive impact on the organization, including sharing draft versions of the evaluation heuristics with the ETS Net team to help them integrate these guidelines into practice. Dissemination of information among team members is facilitated by a software program that not only distributes email to all team members but also archives project-related messages where team members can access them with their web browsers. Through their growing acquaintance with these technologies and issues, Web Access team members are becoming resources to the corporation and have been invited to share their expertise in other parts of the organization. Seminars will provide additional opportunities to raise staff awareness and expertise.
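
As an illustration of how the independently assigned severity ratings described under item B might be combined, the short Python sketch below computes a median severity for each reported problem and lists problems from most to least severe. The problems and ratings shown are hypothetical, and this sketch is not part of the project's actual procedures.

    # Illustrative sketch: combine independent 0-4 severity ratings from
    # several evaluators into a median score per problem, then rank problems.
    from statistics import median

    # Hypothetical data: problem description -> one 0-4 rating per evaluator
    ratings = {
        "Home page graphic lacks ALT text": [4, 3, 4],
        "Table layout confuses screen-reader reading order": [3, 3, 2],
        "Unlabeled structural elements in sample questions": [2, 1, 2],
    }

    # Sort problems by median severity, most severe first
    prioritized = sorted(ratings.items(),
                         key=lambda item: median(item[1]),
                         reverse=True)

    for problem, scores in prioritized:
        print(f"{median(scores):.1f}  {problem}  (ratings: {scores})")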

At the CSUN meeting in March 1997, we will report preliminary outcomes of the heuristic evaluation and provide a demonstration of the evaluation process.


REFERENCES

Center for Applied Special Technology [CAST]. (1996). Bobby manual. Peabody, MA: CAST. (http://www.cast.org)

CPB/WGBH NCAM [National Center for Accessible Media] (1996). (http://www.wgbh.org)

Disabilities, Opportunities, Internetworking & Technology [DO-IT]. (1996). Universal design of World Wide Web pages. (Version: 12 November 1996). Seattle, WA: University of Washington. (http://weber.u.washington.edu/~doit/)

Fontaine, P. (1995). Writing accessible HTML documents. Washington DC: Center for Information Technology Assessment, General Services Administration. (http://www.gsa.gov/coca/WWWcode.htm)

Levi, M. D., and Conrad, F. G. (1996). A heuristic evaluation of a World Wide Web prototype. ACM interactions, July/August 1996, 50-61.

Nielsen, J. (1994a). Heuristic evaluation. In Nielsen, J. and Mack, R. (Eds.) Usability inspection methods. New York, NY: John Wiley & Sons.

Nielsen, J. (1994b). Usability inspection methods. Tutorial at the Conference on Human Factors in Computing Systems, Boston, MA (24-28 April 1994).

Vanderheiden, G. C. (1996). Design of HTML pages to increase their accessibility to people with disabilities (Version 6.6, 30 May 1996). Madison, WI: Trace R&D Center, University of Wisconsin--Madison. (http://trace.wisc.edu)