
Web Posted on: December 31, 1998


Cheryl Trepagnier, Ph.D.
(202) 877-1487

Michael J. Rosen, Ph.D.
(202) 877-1960

Assistive Technology Research Center
National Rehabilitation Hospital
102 Irving Street NW
Washington DC 20010

Corinna Lathan, Ph.D.
(202) 319-5095
Department of Biomedical Engineering
Catholic University of America
Washington DC 20064


A collaborative group of investigators from Catholic University of America, National Rehabilitation Hospital and Sister Kenny Institute was funded on October 1, 1998 to pursue research and development in the field of telerehabilitation, and, in addition, to undertake projects on rehabilitative applications of virtual reality technology. This paper presents interim results and demonstrations of technology from four projects:

Prototype Personal Augmentation Devices (PADs), individualized devices controlled by physiologic signals. For children with severe motor restrictions, PADs can serve as means of exploring the environment, and can also provide a rich data stream for cognitive evaluation;

Rehabilitative telemonitoring, an approach to continuing the rehabilitative process after discharge from the rehabilitation hospital. Objective data representing functional status can be acquired in an unobtrusive manner and transmitted to the clinician in a store-and-forward mode. In addition, clinicians can interact with patients to advise on, monitor and demonstrate ADLs using a home-like environment at the rehabilitation hospital;

Assessment and therapy using video telecommunications and touch screens. Special purpose graphics can make the touch screen a response panel for a variety of types of assessment as well as a response mode for persons with speech or language deficits; and

Virtual reality (VR) and gaze sensing technology to improve our understanding of deficits in autism and in left visual-spatial neglect, and to inform development of rehabilitative interventions.


Personal Augmentation Devices (PADs)

It is widely recognized that play has an important function in many aspects of children's cognitive and language development (e.g., Ogura, 1991). This has been part of the rationale for many assistive technology interventions (e.g., Trefler & Cook, 1986) as well as virtual reality applications for children (e.g., Stanton et al., 1996). Similarly, the objective of PADs is to provide motor-disabled children with opportunities to navigate and manipulate the external environment in order to promote curiosity and exploratory behavior, foster cognitive development, and reduce passivity and learned helplessness. What makes them "personal" is that they will be designed to respond to any voluntary physiological signal, e.g. myoelectric activity, eye blink or tongue pressure, rather than coordinated limb movements operating traditional interfaces such as keyboards and mice.
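As an illustration of the kind of signal-to-control mapping just described, the sketch below (a hypothetical example, not the project's implementation) thresholds a rectified, smoothed myoelectric-style signal to generate discrete switch events:

```python
# Hypothetical sketch: mapping a voluntary physiologic signal to a binary
# switch for a PAD. A rectified signal is smoothed into an envelope and
# compared against a calibrated threshold; upward crossings become control
# events (e.g., "move forward"). Signal values and threshold are invented.

def smooth(samples, window=3):
    """Moving-average envelope of a rectified signal."""
    out = []
    for i in range(len(samples)):
        chunk = samples[max(0, i - window + 1): i + 1]
        out.append(sum(abs(x) for x in chunk) / len(chunk))
    return out

def detect_activations(samples, threshold):
    """Return sample indices where the envelope crosses the threshold upward."""
    envelope = smooth(samples)
    events = []
    above = False
    for i, value in enumerate(envelope):
        if value >= threshold and not above:
            events.append(i)
            above = True
        elif value < threshold:
            above = False
    return events

# Toy signal: rest, a voluntary burst, rest, a second burst
signal = [0.1, 0.1, 0.1, 0.9, 1.0, 0.9, 0.1, 0.1, 0.8, 1.0, 0.1]
print(detect_activations(signal, threshold=0.5))  # [4, 9]
```

Because each child's usable signal differs, the threshold would need per-user calibration; the same event stream could also be logged as data for cognitive evaluation.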

Prototype PADs and preliminary data on their acceptance and usability will be presented.


Rehabilitative Telemonitoring

Rehabilitation has traditionally been delivered in medical settings. Its goal, however, is application to real-life function. For the clinician monitoring the individual's recovery of function, information derived from functional activities performed in the individual's home environment is especially valuable. This value is further increased when the information is expressed, at least in part, in quantitative and objective terms.

The challenges for this project include identifying and applying technology for monitoring in order to meet the following specifications:

  • Acceptability to consumers, so that privacy is not violated;
  • Unobtrusiveness, to avoid interference with the individual's daily activities and artifactual effects on the data; and
  • Interpretability and validity, so that the data acquired are useful to the rehabilitative process and can be analyzed automatically within a theory-based framework.

Physical and Occupational Therapists and Speech-Language Pathologists are working with Rehabilitation Engineering research staff to address these questions.

A live demonstration of functional performance monitoring will be provided using unobtrusive sensors installed in a kitchen, in this case the kitchen in the Independence Square® at the National Rehabilitation Hospital. The demonstration will include the prototype information display designed for clinicians viewing performance data, featuring integration of digital photographs with performance timing data, incorporation into report production, and comparison with data from previous monitored events.
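In outline, store-and-forward monitoring of this kind reduces timestamped sensor events to task durations a clinician can compare across sessions. The sketch below is illustrative only; the event names and record layout are assumptions, not those of the actual NRH system:

```python
# Hypothetical store-and-forward monitoring record: timestamped events from
# an instrumented kitchen are reduced to a task duration, which can then be
# compared with durations from previous monitored sessions.

from datetime import datetime

def task_duration(events, start_event, end_event):
    """Seconds between the first start_event and the first subsequent end_event."""
    start = end = None
    for timestamp, name in events:
        if name == start_event and start is None:
            start = timestamp
        elif name == end_event and start is not None:
            end = timestamp
            break
    if start is None or end is None:
        return None
    return (end - start).total_seconds()

session = [
    (datetime(1998, 12, 1, 9, 0, 0), "cupboard_open"),
    (datetime(1998, 12, 1, 9, 0, 40), "stove_on"),
    (datetime(1998, 12, 1, 9, 6, 10), "stove_off"),
]

previous = 380.0  # seconds, from an earlier monitored session
current = task_duration(session, "stove_on", "stove_off")
print(current, current - previous)  # 330.0 -50.0 (task completed faster)
```

A clinician-facing display could pair such durations with the digital photographs mentioned above and flag changes relative to earlier sessions.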

Later phases of this project will involve placement of trial systems into homes of recovering individuals in the greater Washington area, and then in rural homes in Minnesota.


Assessment and Therapy Using Video Telecommunications and Touch Screens

The goal of this project is to extend the utility of home videophone delivery of rehabilitation services by the addition of three novel features:

  1. Image Mobility, i.e., user-friendly mobility and adjustability of the at-home camera, video monitor and lighting, or, alternatively, multiple fixed cameras, to permit individuals at home to bring the remote professional anywhere they choose in their home and demonstrate problems or try out adaptations;
  2. "ADL demonstration studio" capability in National Rehabilitation Hospital's Independence Square®, to provide rehabilitation therapists with the image mobility they need to demonstrate how to manage tasks in a home environment; and
  3. Touch interaction provided by means of a transparent touchscreen mounted on the home video screen and graphic and real-world images transmitted from the hospital computer to the home TV as assessment tasks and communication menus.
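The touch interaction in item 3 amounts to mapping touch coordinates on the overlaid screen to the choice regions of the transmitted assessment graphic. A minimal sketch, under assumed region names and screen geometry (not the project's software):

```python
# Hypothetical response-panel logic: a transparent touchscreen over the home
# video display reports (x, y) touch coordinates, which are matched against
# the on-screen choice regions sent with each assessment task.

def choice_for_touch(x, y, regions):
    """regions: {label: (xmin, ymin, xmax, ymax)}; return touched label or None."""
    for label, (xmin, ymin, xmax, ymax) in regions.items():
        if xmin <= x <= xmax and ymin <= y <= ymax:
            return label
    return None

# Invented two-choice layout on a 640x480 display
task_regions = {
    "yes": (0, 0, 320, 480),    # left half of the screen
    "no": (321, 0, 640, 480),   # right half
}
print(choice_for_touch(100, 240, task_regions))  # yes
```

The same mechanism would let a person with a speech or language deficit answer by touching pictures or symbols rather than speaking.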

Two-way videotelephone interactions and examples of touchscreen assessments in home rehabilitation will be demonstrated.



Virtual Reality and Gaze Sensing

Autism

The social deficit appears to be the most significant of the core deficits of autism (Fein et al., 1986). Deficiencies in face gaze may play a role in children's social impairment (Trepagnier, 1996). One of the goals of this project is to acquire data that bear on that hypothesis.

A number of studies have revealed differences and deficiencies in looking at and interpreting faces on the part of persons with autism (e.g., Hobson, 1986; Tantam et al., 1989). In tasks of identifying people and identifying expressions of emotion, non-disabled controls did best when they could see the upper face. In contrast, the performance of most persons with autism did not show that effect (Langdell, 1978). Similarly, individuals with autism fail to interpret the meaning of others' gaze direction (Baron-Cohen et al., 1997). These findings are consistent with a theory that posits a failure to establish species-typical face processing in early infancy, when the immature visual system can access only configurational information (Trepagnier, 1996). No studies have yet been carried out that directly document face gaze by persons with autism.

Left Neglect

Persons with right hemisphere stroke are particularly vulnerable to disruption of perception of the left side of space. This phenomenon is more noticeable clinically than is right neglect following left hemisphere lesions. One interpretation of the greater salience of left neglect than right neglect is that the right hemisphere is specialized for global perception (Marshall & Halligan, 1994). After stroke on the right side, not only is detail less readily perceived in the part of space opposite to the lesion site, but global perception overall is disrupted.

Faces are generally looked at by non-disabled persons with special attention to the eye area. This may be adaptive, since focusing on the eye area provides maximal information about the relative positions (the configuration) of the rest of the features, including, for example, an upturned corner of the mouth, or a furrowed brow. One of the questions in this study is what happens to stroke patients' face gaze when the face stimuli are rotated so that the informative portion is in the area to which the individual does not attend.

This project's long-term goal is the development of targeted, cost-effective techniques for evaluating and intervening in anomalous visual behavior present in these two different populations. The first stage of this undertaking is the investigation of gaze behavior in response to static presentation of three-dimensional faces and objects, in a recognition memory task. A series of faces and objects will be shown, one at a time, using a head-mounted display (HMD). Following this, these stimuli will be displayed again, interspersed with novel faces and objects. Participants are asked whether each stimulus has been seen before or not. In addition to recording response latency (using manual switches) and accuracy, the location of point of regard is sampled 60 times per second, using a corneal-reflection and pupillary-reflection gaze angle sensor which is installed in the HMD. Face stimuli vary in gender, emotional expression and presence of accessories (e.g., earrings); and objects vary in symmetry.
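One natural reduction of 60 Hz point-of-regard data is the fraction of samples falling in a region of interest, such as the eye area of a face stimulus, as an estimate of relative dwell time. The sketch below is an assumed analysis, not the study's actual software:

```python
# Illustrative gaze analysis: with point of regard sampled at 60 Hz, the
# proportion of samples inside a rectangular region of interest estimates
# the share of viewing time spent on that region during a trial.

def dwell_fraction(samples, region):
    """samples: (x, y) gaze points; region: (xmin, ymin, xmax, ymax)."""
    xmin, ymin, xmax, ymax = region
    hits = sum(1 for x, y in samples
               if xmin <= x <= xmax and ymin <= y <= ymax)
    return hits / len(samples) if samples else 0.0

# One second of simulated 60 Hz gaze data over an invented 640x480 stimulus
eye_region = (200, 100, 440, 200)
samples = [(320, 150)] * 45 + [(320, 400)] * 15  # 45 samples on eyes, 15 elsewhere
print(dwell_fraction(samples, eye_region))  # 0.75
```

Comparing such fractions between upright and rotated faces, or between participant groups, is one way the hypotheses above could be tested against the recorded gaze data.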

If indeed persons with autism are looking at faces in an atypical, non-optimal way, it may be possible to improve the efficacy of therapeutic interventions by incorporating training in looking behavior. Virtual reality presentation, which excludes extraneous visual distractions and permits presentation of more realistic, three-dimensional-appearing stimuli, may become a useful adjunct to diagnostic and monitoring techniques, and a medium for delivering therapeutic interventions, for both of these populations.

Visual records of face gaze by non-disabled individuals and persons with stroke and with autism will be demonstrated as part of the presentation, as well as visuals of the equipment and stimuli.

Future stages of this study will include construction of dynamic environments for investigation of visuomotor behavior as the participant navigates through a social and physical environment.


References

Baron-Cohen, S., Wheelwright, S., & Jolliffe, T. (1997). Is there a 'language of the eyes'? Evidence from normal adults and adults with autism or Asperger syndrome. Visual Cognition 4(3), 311-331.

Fein, D., Pennington, B., Markowitz, P., Braverman, M., & Waterhouse, L. (1986). Toward a neuropsychological model of infantile autism: Are the social deficits primary? Journal of the American Academy of Child Psychiatry 25(2), 198-212.

Hobson, R. (1986). The autistic child's appraisal of expressions of emotion: A further study. Journal of Child Psychology and Psychiatry 27(5), 671-680.

Langdell, T. (1978). Recognition of faces: an approach to the study of autism. Journal of Child Psychology and Psychiatry 19(3), 255-268.

Marshall, J. C., & Halligan, P. W. (1994). Independent properties of normal hemispheric specialization predict some characteristics of visuo-spatial neglect. Cortex 30, 509-51.

Ogura, T. (1991). A longitudinal study of the relationship between early language development and play development. Journal of Child Language 18, 273-294.

Stanton, D., Wilson, P., & Foreman, N. (1996). Using virtual reality environments to aid spatial awareness in disabled children. Proc. First European Conference on Disability, Virtual Reality and Associated Technology, Maidenhead, UK.

Tantam, D., Stirling, S., Monaghan, L., & Nicholson, H. (1989). Autistic children's ability to interpret faces: a research note. Journal of Child Psychology and Psychiatry 30, 623-630.

Trefler, E., & Cook, H. (1986). Powered mobility for children. In E. Trefler, K. Kozole & E. Snell (Eds.), Selected Readings on Powered Mobility for Children and Adults with Severe Physical Disabilities. Washington, DC: RESNA Press.

Trepagnier, C. (1996). A possible origin for the social and communicative deficits of autism. Focus on Autism and Other Developmental Disabilities 11(3), 170-182.


Acknowledgments

The authors gratefully acknowledge U.S. Army Medical Research and Materiel Command support for the Assistive Technology Research Center at National Rehabilitation Hospital (NRH), and National Institute on Disability and Rehabilitation Research funding of the Rehabilitation Engineering Research Center on Telerehabilitation at Catholic University of America, NRH and Sister Kenny Institute.