
Web Posted on: August 24, 1998


MAN/MACHINE INTERACTION WHEN
USING AN ARM ROBOT BY PEOPLE WITH SEVERE DISABILITY

M. Mokhtari
INSERM-CREARE, (U.483)
University Pierre & Marie Curie
9, Quai Saint Bernard
75005 Paris
and
Institut National des Télécommunications
INT
9, Rue Charles Fourier
91011 Evry Cedex

Tel: 33-1-44-27-26-24
Fax: 33-1-44-27-34-38
E-mail: Mounir.Mokhtari@snv.jussieu.fr

N. Didi
B. Grandjean
A. Roby-Brami

INSERM-CREARE
(U.483)
University Pierre & Marie Curie
9, Quai Saint Bernard
75005 Paris

I. Laffont
Rehabilitation Hospital of Garches
Hôpital Raymond Poincaré
104, Bd. Raymond Poincaré
92000 Garches


Abstract:

The aim of our research work is to improve the accessibility and adaptation of assistive devices based on new technologies for people with severe motor impairments.

This project is based on the use of the MANUS II arm robot, manufactured by the Exact Dynamics company in the Netherlands. The robot is mounted on an electric wheelchair and allows the user to telemanipulate objects in an open environment.

We have developed an Assistive Control System (ACS) to improve the manipulation of the Manus robot. Two modes have been added to the existing command architecture: the Point-to-Point Mode (PPM), which allows semi-automated movements of the end-effector, and the Record Mode (RM), which allows on-line saving of gripper positions in order to perform automated tasks in the open working space of the robot.

Moreover, a 3D recording system based on electromagnetic sensors is used to facilitate sensory-motor learning by providing audio feedback to the user during manipulation. Our aim is to improve the learning environment of the Manus.

This paper describes a new approach based on a quantitative evaluation of the user's activity when using a Manus robot to perform daily living tasks, and the improvements developed to promote the use of the Manus robot at home.




1. Introduction

Our study is based on recent motor control theories of sensory-motor learning, used to develop quantitative evaluation methods for analyzing the different factors which can influence human-machine interaction: the human, the machine, and the type of interface. The quantitative evaluation makes it possible to answer questions such as: How long is the Manus used daily? Which kinds of tasks are managed by the robot? How long does it take to perform a specific task? What strategy is used to reach a goal?

The main objective and final outcome of our project is the integration of a reliable and user-friendly arm robot, mounted on a wheelchair, in the homes of severely disabled people who have lost the ability to use their own arms to perform daily living tasks.

The needs of disabled users are highlighted by a quantitative evaluation of upper arm movement (the working space of the upper arm), developed on the basis of a 3D recording system (Spatial Tracking System, STS) [5].

The aim is to compare the user's actions on the input device (joystick, mouse, keypad, switch, etc.) with the result obtained by the system while an action toward a goal is being processed. We record on-line both the set of input commands sent via the CAN bus of the robot and the resulting gripper trajectory measured with the STS.

Audio feedback is used to facilitate motor learning.




2. Command architecture of Manus II

The Manus II robot is an evolution of the Manus I, with improved design, hardware, and software. The Manus is a six-degrees-of-freedom (DOF) arm with a symmetrical gripper, mounted on an electric wheelchair. The existing command architecture includes three pre-set control modes: the Main Mode (MM), which folds the arm in and out and performs the drinking action; the Cartesian Mode (CM), designed to control the position and orientation of the end-effector; and the Joint Mode (JM), which gives direct, independent control of each joint of the arm.

We developed the new architecture in Borland C++ under MS-DOS as a high-level control layer [2]. The original keypad of the Manus is connected to the Manus control box via a PC-based interface. All user actions on the keypad are interpreted by a high-level control unit which, on the one hand, records each action in a logfile for the quantitative evaluation and, on the other hand, sends the corresponding command to the Manus via the CAN bus (Fig. 1).

Figure 1: Instrumentation of the Manus II


The on-line recording of the commands makes it possible to quantify the user's actions and thus to obtain precise numerical data during the learning phase or during skilled use of the Manus.
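
As an illustration, the sketch below shows one possible way to timestamp and store each keypad event; the field names, codes, and plain-text format are assumptions chosen for illustration:

    #include <cstdio>

    // Hypothetical log record: one line per keypad event.
    struct KeypadEvent {
        double time_s;   // time since session start, in seconds
        int    mode;     // active control mode (e.g. coded -3 to -6)
        int    action;   // keypad action code (e.g. coded 0 to 30)
    };

    // Append one event to the session logfile in a plain-text format
    // that can later be parsed for quantitative evaluation or replay.
    void log_event(std::FILE* log, const KeypadEvent& e) {
        std::fprintf(log, "%.3f %d %d\n", e.time_s, e.mode, e.action);
    }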

This software offers a new control organization for the Manus and includes the original command architecture provided with the robot (Fig. 2).


Figure 2: New Manus II architecture


The Point-to-Point Mode (PPM) performs a global movement between the current position of the end-effector and a defined position in the workspace of the Manus. Each point Pf represents a given robot arm configuration and is defined by the gripper position (xi, yi, zi) and orientation (yawi, pitchi, rolli). Thus each gesture Gi performed by the robot corresponds to a global movement that brings the end-effector from any workspace position Pi to the predefined position Pf. Several positions have been implemented, allowing the user to reach an object by choosing the nearest pre-defined position. The final gripping phase is performed manually using the CM.
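
To make the nearest-position rule concrete, the following sketch selects the pre-defined position closest to the current gripper position, assuming a plain Euclidean distance on position (orientation ignored); the Pose type and function names are illustrative:

    #include <cmath>
    #include <vector>

    // Hypothetical gripper pose: position (x, y, z) and
    // orientation (yaw, pitch, roll) of the end-effector.
    struct Pose {
        double x, y, z;
        double yaw, pitch, roll;
    };

    // Euclidean distance between two gripper positions
    // (orientation is ignored when choosing the nearest target).
    double position_distance(const Pose& a, const Pose& b) {
        double dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
        return std::sqrt(dx * dx + dy * dy + dz * dz);
    }

    // Return the index of the pre-defined position Pf closest to the
    // current end-effector position Pi.
    std::size_t nearest_predefined(const Pose& current,
                                   const std::vector<Pose>& predefined) {
        std::size_t best = 0;
        for (std::size_t i = 1; i < predefined.size(); ++i)
            if (position_distance(current, predefined[i]) <
                position_distance(current, predefined[best]))
                best = i;
        return best;
    }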

The Record Mode (RM) gives the user the possibility of recording different arm configurations, which allows repetitive tasks to be performed. A task is first carried out by combining the CM and the PPM. Once the target is reached, the user can record the gripper's operational coordinates (position and orientation) and assign them to one of the keypad buttons. Later on, from any gripper position, and provided the wheelchair has not moved, the user can return to the recorded configuration using the PPM. This facility is intended to let the user take an object from anywhere and then put it back automatically after use.
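
The Record Mode can be pictured as a small table mapping keypad buttons to saved gripper configurations; the sketch below is illustrative, and its class and function names are assumptions:

    #include <map>

    // Gripper pose as in the previous sketch.
    struct Pose { double x, y, z, yaw, pitch, roll; };

    // Hypothetical store of recorded configurations, one per button.
    class RecordedPoses {
        std::map<int, Pose> slots_;  // button id -> saved gripper pose
    public:
        // Save the current gripper pose under the given button.
        void record(int button, const Pose& current) {
            slots_[button] = current;
        }

        // Recall a saved pose into *target; returns false if the button
        // holds none. The caller would then request a PPM movement toward
        // it, valid only while the wheelchair has not moved since recording.
        bool recall(int button, Pose* target) const {
            std::map<int, Pose>::const_iterator it = slots_.find(button);
            if (it == slots_.end()) return false;
            *target = it->second;
            return true;
        }
    };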

The desired global movement of the arm can be stopped at any time by releasing the corresponding button, and the user can deflect the trajectory of the end-effector during the movement. This approach fits the concept of shared control between the user and the robot [1].




3. Quantitative user evaluation method

To get a precise idea of the contribution of an arm robot to the daily life of disabled people, we developed a quantitative evaluation method for its use. The method consists in detecting and memorizing all user actions on the keypad of the Manus, together with their timing.

In addition to the PPM and the RM, a Replay Mode (RPM) has been implemented in the command architecture. It allows the evaluator to replay automatically, off-line, the sequence of actions previously performed by the end-user.
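
A possible off-line replay, assuming the log format of the earlier sketch (one timestamped mode/action pair per line), reproduces the timing of the original session by waiting out each recorded interval; the stubbed functions are illustrative:

    #include <cstdio>

    // Illustrative stubs: the real versions talk to the CAN bus and the
    // system timer; both are hardware-specific.
    void send_command(int mode, int action) {
        std::printf("mode %d, action %d\n", mode, action);
    }
    void wait_seconds(double /*s*/) { /* timer-based delay */ }

    // Replay a session logfile: re-issue each recorded action at the
    // same relative time as in the original session.
    void replay(std::FILE* log) {
        double t, prev = 0.0;
        int mode, action;
        while (std::fscanf(log, "%lf %d %d", &t, &mode, &action) == 3) {
            wait_seconds(t - prev);   // reproduce the recorded pause
            send_command(mode, action);
            prev = t;
        }
    }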

3.1 Evaluation of the learning phase

The first phase of analysis concerned the evaluation of the learning phase of the Manus, considering only its basic command architecture. Six quadriplegic users from the rehabilitation hospital of Garches (four with spinal cord injuries and two with muscular dystrophy) participated in the experiment.

Four tasks were performed by each user (parts of gripping and drinking tasks). Figure 3 shows part of the sequence of actions (arbitrarily coded from 0 to 30) and corresponding modes (arbitrarily coded from -3 to -6) used to perform a specific task.

Figure 3: Representation of the different modes and actions performed by a quadriplegic patient

To evaluate the learning phase of the Manus, we used the histogram representation of Fig. 4, where Total is the time taken to perform the whole task, Rest is the total resting time (the time elapsed between two actions on the keypad), T-Cart is the displacement time of the end-effector, and T-Grip is the orientation time, including the opening and closing of the gripper.

Figure 4: Analysis of the learning phase of the Manus by a quadriplegic patient
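
These measures can be computed directly from the logged actions. The sketch below assumes that each action has been reconstructed as a press/release interval and that the gap between successive actions counts as rest; this attribution rule is an illustrative assumption:

    #include <vector>

    // Hypothetical action interval reconstructed from the log:
    // a button is pressed at t0 and released at t1.
    struct ActionSpan {
        double t0, t1;
        bool   is_gripper;  // true: orientation/open/close; false: displacement
    };

    struct TaskTimes { double total, rest, t_cart, t_grip; };

    // Accumulate the histogram measures of Fig. 4 from a task's spans,
    // assumed sorted by t0 and non-overlapping.
    TaskTimes analyse(const std::vector<ActionSpan>& spans) {
        TaskTimes out = {0.0, 0.0, 0.0, 0.0};
        if (spans.empty()) return out;
        out.total = spans.back().t1 - spans.front().t0;
        for (std::size_t i = 0; i < spans.size(); ++i) {
            double d = spans[i].t1 - spans[i].t0;
            if (spans[i].is_gripper) out.t_grip += d; else out.t_cart += d;
            if (i > 0) out.rest += spans[i].t0 - spans[i - 1].t1;  // gap = rest
        }
        return out;
    }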

3.2 Evaluation of the Assistive Control System

Preliminary results obtained with the participation of six able-bodied people who were well familiarized with the ACS modes showed the time gained in performing a complex "serve and drink" task (Fig. 5). The use of pre-programmed gestures implemented in the command architecture of the Manus leads to fewer user actions.

Figure 5: Evaluation of the ACS (total task duration T-Manip1: with CM only, T-Manip2: with the ACS).


As part of this project, we were interested in comparing two types of user control for the Manus robot. The first experiment is based on a single keypad, as mentioned above, whereas in the second the control is shared between two keypads: a four-button keypad is used to switch between the different modes, while a 4x4 keypad is used to trigger the corresponding actions. A first evaluation was performed with able-bodied people, and this protocol is now being evaluated with disabled people at the rehabilitation hospital of Garches.




4. Learning environment

Sensory-motor learning is very important in the evolution of a disability and in the rehabilitation process [6]. We worked from the assumption that learning would be facilitated by increasing the quantity of information available. For this purpose we developed a system, which we call 'Auditory Biofeedback', that generates auditory information about the movement. It could be used in the context of rehabilitation or in learning the complex commands of assistive aids. In our case, we considered auditory biofeedback for learning telemanipulation with the Manus arm robot.

The 3D motion analysis was performed with the Spatial Tracking System (STS), which uses electromagnetic fields (Polhemus) to provide the position and orientation of the sensors' reference frames relative to stationary axes.

The experiment consists in evaluating a new auditory method to assist the gripping of an object by an arm robot. A first STS marker is fixed at the extremity of the robot end-effector, taking care to avoid electromagnetic perturbation from the metallic structure of the arm, and a second STS marker is fixed to the object. The frequency of the generated sound varies as a function of the distance between the gripper and the target (Fig. 6). The amplitude of the stereo sound is also used: the balance between the right and left channels depends on the position of the target, and the amplitude is equal in both ears when the gripper is on the axis of the target (Fig. 7). This method makes it possible to locate the object using auditory information alone.

Figure 6: Frequency variation of the sound according to the distance

Figure 7: Amplitude (arbitrary units) variation of the sound according to the position of the target
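
The mapping from geometry to sound can be sketched as follows; the particular functional forms (a linear frequency ramp with closeness, a lateral offset driving the stereo balance) are illustrative assumptions rather than the exact mapping used in the system:

    #include <algorithm>
    #include <cmath>

    struct StereoTone { double frequency_hz, left_gain, right_gain; };

    // Map gripper-to-target geometry to an auditory cue:
    // - pitch rises as the gripper approaches the target,
    // - stereo balance shifts toward the side the target lies on,
    //   with equal gain in both ears when the gripper is on the target axis.
    StereoTone auditory_cue(double distance_m, double lateral_offset_m) {
        StereoTone t;
        const double f_far = 200.0, f_near = 2000.0, range_m = 1.0;
        double closeness = 1.0 - std::min(distance_m / range_m, 1.0);
        t.frequency_hz = f_far + (f_near - f_far) * closeness;
        // Balance: zero offset -> equal ears; clamp offset to [-1, 1].
        double b = std::max(-1.0, std::min(1.0, lateral_offset_m / range_m));
        t.left_gain  = 0.5 * (1.0 - b);
        t.right_gain = 0.5 * (1.0 + b);
        return t;
    }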


This auditory method has not yet been evaluated with disabled people. A first trial performed with able-bodied people demonstrated the feasibility of the method. The results showed that vision often dominated audition, because the subjects were allowed to move their heads during the experiment; in this context it is difficult to evaluate the benefit of the audio-guiding method. In contrast, disabled patients have reduced head movements, which cause alignment problems when using the gripper, so auditory biofeedback should assist them during the reaching and gripping phases.




5. Conclusion

In this paper we have investigated the factors which can influence man-machine interaction in the case of persons with severe motor impairments (quadriplegics). This analysis is the first step of a technological development phase aimed at improving assistive technology for disabled people, to offer them a better quality of life and better social and professional integration. To this end, our approach is based on current theories of action mechanisms, motor control, and sensory-motor learning, which guided our developments. We have started to validate these tools and have obtained original results on the use and learning of an assistive aid, which allowed us to delimit the determinants of man-machine interaction: the human, the machine, and the nature of the interface.

Our contribution consists in offering evaluators a method which gives quantitative results through the direct analysis of the man-machine interaction. We have shown, with an example based on the use of the Manus arm robot, that quantitative evaluation gives precise data on the way a robot is used and highlights the problems encountered during the learning phase. An evaluation in real conditions at home is planned, with the aim of identifying which daily tasks are mainly performed with the help of the Manus.

The benefit of the auditory biofeedback is mainly intended for disabled people with reduced head mobility. Objects located at the periphery of the field of vision could be reached by an arm robot thanks to the auditory guiding method which has been developed.




Acknowledgment

The authors would like to thank C. Ammi from the INT, J.C. Cunin and C. Rose from the French Muscular Dystrophy Association (AFM), and H. Stuyt and H. Evers from Exact Dynamics for their help and support. Funds for this project were provided by the AFM and the Institut Garches.




References

[1] R. Chatila, P. Moutarlier, N. Vigouroux, "Robotics for the Impaired and Elderly Persons", IARP Workshop on Medical Robots, Vienna, Austria, 1-2 Oct. 1996.

[2] N. Didi, M. Mokhtari, A. Roby-Brami, "Méthode d'assistance à la télémanipulation de robots pour les personnes lourdement handicapées", Colloque INSERM-INRIA, INRIA Centre de Sophia Antipolis, France, 11-12 Dec. 1997.

[3] H. Kwee, "Integrated control of Manus and wheelchair", in Proc. ICORR'97, Bath University, UK, April 1997, pp. 91-94.

[4] M. Mokhtari, A. Roby-Brami, I. Laffont, "A Method for Quantitative User Evaluation in Case of Assistive Robot Manipulation", in Proc. RESNA'97, Pittsburgh, Pennsylvania, June 20-24, 1997, pp. 420-422.

[5] M. Mokhtari, A. Roby-Brami, "Quantitative Evaluation of Human-Machine Interaction when Using an Arm Robot", submitted to RESNA'98, Minneapolis, Minnesota, June 28-July 2, 1998.

[6] A. Roby-Brami, Y. Burnod, "Learning a new visuomotor transformation: error correction and generalization", Cognitive Brain Research, 1995, 2: 229-242.


