
Web Posted on: January 7, 1999


Towards an improvement of the Manus robot to meet the needs of disabled people

M. Mokhtari
1 Institut National des Télécommunications
Evry cedex, France
2INSERM-CREARE (U.483)
and
University Pierre & Marie Curie
Paris, France

N. Didi
2INSERM-CREARE (U.483)
University Pierre & Marie Curie
Paris, France

A. Roby-Brami
2INSERM-CREARE (U.483)
University Pierre & Marie Curie
Paris, France

Abstract

This paper describes a multi-disciplinary method that permits both the analysis of the usefulness of an assistive aid through quantitative evaluation and the development of a new Assistive Control System (ACS) to improve the Manus robot. The first part presents the quantitative evaluation method used when the disabled people involved in this experimentation operated a Manus robot at home; the second part presents the technological improvements integrated into the Manus to meet the needs of disabled users.


Introduction

The persistence of the problems faced by disabled people in the field of assistive technology is not due to an under-estimation of ergonomic constraints (which are now recognized as essential to the improvement of man-machine systems), but to the fact that ergonomic methods are too qualitative to be used effectively [7]. The aim of our research work is to develop a quantitative approach which provides readable data to the evaluators and is complementary to the qualitative methodology.

During a preliminary evaluation with quadriplegic patients in the Raymond Poincaré Rehabilitation Hospital of Garches, near Paris, it was found that disabled users frequently encountered difficulties in controlling the available assistive devices, even though the technology these systems supplied (environmental control systems, the Manus robot, the Master-Raid workstation, etc.) could often respond to the needs of disabled users (that is, access to home appliances, approaching objects with Manus, drinking and inserting a video tape with Master, etc.). The ergonomic analysis of the situation showed that most of the problems were situated at the level of the man-machine interface [4][5].

Manus, a six Degrees Of Freedom (DOF) teleoperated robotic arm mounted on a wheelchair, is presently commercialized by the Exact Dynamics company in the Netherlands. The French Muscular Dystrophy Association (AFM) has introduced fifteen Manus robots in France to help disabled people become familiar with such technology. The main advantage of the Manus is that it can perform tasks in non-structured environments, which correspond, in general, to the real environments of the end-users. However, the manipulation of such an assistive device by disabled users is not always easy [4].

This paper describes a new approach based on a quantitative evaluation of user activity when using a Manus robot to perform daily living tasks, and the improvements developed to facilitate the use of this robot at home.


Methodology and Objectives

Our goal was to analyze the origin of the problems faced by disabled users of direct manipulation interfaces. To this end, it is first necessary to identify the repetitive tasks performed in daily living with an assistive aid, so that they can be integrated as high-priority tasks. In the case of a robotic arm, the most important tasks according to each end-user must be easily accessible and automated. For example, the task of gripping an object from a shelf is frequently requested by users, so it is possible to define in the command architecture of the robot a semi-automated gesture of the robotic arm which brings the gripper to defined positions near the shelf.

This approach is designed to study the man-machine interaction at two levels:

  1. The result of the evaluation of the Manus robot is essential to analyze the acceptance of such a device by severely disabled people. Many evaluations have been performed with this robot by occupational therapists with patients in different institutions [3], but none of them under real conditions in disabled users' homes. To reach this goal we developed a quantitative evaluation method which makes it possible to obtain accurate evaluation data at home without the permanent presence of the evaluator. We developed a PC-based system, fixed on the back of an electric wheelchair, which records on-line all the actions performed on the input device used to control the Manus robot, and then detects, from the robot's feedback, the movement currently being performed.
  2. The ACS we are proposing provides a semi-autonomous controller for Manus that lessens the number of mundane tasks (by preprogramming commonly used gestures) while still giving the user full control of the robot. We consider the safety and security aspects of any assistive device to be the highest priority in this field. We therefore designed the ACS in accordance with the M3S (Multiple-Master-Multiple-Slave) recommendations. M3S, which is based on a CAN (Controller Area Network) communication bus, is mainly dedicated to the assistive technology field [9]. It is an open, modular and safe bus system in which the input devices (joysticks, keypads, etc.) are independent from the controlled systems (wheelchairs, robots, environmental control systems, etc.), and any peripheral input can control several output systems.
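The M3S principle above, inputs decoupled from outputs over a shared bus, can be illustrated with a minimal sketch. This models only the routing idea, not the actual M3S/CAN protocol; all names (`Bus`, the target strings) are ours for illustration.

```python
# Conceptual sketch of the M3S routing principle: any input device can
# control several output systems attached to one shared bus.

class Bus:
    def __init__(self):
        self.outputs = {}

    def attach(self, name, handler):
        """Register an output system (wheelchair, robot, ...) on the bus."""
        self.outputs[name] = handler

    def send(self, target, command):
        """Route a command from any input device to the named output."""
        self.outputs[target](command)

bus = Bus()
received = []
bus.attach("wheelchair", lambda cmd: received.append(("wheelchair", cmd)))
bus.attach("manus", lambda cmd: received.append(("manus", cmd)))

# The same keypad (input) drives two different output systems:
bus.send("manus", "GRIPPER_OPEN")
bus.send("wheelchair", "FORWARD")
```

In the real M3S design the bus also carries safety and status traffic; the sketch shows only why input and output devices can be mixed freely.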

Quantitative User Evaluation method (QUE)

The first step of the experiment with the Manus was to validate our method on site, with the participation of six quadriplegic users (4 with spinal cord injuries and 2 with muscular dystrophy), before starting the evaluation at home. We fixed the Manus near a working place and asked novice users to follow a precise scenario of tasks involving different objects and several positions in the working space of the robot (Fig. 1). All the data were recorded on-line and analyzed afterwards [1]. This short-term evaluation period, of about half an hour per subject, allowed us to develop tools based on Matlab, Kronos and Excel software to analyze the different parameters.

The second step consisted of installing the Manus robot on the wheelchairs of the end-users who agreed to participate in this experimentation. This was done thanks to the support of the French Muscular Dystrophy Association (AFM), which bought five Manus robots for this purpose. Four robots are currently used at home by people with quadriplegia due to muscular dystrophy.

A laptop PC was fixed on the back of the wheelchair. Each command sent to the robot is detected, interpreted by supervisor software, and stored in ASCII files that include the time parameter.
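The logging step described above can be sketched as follows. This is an illustrative reconstruction, not the paper's actual file format: the field names (mode, action) and the tab-separated layout are our assumptions.

```python
import time

def log_event(logfile, mode, action):
    """Append one user command to an ASCII log with its time parameter.

    Hypothetical sketch of the on-wheelchair logger: each keypad event is
    written with a timestamp so the evaluator can reconstruct the session.
    """
    with open(logfile, "a") as f:
        f.write(f"{time.time():.3f}\t{mode}\t{action}\n")

# Example: the user opens the gripper while in Cartesian Control Mode.
log_event("session.log", "CCM", "GRIPPER_OPEN")
```

Appending plain ASCII lines keeps the recording robust (a crash loses at most the last line) and lets the weekly data recovery be a simple file copy.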

It is important to note that the evaluation at home has some constraints we have to deal with. The evaluation depends on the motivation and the availability of the person at home. Unfortunately, the disabled users involved in this evaluation are very busy; they spend the whole day at school or in the rehabilitation center. The idea was therefore to have a long-term evaluation period covering several days without any intervention by the evaluator. At the end of each week an ergonomist visits the user at home to recover the data and review the user's impressions of this technology.

Example:

During an evaluation at the home of a person with muscular dystrophy causing quadriplegia of all four limbs, we considered a specific task: gripping the TV remote control and bringing it near her hand using only a keypad to control the robotic arm. We observed that more than 70% of the task duration (9.21 minutes) corresponded to rest time. In figure 2 we represent the total number of key presses (Total Actions), the number of actions performed by the robot (RA), and the number of intervals between two consecutive actions (No Action), which correspond to rest time or to the time needed to decide on or find the desired action. We also counted the number of times a warning message occurred during a specific task. A warning message means that the robot reached the limits of its working space; any action pushing the arm outside this working space is considered a warning action. Preliminary results on transition times between modes and actions showed that some actions belonging to different modes must be made available in several modes in order to minimize switching between modes. A first proposal for the organization of modes and actions on the keypad is now available. This type of information is useful for the ergonomic design of the keypad interface.
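The rest-time measure discussed above can be computed from the logged timestamps. The sketch below is ours, under assumed data: a list of (timestamp in seconds, event) pairs and a gap threshold separating "No Action" intervals from ordinary key-press cadence; both the threshold and the toy session are made up.

```python
def rest_time_fraction(events, gap_threshold=2.0):
    """Return the fraction of task duration spent with no action.

    Gaps longer than gap_threshold between consecutive logged events
    count as rest (or decision/search) time.
    """
    rest = 0.0
    for (t0, _), (t1, _) in zip(events, events[1:]):
        gap = t1 - t0
        if gap > gap_threshold:
            rest += gap
    total = events[-1][0] - events[0][0]
    return rest / total if total > 0 else 0.0

# Toy session: one long pause between the second and third robot action.
session = [(0.0, "RA"), (1.0, "RA"), (11.0, "RA"), (12.0, "RA")]
print(rest_time_fraction(session))  # 10 of the 12 seconds are rest time
```

The same pass over the log can count Total Actions, RA, No Action intervals and warnings, which is how the figures discussed above could be tabulated.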

Figure 2

Figure 2: Example of a task performed using the Manus robot

The ACS Hardware and Software Organization

As described in [2], the ACS is developed on a PC. It supports as inputs a 16-digit keypad, a 3D joystick, and a trackball. The trackball requires a screen that displays a user-friendly Graphical User Interface (GUI); this screen represents a keypad that helps the user to send commands to Manus, via the CAN bus, by clicking on buttons. Users can also find there Manus status and warning messages that are helpful during manipulation. This GUI can be used, for example, during the user's learning period. The software is compatible with Manus II (an improved design and hardware version of the first Manus that offers a CAN plug-in input) and is organized as shown in Fig. 2. The command from the user is translated by the IDATM, according to the input device used, into a standard code that is processed by the HCM. The PPM generates the adequate trajectory to be followed by the robot end-effector. The communication between the MCM and the HCM is done by the two CAN modules PC-CTM and PC-CRM.


The Gesture Library

The gesture library contains a set of generic global gestures that help disabled people to perform complex daily tasks. These gestures correspond to only a portion of any particular task. Each gesture (Gi) is characterized by an initial operational variable of the robot workspace (Oii), corresponding to the initial robot arm configuration, and a final operational variable (Oif), corresponding to the final robot arm configuration. Each variable (Oi) is defined in Cartesian space by the gripper position (xi, yi, zi) and orientation (yawi, pitchi, rolli). The gestures generated by our system are linked only to the final operational variables: the path planner is able, from any initial arm configuration, to generate the appropriate trajectory to reach the final configuration. We have prerecorded twelve final operational variables, as described in [2], and allow the user to record two others.
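The structure of an operational variable and of the library can be sketched as below. The class and dictionary names are ours, and the pose values are invented placeholders, not the paper's prerecorded configurations.

```python
from dataclasses import dataclass

@dataclass
class OperationalVariable:
    """Gripper pose in Cartesian space: position plus orientation."""
    x: float
    y: float
    z: float
    yaw: float
    pitch: float
    roll: float

# Library of final operational variables (Oif), keyed by keypad button.
# Twelve are prerecorded; "P1" and "P2" are user-recorded slots.
gesture_library = {
    "FL": OperationalVariable(0.4, 0.0, -0.3, 0.0, -90.0, 0.0),  # floor grasp
    "OD": OperationalVariable(0.5, 0.2, 0.8, 0.0, 45.0, 0.0),    # open door
    # ... remaining pre-set buttons and the "P1"/"P2" user slots
}

def plan_gesture(current: OperationalVariable, key: str) -> OperationalVariable:
    """A gesture is defined only by its final variable: the path planner
    can start from any current arm configuration."""
    return gesture_library[key]
```

Storing only final poses is what makes the gestures "global": the same button works regardless of where the arm happens to be.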

  • IDATM: Input Device Actions Translator Module
  • LRFM: Log and Replay Files Module
  • GUI: Graphical User Interface
  • PPM: Path Planner Module
  • HCM: High Control Module
  • MCM: Manus Control Module
  • PC-CTM: PC CAN Transmission Module
  • PC-CRM: PC CAN Reception Module

Figure 2: The ACS hardware organization

The ACS MODES organization

In addition to the Cartesian Control Mode (CCM) and the Joint Control Mode (JCM) existing in the commercialized version of Manus (the first allows the user to control manually the arm and gripper motion in Cartesian space, whereas the second allows direct and separate control of the six arm joints), the ACS offers three other modes, which we have called the Point-to-Point Control Mode (PPCM), the Record Mode (RM) and the Replay Control Mode (RCM). Fig. 4 shows the ACS modes organization. The gestures from the library described above are activated by the user in the PPCM. In this mode, each button of the keypad generates a gesture following the keypad mapping shown in Fig. 5. The 3x3 matrix of pre-set buttons corresponds to nine pre-set configurations of the robotic arm, following a vertical grid in front of the user. For example, if the user wishes to reach a target in the down-left position (on the left side of the robot), he/she may push the button "DL", which will bring the robot end-effector towards this position. The button "OD" generates a gesture towards an arm configuration that allows the user to open a door or grasp an object from the top, the button "FL" generates a gesture to grasp an object from the floor, the button "US" is a back gesture towards the user, and the buttons "P1" and "P2" generate gestures towards two user pre-recorded robot configurations. These two robot configurations are recorded in the RM. The RCM, which is not accessible from the user input device, allows, for example, evaluators to replay off-line a saved sequence of actions previously performed by the disabled patient.

Figure 3: Representation of the two robot configurations that characterize a gesture

Figure 4: The ACS modes organization

Figure 5: The keypad pre-set mapping in the PPCM
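The PPCM keypad layout described above can be sketched as a lookup structure. Only the button codes quoted in the text ("DL", "OD", "FL", "US", "P1", "P2") come from the paper; the other grid codes are our own guesses for illustration.

```python
# 3x3 grid of pre-set buttons addressing nine arm configurations on a
# vertical grid in front of the user. Rows: upper, middle, down.
# Only "DL" (down-left) is named in the text; other codes are assumed.
PRESET_GRID = [
    ["UL", "UC", "UR"],  # upper row (assumed codes)
    ["ML", "MC", "MR"],  # middle row (assumed codes)
    ["DL", "DC", "DR"],  # down row; "DL" brings the gripper down-left
]

# Special buttons named in the text, with the gestures they trigger.
SPECIAL_BUTTONS = {
    "OD": "open a door / grasp an object from the top",
    "FL": "grasp an object from the floor",
    "US": "back gesture towards the user",
    "P1": "user pre-recorded configuration 1",
    "P2": "user pre-recorded configuration 2",
}

def button_at(row, col):
    """Return the pre-set button code at a grid position (0-indexed)."""
    return PRESET_GRID[row][col]

print(button_at(2, 0))  # "DL": the down-left target position
```

Keeping the mapping in one table makes it easy to reorganize buttons as the ergonomic evaluation suggests, without touching the control code.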


Conclusion

The main objective of our research work is to make explicit the different factors that can influence man-machine interaction for people with severe motor disabilities using assistive technology. This will help guide technological development in the field of assistive technology. To reach this goal, we have developed a multi-disciplinary method based on recent theories of action mechanisms, motor control, and sensory learning. Our contribution consists of providing evaluators with a method that quantifies the evaluation by supplying accurate data that can be used effectively.

The preliminary results obtained with patients having motor disabilities allowed us:

  1. to obtain original data on the use of the robot in the homes of people with muscular dystrophy, and on the problems encountered during normal use of such a complex system.
  2. to bring some improvements to the system. The first trials with disabled patients showed their interest in the ACS. The results obtained are only preliminary, and we cannot yet draw conclusions about the real contribution of the new ACS modes to the execution of complex tasks.

This multi-disciplinary method could be extended to a general methodology to predict which types of assistive aids and which types of user interfaces could best fit the needs of disabled people.

This research work initiated the European Commanus project (Biomed-Craft DG VII), starting in November 1998 for two years, which involves a deep reorganization of the hardware and software of the new Manus that will be available on the market. More specifically, the main objective of our study in this project is to produce knowledge on the real functioning of the man-machine interaction, in order to inform future development decisions. This method will be used for an iterative evaluation process throughout the project.


Acknowledgments

The authors would like to thank C. Ammi from the INT, and J.C. Cunin and C. Rose from the AFM, for their help and support. Funds for this project are provided by AFM grant, INSERM and INT.


References

[1] Boadella J, "Evaluation of Manus robot at home". Technical report. INSERM-CREARE (U.483). September, 1998.

[2] N. Didi, B. Grandjean, M. Mokhtari, A. Roby-Brami, "An Assistive Control System to the manipulation of the Manus arm robot", RESNA '98, pp. 289-291, Minneapolis, Minnesota. June 1998.

[3] Leclaire G., "Preliminary results on the readaptative evaluation of Manus II". Internal report APPROCHE association. France. April, 1997.

[4] Mokhtari M., " Contribution to the installation of an evaluation site for severely disabled people". Master research thesis, Institut National des Télécommunications. June 1994.

[5] M. Mokhtari, N. Didi, A. Roby-Brami, "Quantitative Evaluation of Human-Machine Interaction when Using an Arm Robot", RESNA '98, pp. 289-291, Minneapolis, Minnesota. June 1998.

[6] Mokhtari M, Didi N, Grandjean B, Roby-Brami A, Laffont I, "Man-Machine interaction when using an arm robot by people with severe disability". IOS Press, Assistive Technology Research Series, vol. 4. DG XIII European Commission. June 1998. pp. 228-233.

[7] Senach B., "Ergonomics evaluation of the human-machine interfaces: review of literature". Research report. INRIA. Prog. 8 Human-Machine Communication. N°1180. March 1990.