
HUMAN-COMPUTER INTERACTION WITHIN ROBOTIC WORKSTATIONS

John L Dallaway, Department of Engineering, University of Cambridge, UK

ABSTRACT

This paper describes recent research in human-computer interfaces for the command of rehabilitation robot systems. An iconic interface providing drag and drop interaction has been developed. This interface integrates with the Cambridge University Robot Language (CURL) to facilitate intuitive operation of structured robotic workstations by users of pointing input devices. The evaluation of this interface is presented in the context of experimental work with a number of alternative user interfaces.

BACKGROUND

CURL is an established interactive robot control environment which has recently been commercialised [1]. The CURL environment facilitates both direct control and task-level control of a robot arm and has been specifically designed for applications in rehabilitation. The software runs under Microsoft Windows and embodies the standard look and feel of Windows applications. It also incorporates a Dynamic Data Exchange (DDE) server which allows robot commands to be initiated from other application software running on the Windows desktop. A number of software-based access products are now available for Windows. These products enable non-keyboard users to gain full control of all applications on the Windows desktop using alternative input devices. It has not, therefore, been necessary to make provision for non-keyboard users within CURL itself.

Results from earlier research have indicated a requirement for user interfaces more closely tailored to the tasks of a specific robot installation [2]. Others have observed that shortcomings in the user interface can act as major deterrents to the widespread adoption of assistive robotic devices [3]. The work described in this paper has arisen from the search for a rapid development and evaluation strategy to facilitate the implementation of alternative customised user interfaces.
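As an illustration of the DDE route into CURL, the sketch below shows how a companion Windows application might issue a robot command through a DDE conversation. It is written in Python using the pywin32 dde module purely for brevity; the service name, topic name and command string are assumptions made for illustration, not the documented CURL interface.

    # Minimal sketch: sending a command to a CURL-style DDE server from
    # another Windows application. The service/topic names and the
    # command string are illustrative assumptions.
    import win32ui  # pywin32 requires win32ui before the dde module
    import dde

    server = dde.CreateServer()
    server.Create("IconicInterface")           # register our DDE client

    conversation = dde.CreateConversation(server)
    conversation.ConnectTo("CURL", "command")  # assumed service/topic pair

    # Ask the robot control environment to run a task-level procedure.
    conversation.Exec("move_peg peg1 rack2")   # hypothetical command syntax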

AN ICONIC INTERFACE

The benefits of graphical user interfaces which facilitate the direct manipulation of data are now widely acknowledged. Substantial research in the area of generic direct manipulation interfaces has demonstrated the following beneficial concepts [4]:

  • Continuous representation of objects and actions of interest
  • Use of labelled buttons rather than a complex syntax
  • Incremental operations which are readily invoked and have an immediate effect

In designing an iconic user interface providing drag and drop interaction, the author has attempted to realise these benefits in the context of robot command. The prototype interface was developed using Microsoft Visual Basic 3.0 running under Microsoft Windows 3.1. This programming environment is well suited to user interface development in that visual components may be assembled rapidly and evaluated 'on screen' prior to the coding of underlying actions.

The interface allows each significant object within a robotic workstation to be represented by an icon within an interface window. The icons may be arranged to replicate the spatial relationships existing between the objects they represent. Each icon has an associated caption which is displayed as text immediately below the image. In addition, an image may be displayed behind the icons to clarify their meaning as necessary. The facility to drag an icon and drop it over another icon may be enabled on an individual icon basis. Robot tasks may be associated with specific combinations of drag and drop operations by entering CURL commands into a command matrix. Further tasks may be associated with the action of double clicking on the icons. Commands are sent to CURL using Dynamic Data Exchange.
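Conceptually, the command matrix is a lookup table keyed on (dragged icon, drop target) pairs. The following Python sketch shows one possible realisation; the icon names, the command strings and the send_to_curl helper are hypothetical stand-ins for the DDE mechanism described above.

    # Sketch of a drag-and-drop command matrix. Icon names and CURL
    # command strings are hypothetical examples.
    COMMAND_MATRIX = {
        ("peg1", "rack2"): "move_peg peg1 rack2",
        ("peg1", "hole3"): "move_peg peg1 hole3",
        ("peg2", "rack1"): "move_peg peg2 rack1",
    }

    # Further tasks bound to a double click on a single icon.
    DOUBLE_CLICK_TASKS = {
        "gripper": "open_gripper",
    }

    def send_to_curl(command):
        # Forward the command string to CURL, e.g. over the DDE link
        # sketched earlier (hypothetical helper).
        print("-> CURL:", command)

    def handle_drop(dragged_icon, target_icon):
        # Invoked by the GUI toolkit when one icon is dropped on another.
        command = COMMAND_MATRIX.get((dragged_icon, target_icon))
        if command is not None:
            send_to_curl(command)  # this combination has an associated task

    def handle_double_click(icon):
        command = DOUBLE_CLICK_TASKS.get(icon)
        if command is not None:
            send_to_curl(command)

    # Example: the user drags the 'peg1' icon and drops it over 'rack2'.
    handle_drop("peg1", "rack2")

Enabling drag and drop on an individual icon basis then amounts to attaching or omitting the drop callback for that icon.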

INTERFACE EVALUATION

An evaluation of the prototype iconic interface was undertaken with the assistance of five potential users of rehabilitation robot technology. Alternative commercial user interfaces were also evaluated to facilitate a comparison of the current interface technologies. The Windows Visual Keyboard (WiViK) Scan 2.1 was used, both with a pointing input device and with a single switch input device [5]. The Voice Pilot utility from the Microsoft Windows Sound System 2.0 was also employed. This utility provides keystroke emulation using discrete-utterance, speaker-dependent speech recognition.

The trial participants were invited to perform a cognitively simple task which involved the transfer of pegs between holes and racks in a specified sequence. This task was based on that used within the Interactive Robot Quantitative Assessment Test (IRQAT) [6]. Each peg transfer was performed using a pre-programmed CURL procedure to command an unmodified RTX robot arm. The task was repeated four times by each participant, once for each of the four user interfaces. None of the trial participants were able to perform the task unaided.

Each participant selected an appropriate pointing device with which to interact with the prototype iconic interface and the WiViK interface. The button of an IBM-compatible games port joystick was used for switch input to the WiViK interface. Default interface configuration parameters were used wherever possible. Voice Pilot was trained using three examples of each valid utterance. Templates which were found to be unreliable were retrained as necessary.

Visual elements of the user interfaces are shown in figure 1. In the case of the iconic interface, peg transfer was initiated by dragging one of the peg icons and dropping it over a rack or hole icon. With the WiViK interface, the user first selected a peg object; the macro keyboard was then replaced by a second keyboard from which a rack or hole destination was selected. Appropriate utterances were displayed by the Voice Pilot interface as a command sentence was formed.

Figure 1. Visual elements of the user interface

Figure 2. Interface evaluation results

RESULTS

A questionnaire was completed by each participant at the conclusion of the trial. The questionnaire invited comment on the relative merits of each user interface under evaluation. The standard CURL log facility was used to record the time at which each interaction with the user interface occurred. The resulting log file was analysed to extract the time taken to initiate each peg transfer and the number and nature of any user errors. Timings which were clearly in error due to interruptions in the evaluation procedure were discarded. The mean values of the remaining data are presented graphically in figure 2, with a different shading pattern used to represent the data of each trial participant.

Four of the five trial participants stated that the iconic interface was both the easiest to understand and their interface of preference. These participants used a variety of pointing devices including a mouse, a trackball and a headmouse. The remaining participant was unable to use the iconic interface due to input device compatibility problems.

No errors were made by any of the users when working with either the iconic interface or the WiViK interface in conjunction with a pointing input device. Errors were made when using WiViK Scan with a switch input device, but their incidence was reduced as the users became more accustomed to the scanning speed. Command errors occurring with the Voice Pilot interface were due entirely to misrecognition of utterances. The limited duration of each trial session did not permit this interface to be fully optimised. However, the potential benefits of this form of interaction were evident to the majority of trial participants.
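The timing analysis amounts to differencing consecutive log records. A minimal sketch follows, assuming a simple one-record-per-line log of "<seconds> <event>" form; the actual CURL log format may differ.

    # Sketch: extract peg-transfer initiation times from an interaction
    # log. The "<seconds> <event>" record format is an assumption.
    def initiation_times(log_lines):
        times = []
        last_complete = None
        for line in log_lines:
            stamp, event = line.split(maxsplit=1)
            stamp = float(stamp)
            if event.startswith("complete"):
                last_complete = stamp        # previous transfer finished
            elif event.startswith("command") and last_complete is not None:
                times.append(stamp - last_complete)  # time to initiate
                last_complete = None
        return times

    log = [
        "0.0 complete setup",
        "6.2 command move_peg peg1 rack2",
        "14.8 complete move_peg",
        "19.1 command move_peg peg2 hole1",
    ]
    times = initiation_times(log)
    print(sum(times) / len(times))  # mean initiation time in seconds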

CONCLUSIONS

The results indicate that the prototype iconic interface is an acceptable and efficient interface for robot command. The continuous representation of objects and actions of interest was assessed to be beneficial by all participants. They also appreciated the use of icons to reinforce the meaning of each object and the ability to replicate the spatial relationships of objects within the interface. Further work is necessary to extend the flexibility of the iconic interface for use with more complex robotic workstations. There is also potential for the use of iconic interfaces in conjunction with switch input devices. Extended trials of the existing prototype interface are planned using the RAID2 robotic workstation [7] within an office environment.

ACKNOWLEDGEMENTS

This work has been funded by the Leverhulme Trust (ref F/452/B) and the European Commission TIDE programme (ref TP1024). Additional support and trial facilities were provided by the Papworth Group, UK and AmuHadar, Sweden. The author acknowledges the substantial contribution of the late Robin Jackson.

REFERENCES

[1] Dallaway JL, Mahoney RM, Jackson RD, Gosine RG (1993) An interactive robot control environment for rehabilitation applications. Robotica. 11. 541-551.

[2] Dallaway JL, Jackson RD, Mahoney RM (1994) The user interface for interactive robotic workstations. Proceedings of the 1994 IEEE/RSJ International Conference on Intelligent Robots and Systems. 1682-1686.

[3] Leifer L (1992) RUI: factoring the robot user interface. RESNA 92 - Proceedings. 580-583.

[4] Shneiderman B (1992) Designing the user interface. Addison-Wesley.

[5] Shein F, Hamann G, Brownlow J, Treviranus J, Milner M, Parnes P (1991) WiViK: A visual keyboard for Windows 3.0. RESNA 91 - Proceedings. 160-162.

[6] Mahoney RM, Jackson RD, Dargie GD (1992) An interactive robot quantitative assessment test. RESNA 92 - Proceedings. 110-112.

[7] Danielsson C, Holmberg L (1994) Evaluation of the RAID workstation. Proceedings of the Fourth International Conference on Rehabilitation Robotics. 7-11.

Dr JL Dallaway, Department of Engineering, University of Cambridge, Trumpington Street, Cambridge CB2 1PZ, United Kingdom
