
Design of a Haptic Graphing System

Jason P. Fritz, Kenneth E. Barner

Applied Science and Engineering Laboratories,

University of Delaware / Alfred I. DuPont Institute

Abstract

Interpretation and understanding of complex scientific data is enhanced by graphic representation. Haptic display of such graphical data adds kinesthetic feedback that further improves, or in some cases first enables, conceptualization. In this paper, we present a method for the haptic display of two dimensional data plots using a three degree of freedom force feedback device linked to a speech synthesizer to aid in navigation. This system is designed to benefit visually impaired and blind persons, as well as sighted scientists, teachers, and students who need or desire an alternate means to explore information traditionally displayed visually.

Background

The use of computers to model complicated sets of data into a sensible form dates back to at least 1960. Visualization of information is becoming generally recognized as an important tool, and a major application of Virtual Environments (VEs), in a society undergoing a dramatic explosion of information. Many people are discovering that visual techniques (e.g., computer graphics), combined with additional auditory and/or haptic channels, allow the conversion of complex information into a more easily understandable format (1,2).

In the past, this typically meant plotting two dimensional data points on a two dimensional surface such as graph paper. At best, the projection of three dimensional data onto some form of two dimensional display, such as a computer screen, helped with data interpretation. Basic research into tactual perception indicates that the introduction of kinesthetic information, such as that provided by a haptic display, increases an individual's ability to perceive and understand nonvisual information, which is especially beneficial for individuals without sight (2,3).

System Design

The haptic graphing system consists of three major components: the haptic interface hardware, the graphing system software, and the speech synthesizer.

The hardware used is the PHANToM Haptic Interface from SensAble Devices, Inc. (4). This device generates forces on a stylus, or on the user's fingertip in a thimble. Both the thimble and stylus are attached to a gimbal mechanism allowing three rotational degrees of freedom. For this system, the stylus is used, particularly because it includes a switch for user input. Forces are generated only at a single point, the Interface Point (IP), located at the intersection of the three gimbal axes. The nominal position resolution within the 8x17x25 cm workspace is 0.07 mm, allowing high spatial frequencies to be rendered (e.g., a crack less than 0.5 mm wide can be felt). Bandwidth constraints dictate the stiffness of virtual objects and the dynamic range of forces (the control loop bandwidth is asynchronous at about 3 kHz). The position sampling rate, i.e., the rate at which the program runs, cannot be less than 800 Hz. If these constraints are not adhered to, hardware instability may arise in the form of uncontrolled oscillations, although such oscillations can also come from other sources. With this device, the user can feel the lines or surfaces of a data plot that would traditionally be represented only graphically.

To aid in navigation and understanding of the data, the Text To Speech (TTS) system developed at the Applied Science and Engineering Laboratories is used to speak the current location (data value) on the plot. Any additional information can also be accessed if necessary. Because of the bandwidth constraints of the PHANToM, and because the PHANToM runs on a PC while TTS runs only on Unix systems, TTS is accessed via TCP/IP sockets. The lag time for this communication is minimal.
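As a sketch of this link, the PC side might simply send the text to be spoken over a TCP connection. The host name, port, and newline-terminated message format below are hypothetical illustrations, not the actual ASEL TTS protocol:

```python
import socket

def speak(text, host="tts.example.edu", port=7000):
    """Send a string to a TTS server over TCP/IP.

    The host, port, and newline-terminated protocol here are
    hypothetical; the actual TTS interface may differ.
    """
    with socket.create_connection((host, port), timeout=1.0) as s:
        s.sendall(text.encode("ascii") + b"\n")
```

Because the message is a short text string sent once per button press, the round-trip cost of the socket write is negligible compared to the haptic servo loop.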

Development

The development focuses on taking advantage of the quality and versatility of the haptic interface. As a first step towards haptically rendering an n dimensional data set, a two dimensional haptic graphing system is designed. The two dimensional plot is defined on a virtual wall, hereafter known as the plot wall. All virtual walls are defined by the plane equation Ax + By + Cz + D = 0. Due to the bandwidth constraints mentioned above, the calculated force is proportional to the distance of the IP into an object surface (the spring model F = kx). Once the wall is placed, the origin of the graph is selected, which provides the matrix to transform the data into the coordinate system of the haptic interface. Ideally, a haptic plot can be added to any virtual environment, so a bounding box is placed around the graph for basic collision detection, which determines force vectors. In this manner, collision detection for the individual graph items is not computed until the IP is within the bounding box, thereby saving valuable computation time within a complex VE.
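A minimal sketch of the wall spring model and bounding box gate described above (the stiffness value and function names are illustrative, not taken from the original implementation):

```python
import numpy as np

def wall_force(ip, normal, d, k=0.5):
    """Spring-model force for a virtual wall defined by Ax + By + Cz + D = 0.

    ip: interface point position (3-vector); normal: (A, B, C);
    d: plane offset D; k: stiffness (illustrative value).
    Returns a restoring force, nonzero only when the IP has penetrated
    the wall (signed distance to the plane is negative).
    """
    n = np.asarray(normal, dtype=float)
    n = n / np.linalg.norm(n)
    depth = np.dot(n, ip) + d        # signed distance of the IP to the plane
    if depth >= 0.0:                 # IP on the free side: no force
        return np.zeros(3)
    return -k * depth * n            # F = kx, pushing back along the normal

def inside_bounding_box(ip, lo, hi):
    """Coarse collision gate: graph items are only tested when the IP
    is inside the plot's axis-aligned bounding box."""
    return bool(np.all(ip >= lo) and np.all(ip <= hi))
```

Per-item collision detection and force computation run only when `inside_bounding_box` is true, which is what saves computation in a larger VE.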

The data to be plotted can be produced by any software that can output an ASCII text file containing pairs of [x,y] data points. The previously calculated transformation matrix is then used to determine the location of the data points in the haptic environment, located on the plane of the plot wall. It is not necessary to render the entire data set since the IP can only contact one point on the data curve at any given instant. A linear interpolation of the data determines the piecewise linear segments that are rendered when the IP is between any two abscissa values. For example, say that there are data points with abscissa values at 1, 2, 3, and 4. When the IP is inside the bounding box, and between points 2 and 3, only the line between 2 and 3 is active, as seen in Figure 1.
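The single-active-segment scheme above can be sketched as follows (the function names and the use of a binary search over the sorted abscissa values are illustrative assumptions):

```python
import bisect

def active_segment(xs, ys, x_ip):
    """Return the endpoints of the one line segment that is active for
    the current IP abscissa x_ip. xs must be sorted ascending.
    Returns None when the IP is outside the data range."""
    if x_ip < xs[0] or x_ip > xs[-1]:
        return None
    i = bisect.bisect_right(xs, x_ip) - 1
    i = min(i, len(xs) - 2)          # clamp so xs[i+1] exists at the right edge
    return (xs[i], ys[i]), (xs[i + 1], ys[i + 1])

def interpolate(seg, x_ip):
    """Piecewise linear ordinate on the active segment."""
    (x0, y0), (x1, y1) = seg
    t = (x_ip - x0) / (x1 - x0)
    return y0 + t * (y1 - y0)
```

With abscissa values 1, 2, 3, and 4, an IP at x = 2.5 activates only the segment between the points at 2 and 3, matching the example in the text.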

Figure 1: Front view of plot (HIP is Haptic IP)

Recall that the force generated by the application acts on a single point. This force is typically proportional to the distance of penetration into the object by the IP, similar to the wall representation. Since a line has no volume, the graph lines are represented in three dimensions with cylinders or planes.

Given that the graph segments are represented using cylinders, calculation of the ends of the cylinders, which would occur at each data point, is unnecessary since the adjacent cylinder becomes active immediately after the IP crosses a data point abscissa value, as Figure 1 shows. However, because the IP is a single point, it is very difficult for a human operator to locate and follow thin cylinders in a 3D environment, especially without a visual representation; experimentation has confirmed this difficulty. A simple solution would be to increase the cylinder radius; however, too large a radius introduces ambiguity about the data value, which is essentially a low resolution condition.

The method used to give a more accurate representation of the data, and one that can easily be found in the VE, is based on the virtual fixtures metaphor (5,6). With virtual fixtures, the cylinders can have a larger radius that makes them easier to find. Once the IP breaches the cylinder surface, force cues are generated to move the IP to the axis (data line), much like the "snap-to-grid" function seen in many computer drawing packages. These forces are a function of the distance from the IP to the cylinder axis. The user is therefore guided along the data line, since the only forces generated are perpendicular to, and directed toward, this line. The weighting function used to accomplish this is a modified version of the function described in (6); it is parameterized by a scaling constant C, by constants k1, k2, l1, and l2 that determine the "feel" of the fixture, and by the distance d from the IP to the data line or axis.

Figure 2: Cross-section of data cylinder force profile

Figure 2 shows a cross-section of a cylinder with a typical weighting function that has been successfully implemented. The left half of the equation prevents hardware instability by avoiding an instantaneous change from no force at d = 0 to a force of C at d > 0. Instability occurred because a human operator cannot keep the IP exactly on the axis. In other words, when the human is taken out of the loop, the system is stable because there are no external forces to move the IP from the zero force line to the relatively large nonzero force region. Multiplying by the left component effectively creates an axis with a small radius where no forces are generated, with a continuous (rather than discrete) transition to the nonzero force region.
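Since the exact weighting function is not reproduced here, the sketch below uses an assumed analytic form that merely mimics the behavior described: zero force on the axis, a continuous ramp away from it, and a magnitude scaled by C. The functional form and all parameter values are placeholders, not those of the actual system:

```python
import numpy as np

def fixture_weight(d, C=1.0, k1=50.0, l1=2.0, k2=0.2, l2=1.0):
    """Illustrative fixture weighting as a function of distance d to the
    axis. The left factor rises smoothly from 0 at d = 0, creating a
    small zero-force radius around the axis; the right factor sets the
    force level farther out."""
    left = 1.0 - np.exp(-k1 * d ** l1)   # continuous ramp away from the axis
    right = np.exp(-k2 * d ** l2)        # decay with distance
    return C * left * right

def fixture_force(ip, p0, axis_dir, **params):
    """Force perpendicular to the cylinder axis, pointing toward it.
    p0 is a point on the axis, axis_dir its direction."""
    a = np.asarray(axis_dir, dtype=float)
    a = a / np.linalg.norm(a)
    r = np.asarray(ip, dtype=float) - np.asarray(p0, dtype=float)
    perp = r - np.dot(r, a) * a          # IP offset normal to the axis
    d = np.linalg.norm(perp)
    if d == 0.0:
        return np.zeros(3)               # on the axis: no force
    return -fixture_weight(d, **params) * perp / d
```

Because the returned force always lies in the plane normal to the axis and points toward it, the user is free to slide along the data line while being pulled back onto it.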

Again, since zero volume lines cannot be found by a zero volume point, the plot grid lines and axes are represented by planes orthogonal to the plot wall. The axis planes are rendered as "stiff" walls so that forces are not generated until the user attempts to move outside the plot area on the plot wall, which eliminates interference with the forces due to the data lines. It is also important that the grid "planes" not interfere with data interpretation. Thus, the grid "planes" actually have a nonzero depth (they are thin rectangular parallelepipeds), which produces viscous damping when the IP is within them. These "planes" also extend to the bounding box. When the IP moves through these regions, a force is generated normal to the grid surface, proportional to the IP velocity and opposing it. This is the common damping equation F = -Bvn, where B is the damping coefficient and vn is the velocity component normal to the grid plane. Experimentation found this method unobtrusive and adequate for representing grid lines. The concept extends easily to a 3D graph as intersecting planes in three orthogonal orientations.
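The damped grid regions can be sketched as follows (the slab parameterization and the damping coefficient are illustrative assumptions):

```python
import numpy as np

def grid_damping_force(ip, vel, n, lo, hi, B=0.005):
    """Viscous damping inside a thin grid slab.

    n: normal of the grid plane; lo, hi: signed-distance bounds of the
    slab along n (its small nonzero depth); B: damping coefficient
    (illustrative value). Implements F = -B * vn, applied only while
    the IP is inside the slab.
    """
    n = np.asarray(n, dtype=float)
    n = n / np.linalg.norm(n)
    s = np.dot(n, ip)                    # signed distance of IP along n
    if not (lo <= s <= hi):              # IP outside the slab: no damping
        return np.zeros(3)
    v_n = np.dot(vel, n) * n             # velocity component normal to the plane
    return -B * v_n
```

Because the force depends on velocity rather than penetration depth, crossing a grid line feels like a brief drag rather than an obstacle, which is what keeps the grid from interfering with the data lines.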

At any point on the plot, the user can press the stylus switch, which activates the speech synthesizer. The TTS is sent the grid coordinates (with respect to the graph origin), which are then spoken to inform the user of the current location on the plot. All relevant information about the plot can therefore be obtained in a non-visual fashion.

Discussion

Simple plots (i.e., those with a small number of data points) have been represented with this system. Non-obtrusive grid and axis representations are as necessary in a haptic rendering as they are in a graphical one, and the same holds for the grid values. The simplest way to convey these values is to take advantage of the sense of hearing: representing them as Braille characters would be difficult since the user can only feel one dot at a given instant with the PHANToM.

Future Work

This system represents a method of haptically rendering data plots; however, experiments to verify that complex information can be comprehended in this fashion have not yet been performed. The limit on how much data can be represented in one plot before ambiguities occur must also be determined. Extending the representation to three dimensional data (e.g., mesh plots) is planned as well. This extension would involve representing surfaces, or giving the IP a third dimension (by placing a sphere around it) in order to feel discrete data points. The latter method can be thought of as a morphological structuring element similar to that used in morphological filtering for image processing.
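The sphere-around-the-IP idea could be sketched as a simple proximity query. This is entirely illustrative, since the extension is left as future work in the text:

```python
import numpy as np

def sphere_contacts(ip, points, radius=1.0):
    """Give the IP a nonzero volume by surrounding it with a sphere
    (a morphological structuring element): a discrete data point is
    'felt' when it lies within `radius` of the IP.

    points: array-like of [x, y, z] data points; returns those in contact.
    """
    pts = np.asarray(points, dtype=float)
    d = np.linalg.norm(pts - np.asarray(ip, dtype=float), axis=1)
    return pts[d <= radius]
```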

Conclusions

The design of a haptic graphing system has several goals. It allows access to data plots by individuals who cannot view them in a graphical manner. Haptic rendering of two dimensional data using the concepts discussed in this paper will lead to improved conceptualization of complex information. The general population can also benefit since research has shown that the addition of another sensory input can enhance comprehension of complex information.

REFERENCES

[1] Brooks, F., Grasping reality through illusion - interactive graphics serving science, Conference on Human Factors in Computing Systems, CHI '88 Proceedings, 1988.

[2] Durlach, N., and A. Mavor, ed., Virtual Reality: Scientific and Technological Challenges, National Academy Press, 1995.

[3] Loomis, J. and S. Lederman, Handbook of Perception and Human Performance: Tactual Perception, John Wiley and Sons, Inc., 1986.

[4] Massie, T. and K. Salisbury, The PHANToM haptic interface: a device for probing virtual objects, ASME Winter Annual Meeting, Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, 1994.

[5] Rosenberg, L., Virtual fixtures: perceptual tools for telerobotic manipulation, IEEE Virtual Reality Annual International Symposium, 1993.

[6] Sayers, C. and R. Paul, Synthetic fixturing, Advances in Robotics, Mechatronics, and Haptic Interfaces, vol. 49, ASME, 1993.

Acknowledgments

This work has been supported by the National Science Foundation, Grant # HRD-9450019, with additional support from the Nemours Foundation Research Program. In addition, the authors would like to acknowledge Thomas Massie for information regarding the PHANToM.

Author address

Jason Fritz or Kenneth Barner
Applied Science and Engineering Laboratories
Alfred I. duPont Institute
1600 Rockland Road
Wilmington, DE 19899
Phone: (302) 651-6830
Email: fritz@asel.udel.edu or barner@asel.udel.edu