
Web Posted on: December 3, 1998


MAKING THE WINDOWS ENVIRONMENT TOUCHABLE - FEEDBACK THAT REALLY MAKES A DIFFERENCE!

Alan Holst
Blazie Engineering
e-mail: Alan@blazie.com

Introduction:

This presentation will focus on tactile access to the Windows environment. We will explore the specific benefits of tactile feedback for people who work in a graphical environment with no visual cues. In other words, this session will be of much more interest to people who are totally blind than to those who have enough vision to detect subtle changes on the screen. We will explore the specific benefits of two forms of tactile feedback: a Braille display and the Virtual Reality Mouse. We will focus on how tactile tools can supplement and enhance the information that is available through speech.

I've heard it said that Windows was designed to be discoverable: when you see it, you will understand it. In other words, your understanding of the Windows environment is incidental; it's a direct result of your experience in it. As I observe sighted people working in Windows, it appears that their learning is serendipitous. They use their mouse to poke around the screen until they discover the thing they are looking for. The question for blind users, then, is how that experience can be made as feedback-rich as possible. How can the blind user have enough tangible experience to learn about Windows by operating it?

One of the goals of Windows was to make computer use easier: to make it the operating system of the common person. A big part of that is making the process less work, so that the user can focus more on what he is trying to achieve. In other words, the product is more important than the process; operating the tool shouldn't get in the way of getting the job done. But I think we can agree that for the majority of blind people, the new tool (Windows) is much harder to use than the old tool (DOS or a portable note taker).

I think we can all agree that blind users are capable of using a keyboard as well as anyone else; we are, however, seriously challenged by the kind of feedback that Windows provides. In fact, one of the lessons I hope attendees carry away from this session is that successful Windows use by the blind is almost entirely a matter of enhancing the feedback loop. If those of us who can't see the screen could discover things as naturally as sighted users do, we'd love Windows too.

Windows is particularly suited to vision. It's about the relationship between things. Vision is a parallel input source: you are able to watch more than one event at a time, which promotes your understanding of relationships, of cause and effect. In other words, when I push this button, that thing happens; therefore, this button causes that effect. Speech and Braille are analogous to serial input: one can see only one thing at a given point in time. When the two modalities are used together, the experience takes place in real time, and it becomes much easier to understand the relationship between two simultaneous events. If the blind user is to understand an effect, the feedback loop must at least provide enough information to make that effect perceptible. In other words, if you hit a button, you need to see its effect in a timely manner if you want to trust and feel comfortable with the system.

The Windows environment demonstrates the axiom "The medium is the message." In other words, context, color, and attributes augment the alphanumeric characters on the screen with additional significant information. Be that as it may, we have to deal with it; trying to stay in the past, in a more command-driven, language-oriented environment, is futile.


Strategies for access to a graphical environment:

Over the last several years I have heard two very different philosophies about Windows access.

One philosophy says that you don't need to know what the screen looks like to work in Windows. In this view, the blind user focuses on memorizing a plethora of options to achieve a given function or effect. This approach is extremely abstract and memory intensive. It's cooking by recipe. A good cook can be creative with or without a written approach to a given problem, but he must either be able to taste the food or have an extraordinary memory of how ingredients taste. It's been my experience that when forced to rely on speech alone, I don't get enough feedback to be an effective improviser. I can achieve a task as long as nothing unusual happens, but if I get off the beaten path I'm in trouble. Since Windows is such an automated environment, I find myself off the beaten path all too often.

In other words, assumptions based entirely on my own work are frequently wrong, because Windows has also made assumptions about what I want. The person who uses speech alone to work on a full screen may be able to get layout information, but he must either specifically seek out that information or ask the program to articulate changes. That, of course, interferes with the user's understanding of the text. It's been my experience that those who lose vision as adults (the ex-sighted) don't struggle with this as much as I do, but in a system where the layout provides context for a sighted user, I feel cheated if I don't have access to that information. I also feel uncomfortably dependent on those who have written the programs or scripts that I must rely on.

Another way of looking at access technology is to say that you need a good cognitive map of the screen to be an effective Windows user. Tactile access tools make the screen more concrete and provide a more direct way to understand where subtle changes in text occur. While I may be able to play the same tune over and over again with speech alone, I find that when I employ tactile systems, I am able to improvise and be much more creative.


Where does speech fit into the Windows access picture?

Speech alone is an incomplete medium when it comes to providing all the information one truly needs to be effective in Windows. Speech is one-dimensional, while the Windows environment is truly three-dimensional. As far as a blind user is concerned, I suggest that colors and attributes provide yet another (fourth) dimension. When the information about context and attributes comes through the same channel as the text, subtle visual changes are difficult to present without distorting the natural flow of the content. I experience the presentation of this "visual information" as just so much noise in speech. But information about attributes or screen position may be very important, and I may need that information to do my job effectively.

My comments aren't intended to suggest that speech is a bad thing; rather, I'm suggesting that tactile information can significantly enhance a blind user's understanding of the Windows environment. The two are not a choice of chicken or beef; rather, they are meat and potatoes.

In a good design, the tactile completes the audio. It fills out the picture and makes the spoken information more complete. Often, I will have my hand on the Braille display as the speech is reading; not because I'm reading the text, but because I want to watch format changes or touch the attribute information as the text is spoken. When used in this way, Braille fills the role of peripheral vision.

When Braille and speech are used together, there are two very different techniques that may be used. One approach is to let the Braille track the speech, perhaps merely to show format and attribute information. As the speech reads along, the user's hand rests on the display so he can see when lines change, how they might be numbered or indented, or whether attributes seem to be assigned. The second method is to separate the speech and the Braille so that you watch one area of the screen while typing in or moving around another area of the screen. This method might be an excellent way to learn about dialog boxes where one field forces a change in another field.

I find that there are five particular areas where tactile feedback is especially helpful.

First, I've found that tactile systems are better for locating subtle changes on the screen, such as where a particular attribute begins or ends. I also reach for the Braille display if I want to know whether a certain icon or item is highlighted.

Second, I find it much easier to know where text will go when I can see the cursor on a Braille display. I know that many do this with speech alone, but when I try working without a Braille display I frequently insert text in the wrong place. One could argue that this is a function of experience, and I would agree. But I still maintain that your students and clients will feel that Windows is more reliable and trustworthy when they can see the cursor on a Braille display.

The third place where I find Braille invaluable is in making sense of the chaos on the screen. It helps me understand boundaries, like where a line begins and ends. It's particularly good for things like watching the left edge of an outline to understand what the current level is.

The fourth area is one that is difficult to explain in a way that really communicates its importance, but Braille, in conjunction with the tactile mouse we will discuss below, lets me see a thing happen in real time. This gives me much more confidence in the system than checking out the effect of a button push after the fact. It's really nice to see an effect immediately when you push a button. It may not truly add to productivity, but it sure does contribute to greater confidence in the system.

Finally, Braille is a way of reality testing. In DOS, it's how I could be sure things were lined up. Now, with proportional spacing in Windows, that's not so clear any more; but Braille still offers a clearer picture than speech alone does, and as a result it gives me more confidence.

There is another issue that should be raised in this vein. With tactile tools, the blind user can get a much clearer picture of what's actually on the screen. This makes it much easier to work with sighted co-workers because you have a means of looking at things together. One anecdote that may be of interest here is that the mouse and Braille display have made it much easier for me to talk with sighted people about what is on the screen because they have a way of pointing to things for me.

When you think about what tools a blind user requires in Windows, it would be natural to argue that Braille skill is needed to make good use of a Braille display. This is of course true, but remember, the main purpose of the Braille isn't reading as one would read a book; rather, it is to provide details that are hard to get at with speech. In fact, another key lesson that I hope you carry away from this session is that a high level of Braille literacy should not necessarily be the most significant factor in determining whether your client or student needs a Braille display.

When you consider the list of benefits of Braille above, you will recognize that the ability to read Braille quickly has nothing to do with the ability to gather the kinds of information listed previously. If you think of the speech as a way to echo or mirror the Braille, then the Braille skill of the user may be a predictor of how often he puts his hands on the Braille display; but when the Braille is ancillary and understood to be available to enhance the user's understanding, you will find that Braille skill is only somewhat correlated with the frequency with which the user touches it.


Menus, screen reading commands, and memory:

One of the most compelling arguments against DOS and for Windows was the demand placed on the user's all-too-human memory. Many said that they couldn't remember all of the commands; they wanted to be able to see their choices. One of the skills that blind people are encouraged to develop as an alternate technique is memory, so we were well suited to working in command-oriented environments like DOS.

In the new world, memory is stretched to a much higher degree. Further, the user needs to remember a host of screen reading commands. I like to think that I have a pretty good memory, but I certainly don't remember all the commands I need. This has two implications as it relates to Braille.

First, any method that can support the memory of commands is helpful. Certainly mnemonics are good, but so is knowing something about how a screen is laid out. Seeing it in Braille will help some users remember the command structure. Interestingly, one of the common techniques taught by memory trainers is association. I've read that some great orators delivered speeches from memory by associating things with rooms of their home or other favorite places. I find that associating a command with its location on the screen helps me remember it when I need it. The physical location helps me categorize the features and functions. In Windows, such categorization can be a real memory aid.

Second, Braille mnemonics are also a big help. Naturally, this depends somewhat on which Braille display you are using, but at least all displays have left/right and up/down movement commands. In the driver that we make available for the Braille Lite, we try to take advantage of all the possible Braille mnemonics: left-handed commands move up and left; right-handed commands move down and right. There is still much to remember, but at least we can take some of the burden off the heavily laden PC keyboard with keys on the Braille display.

There is another factor that any serious Braille user will need to consider when designing a workstation: the issue of equipment layout. The first big issue here is simply the location of the components. Virtually all of the dedicated displays are designed to fit below the PC keyboard. Our product (the 40-cell Braille Lite) has some terrific human factors in terms of display movement by virtue of the Braille keyboard. It also has a significant disadvantage: you can't put a keyboard on top of it. I use a keyboard tray that goes under the desk, so there is plenty of room for the Braille Lite on top of the desk and it's easily within reach. I urge you to think carefully about how equipment will be laid out when planning a workstation.

While we're on the subject of human factors, there is another product which is extremely intriguing: the Virtual Reality Mouse from Control Advancements. I'd encourage anyone who is serious about tactile access to Windows to at least take a look at this technology. I believe it is particularly suited to work with a Braille display, and perhaps it is better with shorter displays than with longer ones.

It has the following advantages.

First of all, I use it as a Braille navigation tool. This is particularly handy with shorter Braille displays. The mouse allows me to make fine adjustments in the positioning of the Braille display. It also lets me scan through a screen relatively quickly to locate a particular place or item. And it significantly reduces the number of commands I need to memorize: since I'm using an actual mouse, I don't need a large number of commands to emulate mouse actions. This may seem like a small matter, but when you are in a hurry and can't remember exactly what command is needed to do a particular thing, say drag and drop an item, it's a real plus.

Another thing that has been extremely revealing is that the mouse lets me go places I have had difficulty getting the JAWS mouse to go. Early in my Windows experience, sighted people would frequently make reference to things on the screen which I could never get to with the JAWS cursor. Again, this isn't meant as a criticism of JAWS; its designers had to build a system that would work well for a speech user and control the chaos on the screen. But the Virtual Reality Mouse has made it possible for me to use Windows more as a sighted person might, as I am able to move the mouse into places where the JAWS cursor didn't want to go.

Another big benefit is that I can use it to learn about the actual location of things on the screen. Is that important? Technically, probably not; but after going over a screen with the VR mouse, I find that my mental map of where I am is much more accurate and therefore much more useful. The other interesting thing is that there is no such thing as proportional spacing in Braille. With the mouse, it is now possible to determine not only where things fall on the screen, but also approximately what size they are. It's not an absolute picture, but again, it helps me relate to what a sighted user sees.

One of the best examples of this is the scroll bars. I was amazed to find that the left/right scroll bars are actually a visual representation of the amount of information to the left or right. Sure, someone can describe it; but what's the saying? "A picture is worth a thousand words." The VR mouse can provide that picture. I will not argue that using mouse commands is better than using keyboard commands, but it sure is nice to read a book or documentation about Windows and be able to execute the mouse command without having to scramble for a keyboard alternative which may or may not exist.

These are just a few of the benefits of tactile feedback systems. Hopefully, this discussion has provided some insight into the benefits of using tactile tools to supplement the information that is provided by speech systems in Windows.