Integrating Hypermedia and Assistive Technology: An Overview of Possibilities
One of the most useful microcomputer technologies available to teachers and caregivers of individuals with disabilities is the hypermedia program. Hypermedia programs allow individuals who have never learned a programming language to create computer software. With a minimum of training, teachers and caregivers can use hypermedia programs to create highly individualized software, such as Computer-Aided Instruction (CAI) that teaches the specific objectives needed in their classrooms. Hypermedia programs can also be used with assistive technologies to compensate for some disabilities. This paper will focus on the possible interaction between hypermedia and assistive input devices. Speech synthesis and hypermedia will also be explored.
Because of the uniqueness of individuals with disabilities, obtaining computer programs that help these individuals is difficult, expensive, or impossible. Some situations require software that may not already exist, or that must work with input methods individuals with disabilities can take advantage of. Assistive input devices already exist and are easily interfaced with computers. Hypermedia programs can put the power of computer programming in the hands of those who cannot program, and more importantly, those who need to tailor software for specific individuals with specific needs. Hypermedia files (stacks) can be used to communicate, instruct, or automate other computer tasks to improve access or productivity (Perkins, 1993, 1991). All that is needed is imagination and some computer skills.
The stacks created during this project grew out of the needs of classroom teachers and caregivers, based on their experiences with individuals with disabilities. _HyperCard_, a hypermedia program for the Macintosh, was used along with the _TouchWindow_ from Edmark, _Ke:nx_ from Don Johnston, the _Unicorn Board_ from Unicorn Engineering, switches, and _Vocalize XCMD v1.0.2_ (see Appendix). The stacks dealt with communication, cause and effect, eye-hand coordination, finger tracing, scanning, and program launching.
Review of Literature
Hypermedia programs have gained in popularity since the original release of _HyperCard_ with the Macintosh computer. _HyperCard_ was originally created to be a database program, but has been adopted as a program able to create computer software. Software created with hypermedia programs has been used for tutorials for other software applications, for interactive videodisc lessons, and even as database managers. There are currently many different hypermedia programs available; however, each works only on a specific brand of computer (Erickson & Vonk, 1994; Lockard, Abrams & Many, 1994). _HyperCard_, _SuperCard_ and _HyperStudio_ were created for the Macintosh, _LinkWay_ and _Multimedia ToolBook_ were created for IBM compatibles, _HyperScreen_ and _Tutor Tech_ were created for the Apple II series (IIe, IIc, IIgs), and _HyperStudio_ is also available for the Apple IIgs. Each of these programs has its own unique features, but the basic concept of stacks with cards is generally the same for all of them.
Files in hypermedia programs are called stacks or books. Each new computer screen is a card or page that has three primary features: text, graphics, and buttons. Text and graphics provide instruction and information, and areas of the screen sensitive to mouse presses, called buttons, provide computer response to user input. For the user, pressing a button within a stack can make things happen. The stack may branch to another card and present new text or graphics, it might react to a choice made by the user, or other events may take place, such as playing digitized or synthesized sound or showing videodisc sequences (Van Horn, 1991).
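As a minimal sketch, a button script in _HyperCard_'s scripting language, HyperTalk, might look like the following (the sound and card names here are hypothetical):

```hypertalk
on mouseUp
  play "harpsichord"        -- replay a sound resource stored in the stack
  go to card "Next Lesson"  -- branch to another card in the stack
end mouseUp
```

When the user presses this button, the stack plays the named sound and then displays the named card, producing the kind of branching described above.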
Teachers, caregivers and parents, even if they do not know how to program a computer, can use hypermedia programs as authoring languages (Male, 1994; Weibe, 1993). Authoring languages are intended to give nonprogrammers the capability to create computer programs without learning cumbersome computer languages (Erickson & Vonk, 1994; Weibe, 1993). By using hypermedia programs, many teachers and parents can create much of the software needed to teach specific objectives or accomplish tasks when that software does not already exist (Esposito & Campbell, 1993).
The amount of software available to individuals with disabilities who have specific needs is very limited (Taber-Brown, 1993). Software created using hypermedia can help to fill this gap. Uses for hypermedia by the disabled vary with the individual. Some of the ways hypermedia programs may be used are: (a) to create Computer-Aided Instruction (CAI), (b) to serve as a communication device, and (c) to act as a menu to launch other applications (Perkins, 1991). Stacks created with hypermedia programs may be operated by students with cognitive disabilities, communication disorders, or physical disabilities, and by students who are unable to read (Perkins, 1993, 1991).
The primary input devices for most hypermedia programs are the keyboard and the mouse. The keyboard requires both discrimination to find keys and fine motor control to press them. The mouse provides input as it is moved across a table or desk, which moves an arrow or other pointer on the screen; when the pointer reaches the desired area of the computer screen, the mouse button is pressed. Many individuals with disabilities are unable to use a mouse because of the fine motor control necessary (Lahm & Greszco, 1988). Those using a mouse must also be able to associate moving the mouse on the desk with the arrow moving on the computer screen (Alessi & Trollip, 1991; Lahm & Greszco).
_HyperCard_ was chosen for this project because, at the time the project was started, it was the most popular hypermedia program for the Macintosh (Lockard, et al., 1994; Van Horn, 1991). In past years, the authoring version of the program came packed with every Macintosh computer. In recent years, however, only the version called _HyperCard Player_ comes with new Macintosh computers. _HyperCard Player_ allows stacks already created with _HyperCard_ to operate, but does not allow any authoring changes to stacks; the authoring version must be purchased separately. For those who are considering using hypermedia today, other hypermedia programs for the Macintosh do exist (_HyperStudio_ and _SuperCard_) that may have advantages in a given situation.
Unfortunately, a minimal amount of programming is sometimes necessary to make hypermedia programs do things they were not designed to do. Most hypermedia programs can be programmed in their own language (_HyperCard_ allows "scripting" in its language, HyperTalk) (Lockard, et al., 1994; Van Horn, 1991). Scripting also makes other computer programs accessible to _HyperCard_ in the form of External Commands (XCMDs) and External Functions (XFCNs). These are usually programs that others have written that can be accessed from _HyperCard_ to add to its capabilities. Because _HyperCard_ has been around the longest, there is an excellent collection of XCMDs and XFCNs available. Some were used in the following stacks.
_Alternative Input Devices_
There are two groups of alternative input devices for computers: those that require special programs that recognize the device (e.g. the _TouchWindow_ and _PowerPad_ on older Apple II computers), and those in which the device is completely transparent to the program that it is operating. Devices such as the older _TouchWindow_ only work with certain programs. Transparent devices, however, work with almost all programs (Lahm & Greszco, 1988), but some transparent devices require an additional (usually expensive) interface device to operate (Church & Bender, 1989).
Assistive Technology and _HyperCard_
For the purposes of this paper, input devices designed for the Macintosh will be described along with stacks created by _HyperCard_, but they could be adapted to other computers and other hypermedia programs. All adaptations here are either system software or a combination of system software and using the Macintosh's Apple Desktop Bus (ADB), which is the normal keyboard port. ADB devices are transparent, meaning they work with any program.
The less an individual must deviate from the normal input devices (the mouse and keyboard), the better off that individual will be. Apple supplies with its Macintosh computers a system software program called Easy Access. This program is installed as a Control Panel program (in the System Folder for System 6.XX and in the Control Panels folder for System 7.XX). Easy Access has three features: Mouse Keys, Slow Keys and Sticky Keys. Mouse Keys turns the numeric keypad into a mouse controller. The keys control mouse movement in the following fashion: 8-up, 2-down, 4-left, 6-right, 7, 9, 1, 3-diagonal, and 5-mouse click. The initial delay (the time before numeric keys are recognized) and the maximum speed of the mouse can be controlled. Slow Keys can be used to control some keyboard input.
Acceptance Delay controls how long a key must be pressed before it is recognized. Sticky Keys are used when an individual must press more than one key at a time (Shift-letter for a capital, or Command-P to print) but may not physically be able to do so. With Sticky Keys on, pressing the first key (the Shift, Command, Option or Control key) makes the computer keep it pressed while the next key is pressed, producing the necessary input. For example, Sticky Keys would allow one finger to type a capital letter. Depending on the System being used, the program is on the Disk Utilities 2 (System 6.XX) or Tidbits (System 7.XX) disk that came with the computer.
Because some individuals cannot make the abstract connection between the mouse movement on the table and arrow movement on the computer screen, these individuals can benefit from an input device called the _TouchWindow_. The _TouchWindow_ is a clear plastic membrane that is fastened over the computer monitor and, on a Macintosh, plugged into the computer's keyboard port (ADB). Users view the monitor through the _TouchWindow_ and press the _TouchWindow_ where buttons appear on the monitor. The result is a relationship between the computer screen and events within the program when the _TouchWindow_ is pressed (Alessi & Trollip, 1991; Male, 1988). The _TouchWindow_ can also be used by individuals with poor motor control (Esposito & Campbell, 1993).
1. Cause and Effect--One of the first steps in learning to use a computer is grasping the concept of cause and effect: if the user does something to an input device, then something will happen. Stacks can be created that are a series of cards with transparent buttons covering the entire card. When the button is pressed, the stack produces a sound and then moves to the next card. The author has created two stacks that are used to practice cause and effect. The first was a stack with scanned images of animals and a digitized recording of someone saying each animal's name. When the _TouchWindow_ was pressed, a new card would appear with an animal and a statement of what the animal was ("polar bear"). The second stack was a series of digitized images of the children in the classroom. When the _TouchWindow_ was pressed, an image of a child (Lisa) would appear and the sound "Hello Lisa" would be heard. Both sound and images should be used to provide as much reinforcement as possible. For these stacks, the _TouchWindow_ should be set for Touch in the _TouchWindow_'s Control Panel.
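A cause-and-effect card of this sort needs only one script. The sketch below assumes a transparent button sized to cover the whole card; the sound name is hypothetical and stands in for a digitized recording stored in the stack:

```hypertalk
on mouseUp
  play "Hello Lisa"               -- play the digitized greeting
  wait until the sound is "done"  -- let the recording finish playing
  go to next card                 -- then show the next picture
end mouseUp
```

Because the button is transparent and covers the entire card, a press anywhere on the _TouchWindow_ triggers the same response.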
2. Tracing--The _TouchWindow_ allows children to trace images on the screen (monitor) using their fingers. For this stack, a series of trace-by-number pictures from a coloring book was used. The images were scanned, and each was put in a separate _HyperCard_ background so that it would not be affected by the student's tracing. By setting the access level for _HyperCard_ to Painting (3) and choosing the Brush tool (all from a script), the finger movements of the student draw on the computer screen. The student's task is to trace the picture on the screen. When students finish and go to another picture, the card is erased, which removes their drawing; because the original picture was in the background, it is unaffected by the erasing. For this type of activity, the _TouchWindow_ must be set to Standard in the Control Panel.
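The card scripts for such a tracing stack might be sketched as follows. This is a sketch only; the exact menu commands available depend on the _HyperCard_ version in use:

```hypertalk
on openCard
  set the userLevel to 3  -- Painting level: paint tools become available
  choose brush tool       -- finger presses on the TouchWindow now paint
end openCard

on closeCard
  choose select tool
  doMenu "Select All"     -- select the card-layer painting only
  doMenu "Clear"          -- erase the tracing; the background art survives
  choose browse tool      -- restore the normal browsing hand
end closeCard
```

Because the scanned picture lives in the background layer, only the student's card-layer drawing is selected and cleared.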
Ke:nx is similar to the Adaptive Firmware Card (AFC) for the Apple II computers and PC-Aid for IBM compatibles. Ke:nx accepts input from any switch, a mini keyboard, a Unicorn keyboard, and many other ASCII devices and alternative keyboards. Since Ke:nx is transparent to programs, no special instructions need to be given directly to hypermedia programs to recognize these alternative input devices. However, Ke:nx does need to be set up to translate alternative input into input a hypermedia stack can use. For the following examples, mouse input is what needed to be emulated.
Mouse emulation is one of the most difficult types of input to accomplish with alternative input devices. Because of the complication of using a switch as a substitute for a mouse, the AFC manual states, "Avoid mouse emulation...Look for keyboard equivalents" (Adaptive Firmware Card Operator's Manual and Application Guide, 1988, p. 18-3). Standard keyboard equivalents do not exist for mouse clicking, however. Mouse emulation can be accomplished by a number of different devices connected to Ke:nx. With the Ke:nx software, input devices such as the Unicorn keyboard and the TASH Mini keyboard can be reprogrammed to emulate a mouse and its actions.
Users with limited muscle control may not be able to move or click a mouse because of their physical limitations. They may also not have the physical ability to reach the screen to operate a _TouchWindow_. These individuals may require specific switches that take advantage of a specific muscle or muscle group to provide input to the computer. These switches use Ke:nx to translate their input for the computer.
Physically disabled individuals must use the capability that is least restrictive in nature but provides the easiest access and speed for computer input. The simplest method of input that deviates the least from the norm will allow the fastest and most accurate control of the computer (Lahm & Greszco, 1988). If the mouse or the _TouchWindow_ can be manipulated, they would be the best choice for an input device. However, for some individuals, specialized switches are the only means of computer input.
Of the devices connected to Ke:nx, one of the hardest to program to work as a mouse is the switch. Normal use of switch input requires one switch press to let Ke:nx know input is coming and to show the possible choices, and further presses to have Ke:nx step through those choices. This process can require up to five switch presses just to choose a mouse click. Because Ke:nx can (a) be programmed and (b) understand Morse code, a Ke:nx Setup (program) was created for the following stacks that had Ke:nx interpret either a short (dit) or long (dah) switch press as a mouse click. In either circumstance, only one switch press is necessary to produce a mouse click.
Cause and Effect--The same stacks that were used above for cause and effect training were used here; the only difference was the input device. Whereas above it was the _TouchWindow_, here it was a switch.
Eventually, more than one choice (button) is going to appear on a screen (card) at a time. Branching due to buttons is one of the great benefits of hypermedia: stacks do not have to be linear. For users without the ability to move and click a mouse, or to touch different parts of the computer monitor using the _TouchWindow_, an alternative is necessary. Scanning is traditionally the way switch users provide input to a computer. It is a process in which the possible choices available to a user are shown on the computer screen and a cursor moves through the choices. When the preferred choice is highlighted, it is selected by a switch press (Green & Brightman, 1990). Users do not have to move a mouse or type in their responses. The computer automatically presents the choices and waits for the user to make a selection.
Hypermedia can be used to create stacks that provide automatic scanning through the choices. Scanning is no more than a menu that cycles through the possible choices, indicating one choice at a time. In hypermedia, this menu can be made up of a series of buttons, with one of the buttons highlighted to make it different from the others. Each button in the cycle has an associated action. If there is no activity within a specified amount of time, the highlighting moves to the next button. The user sees the highlighting move from one choice in the menu to the next.
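One way to sketch such a scanning menu in HyperTalk is with the card's idle handler, which _HyperCard_ calls repeatedly while nothing else is happening. The delay value and the use of card buttons here are illustrative assumptions, not the actual script used in the project:

```hypertalk
on openCard
  global scanButton, lastMove
  put 1 into scanButton                    -- start with the first choice
  put the ticks into lastMove
  set the hilite of card button scanButton to true
end openCard

on idle
  global scanButton, lastMove
  if the ticks - lastMove > 60 then        -- roughly one second per choice
    set the hilite of card button scanButton to false
    add 1 to scanButton                    -- advance to the next choice,
    if scanButton > the number of card buttons then put 1 into scanButton
    set the hilite of card button scanButton to true
    put the ticks into lastMove
  end if
end idle
```

A switch press, translated into a mouse click by Ke:nx, would then trigger the highlighted button's own mouseUp handler (or a card-level handler that sends mouseUp to the highlighted button).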
Program Launcher--The stack created using scanning was a Program Launcher. The menu offered consisted of four programs that a user may need on startup of the computer: ClarisWorks (for word processing), CoWriter (for word prediction), Zterm (for telecommunications), and SuperPaint (for graphics). Using an adaptation of the scanning script from a stack created by Bill Lynn (see Appendix), each of these choices was offered. When the appropriate choice was highlighted, a switch was pressed, and a script launched the application (e.g., ClarisWorks). Upon quitting the application, the user was brought back to the Program Launcher. By placing this stack in the Startup Items folder in the Macintosh's System Folder, the stack runs automatically as soon as the computer is turned on.
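The launching itself can be done with HyperTalk's open command. A hypothetical script for the ClarisWorks choice follows; the application name (or its full pathname) must match the copy installed on the machine:

```hypertalk
on mouseUp
  open "ClarisWorks"  -- launch the application; control returns to
                      -- this stack when the user quits ClarisWorks
end mouseUp
```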
The author has been informed that the Unicorn keyboard is no longer available and has been replaced by IntelliKeys. Although the author has no direct experience with this device, a couple of observations about it can be made. The first is that the device does not require an interface device such as Ke:nx to connect to the Macintosh (in fact, the same IntelliKeys can be unplugged from a Macintosh and plugged into an IBM or Apple II with the right adapters). The second, from observing others using IntelliKeys at conference presentations, is how easy the device was to reprogram and use. This device has many capabilities that could be used with hypermedia and should be explored by potential consumers.
Speech within stacks can be very useful in two situations: to speak for someone or to read to someone. Stacks can be created to provide the capability to speak for those who do not have that capability. Specialized speech synthesis devices already exist to speak for those who are unable to, but the cost of these devices is high and the devices need extensive programming (Carey & Sale, 1994) by teachers or parents. One advantage of using a computer is that it is multipurpose: not just a speech synthesizer, it can also be a communicator, word processor, and note taker, and, when necessary, provide recreation too.
Speech can also be used in situations where visually impaired individuals, or individuals who cannot read, will be using the hypermedia stacks. The stack could be programmed to read all the information that appears on a card. This would include text fields as well as all button choices. Cues or instructions on how individuals should respond to cards could also be provided.
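A card that reads itself aloud might be sketched as below. This assumes the _Vocalize XCMD_ (described later) is installed in the stack and is called simply with the text to speak; the XCMD's own instructions give its exact syntax:

```hypertalk
on openCard
  repeat with i = 1 to the number of card fields
    Vocalize card field i                     -- read each text field aloud
  end repeat
  repeat with i = 1 to the number of card buttons
    Vocalize the short name of card button i  -- announce each button choice
  end repeat
end openCard
```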
Speech can come in two different forms: digitized (actual voices recorded) and synthesized (the computer attempting to reconstruct vocal sounds phonetically). The Macintosh computer comes with built-in speakers and the capability of producing sounds. _HyperCard_ comes with the capability to digitize (record) sound built in. By using the Audio stack that comes with _HyperCard_, actual recordings can be made. This is effective for a limited number of recordings, because digitized sound requires a great deal of Random Access Memory (RAM) for processing as well as disk space for storage (Church & Bender, 1989). It also requires someone to provide the voice for recording.
Synthesized speech uses much less memory (both RAM and disk space); however, it does have a disadvantage: the voice usually sounds "robotic" or computerized (Church & Bender, 1989). Because speech is constructed phonetically from text that is typed into the computer (Male, 1994), and because English is not 100% phonetic, the reproduced speech is sometimes not accurate.
The Macintosh also comes with the capability to synthesize speech using system software called MacinTalk. Newer versions of this software, called MacinTalk 2 and MacinTalk Pro, are now available (see Appendix). MacinTalk Pro uses a variety of high-quality male and female voices but needs more RAM. MacinTalk Pro will not work on a Macintosh with only 4 megabytes of RAM, but MacinTalk 2 will. MacinTalk 2 also comes with alternative voices, allowing for female- and male-sounding voices.
_HyperCard_ needs an XCMD to be able to access the speech synthesizer. One was found as freeware on the INTERNET, called _Vocalize XCMD_ v1.0.2 (see Appendix), that uses either MacinTalk 2 or MacinTalk Pro. _Vocalize XCMD_ v1.0.2 can be adapted to read typed-in input or to produce scripted speech at the press of a button. It comes with instructions on how it can be used within a stack, along with information about MacinTalk 2 and MacinTalk Pro and where they can be obtained.
Talker--This stack was prepared to speak using synthesized speech in one of two ways: speaking phrases that have already been stored (using buttons) or speaking what is typed into a text field. Many phrases are used repeatedly by individuals ("I'm hungry"), whereas at other times conversation involves spontaneous speech. This stack can deal with both. When this stack is installed on a laptop computer, such as a Macintosh PowerBook, an individual can have the capability of speaking almost anywhere.
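The two modes of the Talker stack reduce to two small button scripts. Both assume, as above, that Vocalize takes the text to speak as its argument, and that the typing field is named "Message" (a hypothetical name):

```hypertalk
-- a phrase button: speaks its stored phrase
on mouseUp
  Vocalize "I'm hungry"
end mouseUp

-- the Speak button: reads whatever has been typed into the field
on mouseUp
  Vocalize card field "Message"
end mouseUp
```

Any number of phrase buttons can be added, each carrying one commonly used phrase.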
Teachers of individuals with disabilities are faced with a lack of software appropriate to the disabilities they encounter. Because there are fewer individuals with these specific needs, software required to teach specific objectives or provide accommodations may be hard to find or expensive. Hypermedia programs give educators, caregivers and parents the ability to author their own software to accomplish these tasks (Church & Bender, 1989; Perkins, 1993, 1991; Van Horn, 1991) for use in the classroom, at home, or out in public.
Hypermedia, because of its flexibility, can be one of the best tools for teachers, parents, and caregivers to use in aiding individuals with disabilities. With imagination and some programming skill, though nowhere near as much as programming languages require, software can be created to suit almost any need or educational objective.
There are many uses of hypermedia that have not been explored here, such as its use as a presentation tool or database manager, that may also warrant exploring. The beauty of hypermedia is that it can be almost anything and is limited only by imagination. The skills and equipment necessary to use hypermedia can already be obtained. As an example, at a conference presentation, the author was asked, "How would someone who was blind use this type of program, since it is graphically oriented?" In that session, we brainstormed a solution: by placing a clear plastic sheet over the _TouchWindow_ and marking standard locations for buttons with something like yarn glued to the plastic, individuals with vision disabilities could feel where the buttons are. The _TouchWindow_ does not even have to be on the computer monitor; it could be placed on a desk or table.
Not everything will work the first time; some trial and error may be part of the developmental process. Any time software and hardware are combined in ways that have not been tried previously, some problems occur. For example, the _TouchWindow_ must be set up differently depending on the student's intended response. It may be just a matter of finding the correct adjustment. The _TouchWindow_ also needs to be calibrated for the size of monitor being used.
There are also many places to go for help in learning about hypermedia programs. Hypermedia classes are being taught at schools as inservice training, continuing education courses, college graduate courses and at technology conferences. If access to telecommunications is available, INTERNET and the commercial services such as America Online provide free or inexpensive stacks. These services also provide discussion areas where questions and problems can be addressed.
By adding a little imagination, computer skill, and the necessary assistive technology, working with individuals with disabilities can produce very satisfying results. Hypermedia is the main ingredient because of its flexibility. The need for the stacks already exists.
The author has made available the _HyperCard_ stacks mentioned in this article. They can be obtained by sending a high-density disk and a self-addressed, stamped envelope.
Button Scanning v1.2, (1992) Bill Lynn, Simtech Publications,
134 East St., Litchfield, CT 06759
Vocalize XCMD v1.0.2, (1994), Alex Metcalf,
mrcnext.cso.uiuc.edu (18.104.22.168), in the directory:
MacinTalk 2 is available at:
in the directory:
_Adaptive Firmware Card operator's manual and application guide_.
(1988). Wauconda, IL: Don Johnston Developmental Equipment, Inc.
Alessi, S. M. & Trollip, S. R. (1991). _Computer-based instruction:
Methods and development_. Englewood Cliffs, NJ: Prentice Hall.
Carey, D. & Sale, P. (1994). Notebook computers increase
communication. _Teaching Exceptional Children, 27_(1), 62-69.
Church, G. & Bender, M. (1989). _Teaching with computers: A
curriculum for special educators_. Boston: A College-Hill Publication.
Erickson, F. J. & Vonk, J. A. (1994). _Computer essentials in
education_. New York, NY: McGraw-Hill.
Esposito, L. & Campbell, P. H. (1993). Computers and individuals
with severe and physical disabilities. In Lindsey, J. D. (Ed.),
_Computers and Exceptional Individuals_. (pp. 159-171). Austin, TX: Pro-Ed.
Green, P. & Brightman, A. J. (1990). _Independence day: Designing
computer solutions for individuals with disability_. Allen, TX: DLM.
Lahm, E. A., & Greszco, K. (1988). Therapeutic applications and
adaptive devices. In Behrmann, M. M. (Ed.), _Integrating computers
into the curriculum: A handbook for special educators_ (pp. 29-58). Boston: Little, Brown and Co.
Lockard, J., Abrams, P. D., & Many, W. A. (1994). _Microcomputers
for twenty-first century educators_ (3rd. ed). New York, NY:
HarperCollins Publishers, Inc.
Male, M. (1988). _Special magic: Computers, Classroom strategies,
and exceptional users_. Mountain View, CA: Mayfield Publishing Company.
Male, M. (1994). _Technology for inclusion:Meeting the needs of all
students_ (2nd. ed.). Boston, MA: Allyn and Bacon.
Perkins, R. (1993). Integrating alternative input devices and
hypermedia for use by exceptional individuals (in press).
_Computers in the Schools, 10_(1-4).
Perkins, R. (1991). Using HyperStudio to create lessons that use
alternative input devices. In Carey, D., Carey, R., Willis, D. A.,
& Willis, J. (Eds.). _Technology and Teacher Education Annual 1991:
Proceedings of the Annual Conference of the Society for Technology
and Teacher Education_ (pp. 80-83). (ERIC Document Reproduction Service No. ED 343 562).
Taber-Brown, F. M. (1993). Software evaluation and development. In
Lindsey, J. D. (Ed.), _Computers and Exceptional Individuals_ (pp.
65-82). Austin, TX: Pro-Ed.
Van Horn, R. (1991). _Advanced technology in education_. Pacific
Grove, CA: Brooks/Cole Publishing Company.
Weibe, J. H. (1993). The software domain. In Lindsey, J. D. (Ed.),
_Computers and Exceptional Individuals_ (pp. 45-64). Austin, TX: Pro-Ed.