Optacon
The Optacon (OPtical to TActile CONverter)[1] is an electromechanical device that enables blind people to read printed material that has not been transcribed into Braille. The device consists of two parts: a scanner that the user runs over the material to be read, and a finger pad that translates the words into vibrations felt on the fingertips. The Optacon was conceived by John Linvill, a professor of Electrical Engineering at Stanford University, and developed with researchers at Stanford Research Institute (now SRI International). Telesensory Systems, Inc. (TSI) manufactured the device from 1971 until it was discontinued in 1996, and the company shut down abruptly in 2005, after which customers could no longer buy new machines or have existing ones repaired. Although effective once mastered, the Optacon was expensive and required many hours of training to reach competency. Other companies worked on successors, but no device with the Optacon's versatility had been developed as of 2007, and many blind people continue to use their Optacons. The Optacon offers capabilities that no other device provides, including the ability to perceive a printed page or computer screen as it truly appears, with its drawings, typefaces, and specialized text layouts.
Description
The Optacon consists of a main electronics unit about the size of a portable tape recorder connected by a thin cable to a camera module about the size of a penknife (See Fig. 1).
The main electronics unit contains a "tactile array" onto which the blind person places an index finger. The user moves the camera module across a line of print, and an image of an area about the size of a letter space is transmitted via the connecting cable to the main electronics unit. The tactile array contains a 24-by-6 matrix of tiny metal rods, each of which can be vibrated independently by a piezoelectric reed connected to it. The rods corresponding to the dark parts of the image vibrate, forming a tactile image of the letter being viewed by the camera module. As the user moves the camera module along the print line, tactile images of the printed letters are felt moving across the array of rods under the user's finger. The Optacon includes a knob to adjust the intensity at which the rods vibrate, a knob to set the light/dark threshold that determines which rods vibrate, and a switch that selects whether the image is interpreted as dark print on a light background or as light print on a dark background.
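The signal path described above, thresholding a small camera image and driving the rods that correspond to dark pixels, can be illustrated with a short sketch. The 24-by-6 geometry, the adjustable threshold, and the print-polarity switch come from the description; the function and variable names are hypothetical and not drawn from any Optacon firmware.

```python
# Illustrative sketch (not Optacon firmware) of mapping a thresholded 24x6 camera
# image onto the tactile array: True means the corresponding rod vibrates.
ROWS, COLS = 24, 6  # tactile array: 24 rows by 6 columns of rods

def tactile_pattern(image, threshold=128, invert=False):
    """Map a 24x6 grayscale frame (0 = black ink, 255 = white paper) to a
    boolean matrix of vibrating rods; `invert` handles light-on-dark print."""
    pattern = []
    for image_row in image:
        pattern.append([(pixel < threshold) != invert for pixel in image_row])
    return pattern

# Example: a white frame with one dark pixel activates exactly one rod.
frame = [[255] * COLS for _ in range(ROWS)]
frame[0][0] = 0
assert tactile_pattern(frame)[0][0] and not tactile_pattern(frame)[0][1]
```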
Lyle Thume, an Optacon user and director of blind rehabilitation services at the Rehabilitation Institute in Detroit, summed up the Optacon this way in 1973: "It opens up a whole new world to blind people. They aren't restricted anymore to reading material set in braille."[2]
History
The Optacon was developed by John Linvill, a professor of Electrical Engineering at Stanford University who later became head of the Electrical Engineering Department, together with researchers at Stanford Research Institute (now SRI International). Linvill was one of Telesensory's founders and chairman of the Telesensory board. The initial stimulus for the development of the Optacon was Linvill's daughter, Candy (born 1952, blind since the age of 3). Using the Optacon, Candy graduated from Stanford and received a PhD. She has since worked as a clinical psychologist, so, like her father, she is often referred to in the press as "Dr. Linvill".
In 1962, during a sabbatical year in Switzerland, Linvill visited an IBM laboratory in Germany, where he observed a high-speed printer that used a set of small pins, like hammers, to print letters onto strips of paper. As he later recalled: "If you could feel the hammers with your fingertip, you could surely recognize the image. So on our return to Zurich, I told my wife and son and daughter, Candy, who was blind: 'Guys, I've got the most magnificent idea. We'll make something that will let Candy read ordinary printed material.'" Although his family laughed at the notion ("Oh, that'll never work!"), the idea for the Optacon was born.
Upon returning to Stanford, Linvill, together with graduate students G.J. Alonzo and John Hill, developed the concept further with the support of the Office of Naval Research. A key aspect of Linvill's concept was to use vibrating piezoelectric reeds, called bimorphs, to move the pins in a two-dimensional array to produce tactile images.[3] The idea of using vibrating bimorphs was critical for several reasons:
- The high power efficiency of the piezoelectric bimorphs made a battery-powered reading machine possible.
- The small size and weight of the bimorphs was also essential for portability.
- Later psychophysical experiments found that vibration near the resonant frequency of conveniently sized bimorphs was optimal for the sense of touch.
In 1964 Linvill applied for a patent, and U.S. Patent 3,229,387 was granted in January 1966.
Early history
As early as 1913, a reading machine for the blind called the optophone was built by Edmund Edward Fournier d'Albe in England. It used selenium photosensors to detect black print and convert it into an audible output that a blind person could interpret. A small number were built, but reading with them was exceedingly slow for most people. The concept of a tactile optical scanning device can be traced back to 1915, as mentioned (and dismissed) in Fournier d'Albe's 1924 book, The Moon-Element.[4] The device was described as using iron pins stimulated by electromagnets to convey dark and light areas in a tactile manner, but questions hung over its feasibility, and indeed over whether it existed at all at that time.
In 1943, Vannevar Bush and Caryl Haskins of the wartime Office of Scientific Research and Development directed resources toward the development of technologies to assist wounded veterans. The Battelle Institute was funded to develop an improved optophone, and Haskins Laboratories was funded to conduct research toward a synthetic-speech reading machine. This group turned "sour" on the optophone approach after concluding that reading would be too slow.
In 1957 the U.S. Veterans Administration's Prosthetic and Sensory Aids Service (PSAS), under Dr. Eugene Murphy, began funding the development of a reading machine for the blind. The principal investigator on this project was Hans Mauch, a German scientist brought to the U.S. after World War II. (During World War II, Mauch had worked for the German Air Ministry as part of the V-1 missile development team.)
Mauch worked on reading machines having an "optophone-like" output, a "speech-like" sound output, and a synthetic speech output. The only one of these that was competitive with the Optacon development was the Stereotoner, basically an improved optophone. The Stereotoner design concept was that the user would move a vertical array of photosensors across a line of text. Each photosensor would send its signal to an audio oscillator set to a different frequency, with the top photosensor driving the highest frequency and the bottom photosensor driving the lowest. The user would then hear tones and chords from which the letters could be identified.
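As a rough illustration of this "direct translation" scheme, the sketch below maps one scanned column of photosensor readings to the chord of tones a Stereotoner-like device would produce, with the top cell assigned the highest pitch. The cell count and the specific frequencies are assumptions for illustration only, not documented Stereotoner parameters.

```python
# Hypothetical sketch of an optophone/Stereotoner-style mapping: each photosensor
# in the vertical column drives its own oscillator, top cell = highest pitch.
NUM_CELLS = 12  # assumed column height; not a documented Stereotoner figure
FREQS = [4000 / (2 ** (i / 4)) for i in range(NUM_CELLS)]  # descending pitches (Hz)

def column_to_chord(column_is_dark):
    """Return the oscillator frequencies sounding for one scanned column."""
    return [f for f, dark in zip(FREQS, column_is_dark) if dark]

# A column that is dark only at its top and bottom produces a two-tone chord.
print(column_to_chord([True] + [False] * (NUM_CELLS - 2) + [True]))
```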
Initially, Linvill was unaware that the Optacon was not the only reading machine for blind people under development. In 1961, however, James Bliss had returned to SRI from MIT, where he had done a doctoral dissertation in a group working on applying technology to the problems of blindness. Bliss was interested in basic research on the tactile sense, to better understand how it could be used to substitute for the loss of vision. While at MIT, Bliss had become aware of the existing research and development on reading machines for the blind, as well as of the researchers and funding agencies involved. At SRI, Bliss had obtained funding for his tactile research from the Department of Defense and NASA, which were interested in tactile displays for pilots and astronauts. This had enabled him to obtain a small computer and develop software to drive hundreds of tactile stimulators he had built for research purposes. These stimulators were small air jets, which were ideal for research because their arrangement and spacing could easily be changed and contact with the skin was always assured. Bliss was studying how well subjects could recognize dynamic patterns presented on his array of air-jet stimulators.[5]
Funding for Optacon research and development
After Linvill and Bliss decided to join forces to work on Linvill's vision of a reading machine, it became apparent that they needed funding directed at this objective, rather than at the objectives of the Department of Defense and NASA, which had provided the funding up to that time. As a start, Bliss suggested that they visit Dr. Murphy at the VA, since his office was then the only active government source of reading-machine funding. However, Bliss knew that the research on "optophone-like" reading machines had created negativity toward this "direct translation" approach because of the slow reading rates obtained. To counter this negativity, Bliss programmed an SRI computer to present text as a moving belt display, similar to the moving-message sign in Times Square in New York City, on both his air-jet stimulator array and the Stanford bimorph array. Linvill's blind daughter, Candy, was the subject who attempted to learn to read text presented in this fashion. After several hours of training and practice, Candy was reading in excess of 30 words per minute. Bliss and Linvill felt this computer-driven test was a valid simulation of the reading machine they proposed to develop, and that the 30 words per minute Candy achieved in a short time proved that such a machine, if developed, would be useful. They didn't know what the upper limit of reading speed would be, but hoped that 100 words per minute could be achieved, since this was a typical Braille reading rate.
Armed with this result, Bliss and Linvill made an appointment to visit Dr. Murphy in Washington, D.C. Initially the meeting was going very well, with Dr. Murphy seeming to be very positive toward the possibility of funding the development. Murphy then mentioned that Linvill would have to assign his patent to the Veterans Administration. Linvill refused and the meeting abruptly ended.
As it turned out, this rejection was fortunate. The Office of Education was directed by a colleague of Linvill's from his time at Bell Laboratories, and the development of a reading aid for the blind was highly relevant to the Office's mission, since providing instructional material to blind students in mainstream schools was an important problem. Linvill presented the Optacon idea to the Office of Education, and it was enthusiastically received. This led to funding at a higher level (over $1.8 million in 1970 dollars over four years) than would have been likely from the Veterans Administration.
This higher level of funding was necessary to develop the custom integrated circuits that enabled the Optacon's small size, which was critical to its success. The Optacon project also helped Stanford establish its integrated circuit facilities, leading MIT's Dean of Engineering to remark that Stanford got the lead in integrated circuit research because of the Optacon.
Development of the Optacon
With funding established, Bliss joined the Stanford faculty half-time, spending the other half of his time at SRI. At SRI, tactile reading experiments were conducted to maximize the reading rates achievable with the Optacon, and the bimorph tactile array and the camera optics were developed. At Stanford, custom integrated circuits were developed, including the silicon retina and the bimorph drivers, which required a higher voltage than was normal for solid-state circuits at that time.
The first technical challenge toward developing the reading machine was how to build a "tactile screen" that could create a dynamic tactile image which was perceivable by the user and that had a refresh rate fast enough for useful reading rates. Linvill's initial work with graduate students Alonzo and Hill indicated that a piezoelectric bimorph could be suitable as the transducer to convert an electrical signal into a mechanical motion. The advantages of bimorphs were efficient transduction of electrical to mechanical energy (important for battery operation), small size, fast response, and relatively low cost.
Alonzo determined that at vibration frequencies around 300 Hz, the amplitude needed for detection was much less than for frequencies around 60 Hz. Moreover, for reading rates of 100 words per minute, vibration rates of at least 200 Hz were needed. Linvill calculated the length, width, and thickness of a bimorph reed necessary for a resonance frequency of 200 Hz that could produce enough mechanical energy to stimulate a fingertip above the threshold of the sense of touch.
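One conventional way to frame such a sizing calculation is the first-mode resonance formula for a cantilevered rectangular reed from Euler-Bernoulli beam theory, shown below for reference; the sources do not state that Linvill used exactly this form, and a real bimorph's composite construction changes the effective stiffness, so this is only an illustrative approximation.

```latex
% First-mode resonant frequency of a cantilevered rectangular reed
% (standard Euler-Bernoulli result; illustrative only, not Linvill's documented calculation).
% L = free length, t = thickness, w = width, E = effective Young's modulus, \rho = density.
f_1 = \frac{(1.875)^2}{2\pi L^2}\sqrt{\frac{EI}{\rho A}}
    \approx 0.1615\,\frac{t}{L^2}\sqrt{\frac{E}{\rho}},
\qquad I = \frac{w t^3}{12},\qquad A = w t .
```

Choosing the length and thickness so that f_1 lands in the 200-300 Hz range, while keeping the reed small enough that a 24-by-6 array can fit on a fingertip, is the kind of trade-off such a calculation addresses.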
Based on these calculations, an array of bimorphs was constructed for reading rate tests with the computer simulation at SRI. The computer simulation presented tactile images of perfectly formed and aligned letters in a stream that moved across the bimorph array. Candy Linvill and other blind subjects learned to read text presented in this fashion with encouraging results. However, this simulation differed from the conditions that the user would encounter with an Optacon in the real world. There would be a wide range of type fonts and print qualities, plus the user would have to move the camera across the text rather than the computer moving the text across the tactile screen at a fixed rate. It wasn't known how much the mental load of controlling the camera would reduce the reading rate.
In considering the transition from text being presented by the computer to the user moving a camera across a printed page, Bliss realized that there was a critical flaw in the design of the Veterans Administration Stereotoner. Since English alphabetic characters can be adequately displayed with 12 vertical pixels, the Stereotoner designer had assumed that only 12 photocells would be needed in the camera. However, this assumes perfect alignment between the camera and the printed text, which is never the case with a hand-held camera. When the alignment is random, as with a hand-held camera, a well-known engineering theorem states that twice as many pixels are needed.[6] Therefore, the Optacon was designed with 24 vertical pixels instead of 12. This theorem isn't applicable in the horizontal dimension, so the columns in a two-dimensional array can be twice as far apart as the rows.
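The "twice as many pixels" argument is essentially the sampling theorem applied to a hand-held camera: 12 vertical cells resolve a 12-row character only when the cell grid happens to line up with the print, while at an arbitrary vertical offset the finest detail can be lost. The sketch below illustrates the effect with a simple banded test pattern; the geometry and all names are illustrative assumptions, not a model from the Optacon literature.

```python
# Model the finest vertical detail of a character as alternating dark/light bands,
# each 1/12 of the letter height, and let each photocell average over its own height.
# With 12 cells the detail survives only when the grid is aligned; at a half-cell
# offset every cell reads ~50% gray. With 24 cells the alternation is always recoverable.
BAND = 1.0 / 12.0  # height of one dark or light band (fraction of letter height)

def ink_coverage(top, bottom, n=1000):
    """Fraction of the interval [top, bottom) covered by dark bands."""
    ys = [top + (bottom - top) * (i + 0.5) / n for i in range(n)]
    return sum(1 for y in ys if int(y / BAND) % 2 == 0) / n

def scan(num_cells, offset):
    """Average darkness seen by each cell for a given vertical offset."""
    cell = 1.0 / num_cells
    return [ink_coverage(offset + k * cell, offset + (k + 1) * cell)
            for k in range(num_cells)]

print([round(v, 2) for v in scan(12, 0.0)])       # aligned: clean 1.0/0.0 alternation
print([round(v, 2) for v in scan(12, BAND / 2)])  # misaligned: every cell reads 0.5
print([round(v, 2) for v in scan(24, BAND / 2)])  # 24 cells: alternation still visible
```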
When a single column of 24 pixels is scanned across a line of text, all of the information is acquired. However, the sense of touch is capable of perceiving two-dimensional images, so Bliss wondered whether the reading rate would be higher if more than one column of 24 pixels were used and, if so, how many columns would be appropriate. Experiments with the computer simulation determined that the reading rate increased dramatically up to 6 columns, a window width of about one letter space and about the maximum number of columns that could be placed on one finger. Jon Taenzer, one of Bliss' Stanford graduate students, ran visual reading experiments on the same computer simulation and determined that for visual reading, reading rates continued to increase up to a window width of about 6 letter spaces. This led to a number of experiments aimed at increasing the tactile reading rate by widening the tactile screen so that more than one letter would be in view at a time. Instead of moving the text across only the index fingertip, tests were run with a screen wide enough for both the index finger and the middle finger, so that two letters could be sensed simultaneously. In another experiment the moving belt of text was run down the length of the fingers, rather than across them. The only approach that showed any promise of increasing the reading rate was the use of both index fingers, rather than the index finger and the adjacent middle finger. However, using both index fingers was incompatible with the design concept of one hand controlling the camera while the other hand sensed the tactile screen. The Optacon design was therefore based on an array of 24-by-6 pixels in both the camera retina and the bimorph array.
Other questions had to do with the spacing between the tactile pins in the bimorph array and their frequency of vibration. It was well known from experiments reported in the literature that people could distinguish two points from one with their index finger when the points were a millimeter apart. However, these previous experiments had not been done with vibrating pins. What effect would the vibration have, and was there an optimum vibration frequency? These questions were answered by experiments conducted by Charles Rogers,[7] a Stanford graduate student working with Bliss.
While the neurophysiological data suggested that the smallest two-point thresholds would be at vibration frequencies below 60 Hertz, Rogers' experiments showed that the two-point thresholds around 200 Hertz were actually smaller. Bliss hosted a conference at SRI,[8] including some leading neurophysiologists and psychophysicists, to try to resolve this discrepancy, but no one had an explanation. From a practical standpoint, Rogers' result was very fortunate, because the higher frequencies were required for refresh rates fast enough for reading up to 100 words per minute and for bimorphs small enough to construct a 24-by-6 array that fit on a fingertip.
The question of whether 144 tactile stimulators on a fingertip could be independently distinguished led to a confrontation at a scientific conference between Bliss and Frank Geldard, a University of Virginia professor. Geldard had written a major book on the human senses and was a leading researcher on using the sense of touch to communicate information. When asked how many tactile stimulators should be used in a tactile display, he maintained that no more than 8 could be independently distinguished, and that these should be on widely separated parts of the body. Bliss' data showing useful reading with 144 stimulators on a fingertip appeared to conflict with Geldard's research. The difference was between communicating with two-dimensional tactile images and communicating with an 8-point code. Both Bliss and Geldard reported similar reading rates, but in the days before high-accuracy optical character recognition, the Optacon approach was much more practical.
These experiments determined the design parameters for the Optacon's man-machine interface: a 24-by-6 array of tactile stimulators, vibrating between 250 and 300 Hz, with the rows spaced at 1 mm and the columns spaced at 2 mm (See Fig. 2).
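For reference, the figures in the preceding sentence can be collected into a single structured record; the field names below are illustrative, and only the values restate what the text specifies.

```python
# The human-factors parameters stated above, restated as a structured record.
from dataclasses import dataclass

@dataclass(frozen=True)
class TactileDisplaySpec:
    rows: int = 24                        # stimulators per column (vertical)
    cols: int = 6                         # columns (about one letter space wide)
    vibration_hz: tuple = (250, 300)      # vibration frequency range
    row_pitch_mm: float = 1.0             # vertical spacing between stimulators
    col_pitch_mm: float = 2.0             # horizontal spacing between stimulators

spec = TactileDisplaySpec()
print(f"{spec.rows * spec.cols} stimulators at {spec.row_pitch_mm} mm x "
      f"{spec.col_pitch_mm} mm pitch, {spec.vibration_hz[0]}-{spec.vibration_hz[1]} Hz")
```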
In parallel with this human factors research was a pioneering effort to realize this design in a convenient portable unit, which would be critical for its success. In July 1972, Harry Garland suggested a new design for the Optacon that incorporated the sensor, tactile array, and electronics in a single hand-held unit. Roger Melen and Max Maginness developed a prototype of the unit, called the "one-hand" Optacon, at Stanford University.[9]
Optacon integrated circuit development
In the 1960s, when the Optacon was being developed, integrated circuitry was in its infancy, and no suitable integrated solid-state array of photodetectors was available. The earliest complete Optacon-like reading aids were built at Stanford and SRI with a lens system that focused the image from the printed page onto a fiber-optic bundle whose individual fibers were connected to discrete phototransistors. Not only was this system large and bulky, it was expensive and difficult to assemble. An effort was launched to develop a monolithic silicon retina with an array of 24-by-6 phototransistors about the size of one letter space, so that simple optics with no magnification could be used. Basic research in the integrated circuit technology available at that time had to be conducted, resulting in Ph.D. theses by several Stanford graduate students, including J. S. Brugler, J. D. Plummer, R. D. Melen, and P. Salsbury. The phototransistors had to be sufficiently sensitive, fast enough for the required refresh rate, and spectrally suited to detecting ink on paper; they had to be arranged in a closely packed matrix without blind spots and interconnected so that only connections to the rows and columns were needed.
The successful fabrication of such a silicon retina was a major milestone toward a practical Optacon.
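The row-and-column interconnection mentioned above is ordinary matrix addressing: a 24-by-6 array needs only 24 row lines plus 6 column lines (30 external connections) rather than one wire per pixel (144). A minimal sketch of such a column-at-a-time readout is shown below; the names and the scanning order are assumptions, not details of the actual retina chip.

```python
# Hypothetical sketch of row/column matrix readout for a 24x6 photodetector array.
ROWS, COLS = 24, 6

def read_retina(sense_pixel):
    """Select one column at a time and read the 24 shared row lines.
    `sense_pixel(row, col)` stands in for the analog measurement of one cell."""
    image = [[0] * COLS for _ in range(ROWS)]
    for col in range(COLS):          # drive one column-select line
        for row in range(ROWS):      # read each shared row line
            image[row][col] = sense_pixel(row, col)
    return image

print(ROWS + COLS, "connections instead of", ROWS * COLS)  # 30 vs 144

# Example with a dummy sensor that "sees" a single dark pixel at (0, 0):
frame = read_retina(lambda r, c: 1 if (r, c) == (0, 0) else 0)
assert frame[0][0] == 1 and frame[0][1] == 0
```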
Optacon electronics, optics, and packaging
The first Optacon prototype using this retina was completed on September 1, 1969. It was portable and completely self-contained, combining the stimulator array, electronics, batteries, and camera in a single package measuring 13.5″ by 8″ by 2.25″ and weighing 9 pounds. The low-power electronics design in this unit, a joint effort by J. S. Brugler and W. T. Young, made possible about 12 hours of sustained operation from the rechargeable batteries. This unit included an improved optical system and camera plus a bimorph-driven tactile screen, both developed by James Baer and John Gill at SRI.
As integrated circuit technology progressed, another custom integrated circuit was developed in the Stanford laboratories. This circuit contained 12 bimorph drivers and interfaced the 5-volt logic circuitry to the 45 volts required to drive the bimorphs. Incorporating this circuit and using lower-power components allowed the size to be reduced to 8″ by 6″ by 2″ and the weight to four pounds. Again the team of Brugler, Young, Baer, and Gill was responsible for the design of the electronics, optics, and packaging. The first Optacon incorporating these advances, the Model S-15, was a significant milestone. It won an IR-100 Award as one of the 100 best-designed products of 1971 and was the prototype of the Telesensory Optacon. It is now at the Computer History Museum in Mountain View, California.
Optacon training
With a number of operational prototype Optacons available, an effort was made to get them into daily use by blind people in the community. The engineers were anxious to know how well the Optacon components held up in a real-life environment, what uses were made of the Optacon, how much it was used, and how important it was in educational, vocational, and daily-living activities. Several blind people in the Palo Alto community volunteered to participate, and Carolyn Weil was hired to coordinate, teach, and document this part of the project.
The first issue was how a blind person should be taught to read with an Optacon. Some blind people were unaware of letter shapes, and most were not familiar with the various type fonts. In addition, spelling was usually not a strong point, since the education of blind students had often been in Braille, which has about 180 contractions. And, of course, none were familiar with recognizing vibratory tactile images of letters moving across their index finger.
Weil developed lessons to teach recognition of letters presented in this fashion, using both the computer simulation and the Optacon prototypes. It soon became apparent that while letter recognition could be taught in a few days, building reading speed was much more time-consuming. Even so, a number of blind people were soon using an Optacon prototype effectively in their daily lives. These people contributed greatly to the project, not only by providing important information for the design of future models, but also by motivating the Optacon development team to make the Optacon widely available. Among this group of pioneering Optacon users were:
- Candy Linvill – John Linvill's daughter, who was a Stanford undergraduate at the time and used the Optacon in her studies. Once, when her Optacon needed a repair, Bliss went to her dorm room to pick it up. She wasn't there, so Bliss wanted to leave a message with her roommate. Her roommate told him, "You can leave her a note if you print it like a typewriter and she can read it herself." This was unheard of for a completely blind person.
- Sue Melrose – Another blind Stanford undergraduate who was taught to read with an Optacon by Candy Linvill. Both Sue and Candy participated in many Optacon presentations at conferences and meetings.
- Bob Stearns – A blind computer programmer working at SRI. Bob used the Optacon in his work writing and debugging computer programs.
- Loren Schoff – Another blind Stanford student who initially used the Optacon in his studies. For his mathematics textbooks he had Braille transcribers transcribe the text into Braille, but he read the equations and graphs with the Optacon. After graduation he was hired by SRI as a data analyst on the project. He performed an important statistical analysis showing the relation between a user's age and the Optacon reading speed achieved after a given amount of training, carrying out the calculations on Hewlett-Packard's newly announced, pioneering HP-35 handheld calculator and using the Optacon to read its display.[10]
From commercialization to discontinuance
The Optacon was manufactured and marketed from 1971 to 1996 by Telesensory Systems Inc. of Silicon Valley, California.
As the Optacon project progressed and more obstacles and unknowns were overcome, the importance of making the Optacon generally available became apparent. Telesensory's initial sales provided Optacons for test evaluations by the U.S. Office of Education, by St. Dunstan's for blinded veterans in London, England, by the Berufsbildungswerk in Heidelberg, Germany, and in Sweden. The success of these evaluations led to larger dissemination programs funded by the U.S. Department of Education, private U.S. foundations such as Melen and Pew, state Departments of Rehabilitation, and various programs in many countries, including Japan, Italy, Germany, France, and the Scandinavian countries. The number of Optacons purchased privately by individuals was small. Approximately 15,000 Optacons were eventually sold.
Throughout the 1970s and into the 1980s, the Optacon underwent upgrades, and various accessories were added, including different lens modules to be used with the camera for reading text in a typewriter and on computer and calculator screens. In 1985 Canon Inc. and Telesensory cooperated in the development of the Optacon II, which featured improved packaging and capabilities to interface to a computer (See Fig. 3).
A cost-driven design decision to reduce the number of image pixels from 144 to 100 resulted in the Optacon II not being successful.
In the 1990s Telesensory increasingly shifted its emphasis toward the low-vision market and became less devoted to the Optacon. Page scanners with optical character recognition had come to be the tool of choice for blind people wanting access to print. Page scanners were less expensive and had a much shallower learning curve than the Optacon. In addition, blind people could generally read through material more quickly with a page scanner than with an Optacon.
In 1996 Telesensory announced that it would no longer manufacture the Optacon and that it would cease to service the device in 2000. Many users purchased used machines and cannibalized them for parts, presumably with much help from sighted, electromechanically talented friends. In March 2005, TSI suddenly shut down. Employees were "walked out" of the building and lost accrued vacation time, medical insurance, and all benefits. Customers could not buy new machines or get existing machines fixed. Some work was done by other companies to develop an updated version of the Optacon, reducing its cost and taking advantage of newer technology, but no device with the versatility of the Optacon had been developed as of 2007.
Many blind people continue to use their Optacons to this day. The Optacon offers capabilities that no other device provides, including the ability to perceive a printed page or computer screen as it truly appears, with its drawings, typefaces, and specialized text layouts.
References
- ↑ L.H. Goldish and H.E. Taylor, "The Optacon: A Valuable Device for Blind Persons", New Outlook for the Blind, published by the American Foundation for the Blind, Feb. 1974, pp. 49-56
- ↑ Smith, Joel (November 12, 1973). "Device gives blind chance to read without braille". The Detroit News. p. 3-B. "It opens up a whole new world to blind people. They aren't restricted anymore to reading material set in braille."
- ↑ J.G. Linvill and J.C. Bliss, "A Direct Translation Reading Aid for the Blind", Proceedings of the IEEE, Vol. 54, No. 1, Jan. 1966, pp. 40-51
- ↑ E.E. Fournier d'Albe, "The Moon-Element", published by D. Appleton and Company, 1924, pp. 112-113 - https://archive.org/details/moonelement002067mbp/page/n127
- ↑ "Optaconmovies". https://www.youtube.com/optaconmovies.
- ↑ J.C. Bliss, "A Relatively High-Resolution Reading Aid for the Blind", IEEE Transactions on Man-Machine Systems, Vol. MMS-10, No. 1, March 1969, pp. 1-9
- ↑ C.H. Rogers, "Choice of Stimulator Frequency for Tactile Arrays", IEEE Transactions on Man-Machine Systems, Vol. MMS-11, No. 1, March 1970, pp. 5-11
- ↑ Special Issue, IEEE Transactions on Man-Machine Systems, Vol. MMS-11, No. 1, March 1970
- ↑ Linvill, John G. (March 1973). Final Report: Research and development of Tactile Facsimile Reading Aid for the Blind. Stanford Electronics Laboratories. pp. 24–25. https://archive.org/stream/OptaconFinalReportJohnLinvill1973/OptaconFinalReport-JohnLinvill-1973#page/n29/mode/2up. Retrieved 6 November 2017. "Dr. Garland suggested that the Optacon could be more effective if the camera and tactile screen were incorporated into a single hand-held unit."
- ↑ Dow, Valerie (December 6, 1973). "Blind Students Succeed With Extra Time, Effort". The Stanford Daily 164 (53): 6. https://stanforddailyarchive.com/cgi-bin/stanford?a=d&d=stanford19731206-01.1.6&e=-------en-20--1--txt-txIN-------#. Retrieved 3 October 2017.
External links
- "The Optacon" by John G. Linvill - Final Report submitted to the Office of Education, March 1973
- "From Optacon to Oblivion: The Telesensory Story" as published in AccessWorld magazine, July 2005
- "The Reading Machine That Hasn't Been Built Yet" as published in AccessWorld magazine, March 2003
- "The Optacon: Past, Present, and Future" from the Braille Monitor, a publication of the National Federation of the Blind
- optacon-l e-mail list for Optacon users and researchers
- Optacon documentation and training materials page at the Freedom Scientific web site
Original source: https://en.wikipedia.org/wiki/Optacon.