New Tools for the Blind
Engineers creating technology to help blind people navigate the world must overcome myriad challenges, as Susan Young learns. Illustrated by Sarah Chen and Emily M. Eng.
Illustration: Sarah Chen
Brent Gifford is searching for a sign—nothing supernatural, just a small picture that looks like a color wheel with only four colors. It should be somewhere in this room. He pulls out his smart phone, adjusts a few settings, and holds it in the center of his chest, camera facing out. He begins rotating the phone side to side, scanning the room.
Gifford is blind, and it's up to the phone to spot the sign. He's testing an experimental application for smart phones that could one day help blind people navigate through unfamiliar territory. By using their camera phones and some strategically placed graphic signs, the visually impaired could find bathrooms in buildings they've never visited or learn which bus routes come to a particular stop.
The system is one of the projects in the Assistive Technology Lab at UC Santa Cruz. Led by computer engineers Roberto Manduchi and Sri Kurniawan, the lab creates tools to help people with disabilities. Many of the tools are geared toward helping the blind and visually impaired. In addition to the mobile navigator, the lab's researchers are developing computer programs to help blind people prepare documents that meet business standards, and even a module so the blind can rock out with Rock Band.
The engineers ask visually impaired community members to test and critique their prototypes. Gifford, a professional magician and a clinical hypnotherapist, is one of their more experienced testers. He's thoroughly examined about a dozen devices for different companies, and he's put his hands on many more.
The mobile navigator, he says, feels like a winner: "I don’t think there's anything close to this particular product right now."
The Great Blindini
As a teenager in the late 1970s, Gifford tested one of the first optical character recognition (OCR) systems for Kurzweil Technologies. "I was a guinea pig," says Gifford, "the blind guy at the end of the chain." He continued to work for Kurzweil, traveling all over the country to teach others how to use the then-bulky OCR machine. He even spoke before U.S. Senate subcommittees, testifying about why the government should fund a machine that could read to blind people.
Gifford became blind at age 11 when his retinas detached, the result of a genetic disorder that prevents connective tissue from forming. By age 12, he started doing magic, tricking the eyes of his seeing audience. He wanted to take on the most visual performance art he could think of, and magic—with its misdirection, optical illusion, and face reading—fit the bill. At 16, he adopted the stage name "The Great Blindini." Now 51, Gifford performs for children and adults around central and northern California.
Gifford attributes his popularity as a technology tester to two things: He can write a coherent evaluation ("as opposed to 'it doesn't work, but I don't know why,'" he jokes), and he's a trained magician. In that role, he often thinks, "That's impossible, how am I going to do it?" when designing tricks. Similarly, many assistive tools may have at one time seemed out-of-this-world. But now handheld readers can pronounce text on a page, and cell phones with a camera and the right assistive app can identify a jar of Skippy peanut butter.
Follow that phone
The engineers in the Assistive Technology Lab aim to harness such ever-improving smart-phone cameras and processors to help the visually impaired find their way through unfamiliar territory. White canes and guide dogs can help users avoid curbs and couches, but they can't reveal which door leads to the dentist's office. And while GPS is great for trips from home to the center of town, it doesn't offer enough resolution to help blind people find the right door to knock on—and it doesn't work indoors.
For instance, when a blind person goes to a new hospital, how can she find the right doctor's office or a bathroom without resorting to asking strangers for help? Ideally, a system would give turn-by-turn directions, says Manduchi. To do that, you need two things: a map, and a way to determine where you are.
His proposed solution is a system of colorful "landmarks" that a camera phone would recognize. He and his colleagues have developed smart phone programs that pick out landmarks with color patterns that correspond to a particular location. In the same way that a lost sighted person looks for directional signs or labels on doors, a blind person would use this system to figure out where to go next.
"Room 111, to the left," imagines Manduchi.
Photo: Susan Young
UC Santa Cruz computer engineer Roberto Manduchi with a "landmark," which blind users will find with cellphone cameras to help them navigate.
The landmarks look like four-slice pies of four different colors. The arrangement of colors encodes each landmark's identity. Ideally, the phone's memory will already contain a building map. Then, the phone would provide audio directions (e.g., “turn left”) to guide the user.
"The map is something that theoretically you can get, but it's not that easy," Manduchi says. "Not all places give away maps."
As a user scans the room, the software looks for a pattern of four pixels spaced like the four corners of a square. If the phone finds four pixels that match the four-color pattern of a landmark, it runs more tests to check that the object is pie-shaped. The system works for landmarks close to the camera's lens or on a distant wall.
The researchers chose colors that stand out against many backgrounds and in different lighting conditions. Initially, they printed red, blue, black, and white landmarks on desktop printers. Now the group makes green, orange, black, and white landmarks by gluing together colored papers with a matte finish, which makes the phone's detections more reliable.
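As a rough illustration, the pixel-pattern search described above might look like the following Python sketch. The color values, tolerances, and function names here are assumptions for illustration only, not the lab's actual code, and a real detector would follow a hit with the pie-shape verification step.

```python
import numpy as np

# Hypothetical RGB values for the landmark palette; the article names the
# colors (green, orange, black, white) but not their exact values.
LANDMARK_COLORS = {
    "green":  (0, 160, 60),
    "orange": (255, 140, 0),
    "black":  (20, 20, 20),
    "white":  (240, 240, 240),
}

def color_matches(pixel, target, tol=60):
    """True if a pixel is within `tol` of the target color on every channel."""
    return all(abs(int(p) - t) <= tol for p, t in zip(pixel, target))

def square_corners(x, y, s):
    """Four pixel coordinates spaced like the corners of an s-pixel square."""
    return [(x, y), (x + s, y), (x, y + s), (x + s, y + s)]

def find_candidates(frame, spacing=12):
    """Scan a frame for four pixels, spaced like a square's corners, that
    match the landmark's four-color pattern (one color per quadrant)."""
    h, w, _ = frame.shape
    targets = [LANDMARK_COLORS[c] for c in ("green", "orange", "black", "white")]
    hits = []
    for y in range(0, h - spacing):
        for x in range(0, w - spacing):
            corners = square_corners(x, y, spacing)
            if all(color_matches(frame[cy, cx], t)
                   for (cx, cy), t in zip(corners, targets)):
                hits.append((x, y))
    return hits
```

A brute-force scan like this is slow in pure Python; a phone implementation would vectorize the comparison or work on a downsampled frame.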
Making sense of a live video feed is a challenge, too. The images are often blurry, especially if users move the phone too quickly when scanning a room. Manduchi's group is working to better process such less-than-ideal data by assessing the angle and position of the phone. Those details come from a smart phone's internal compass, gyroscope, and accelerometer.
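One simple use of those sensor readings, sketched below, is to discard frames captured while the phone rotates quickly, since those are the frames most likely to be motion-blurred, and to estimate the phone's tilt from the accelerometer's gravity vector. The threshold and function names are assumptions, not the lab's actual approach.

```python
import math

# An assumed angular-velocity cutoff, not a value from the lab.
BLUR_RATE_LIMIT = 1.5  # rad/s

def frame_usable(gyro_rate):
    """Discard frames captured while the phone rotates quickly; fast
    rotation during a side-to-side scan usually means motion blur."""
    return gyro_rate < BLUR_RATE_LIMIT

def tilt_from_accel(ax, ay, az):
    """Estimate the phone's tilt (radians from the z-axis) from the
    gravity vector reported by the accelerometer while roughly at rest."""
    g = math.sqrt(ax * ax + ay * ay + az * az)
    return math.acos(max(-1.0, min(1.0, az / g)))
```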
A problem of perception
Underlying these engineering issues is the foremost task: designing useful and practical tools that meet the particular needs of the impaired users. "Very often, sighted people who are creating technology for blind and visually impaired folks are operating out of the best intentions and out of true altruism," says Brad Hodges, a blind national technology program associate with the American Foundation for the Blind in New York. "That being said, those developers are influenced by general societal attitudes about blindness—some of which are correct, and some of which are very much in error."
Manduchi, who moved to UCSC in 2001 after working in the machine vision field for Apple and NASA, is familiar with tailoring projects to specific audiences. Although he now knows how important it is to get user input, he acknowledges that he didn't always do so. One of his first projects at UCSC was a "virtual white cane," a hand-held laser device that measured distances between users and objects in their path. But after talking with lots of blind people, he decided that other directions of research would be more promising.
"Blind people who use a white cane usually love their cane," he says. "If you try to substitute their cane with something more technological, you're going to fight an uphill battle. The cane is simple, the cane is cheap, it doesn’t break, it doesn't have batteries, it tells when you are about to trip onto something, it tells you if there is a drop-off. So the cane is quite a good device."
Part of the challenge is that the audience of potential users is diverse. There are 1.3 million legally blind people in the U.S., according to the National Federation of the Blind, with varying degrees of impairment. Fewer than one of every 1,000 people is totally blind. Just like the rest of the population, the visually impaired vary in many ways: the type of town they call home, their lifestyle preferences, whether they have additional disabilities. That means a given assistive device will have fewer potential users than raw population numbers would suggest.
Furthermore, blind people generally aren't affluent. Some 70% are unemployed, according to the American Foundation for the Blind. The group's small size also makes affordable commercialization more difficult: with few units sold, each product must absorb the costs of research and development.
Write the right way
But there are success stories. Some of the best tools for the visually impaired help users read printed and electronic documents. Scanners fitted with OCR software can read the text on a sheet of paper aloud to a user, while screen-reader programs give voice to digital words on a computer screen.
But what if a blind person must create a digital document for sighted people to read? While screen readers allow blind people to listen to the words they write, they don't alert users to a switch in capitalization or right-justification in a paragraph. These formatting changes could be a mistake—and might cause trouble for the blind author.
"Even though the document recipient knows you are blind, you are still expected to have a certain level of quality formatting and layout," says Kurniawan, who came to UCSC in 2008.
While working at the University of Manchester in the U.K. in 2003, Kurniawan developed a program that describes the formatting and layout of a completed document using speech. Now, with a recent grant from the National Science Foundation, Kurniawan plans to revamp the program to communicate formatting changes in real time with sound effects instead of words. With that upgrade, users will avoid the tedious process of checking a document after it's been written.
Because spoken words are reserved for the text itself, the program won't confuse the user by describing the formatting in speech. Instead, it will use sounds and changes in pitch to signal formatting changes. For example, tinkling piano keys will indicate that a font is italicized, and a higher-pitched voice will denote a word in all capital letters.
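The idea of swapping spoken descriptions for nonspeech cues can be sketched as a small mapping over a stream of styled words. The event names, cue names, and the bold entry below are hypothetical; the article mentions only the piano and pitch examples.

```python
# Hypothetical mapping from formatting events to nonspeech audio cues.
FORMAT_CUES = {
    "italic":   "piano_tinkle",   # tinkling piano keys signal italics
    "all_caps": "higher_voice",   # a higher-pitched voice signals ALL CAPS
    "bold":     "drum_tap",       # illustrative cue, not from the article
}

def render(tokens):
    """Walk a stream of (word, styles) tokens; when a style switches on,
    attach its audio cue so speech stays reserved for the text itself."""
    output, prev = [], set()
    for word, styles in tokens:
        styles = set(styles)
        cues = [FORMAT_CUES[s] for s in sorted(styles - prev) if s in FORMAT_CUES]
        output.append((word, cues))
        prev = styles
    return output
```

A real implementation would emit these cues to a speech synthesizer and mixer in real time rather than building a list.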
Technology adviser Hodges notes that even a program that checks the formatting of a completed document has potential for wide-reaching success. Large groups could benefit, such as federal agencies that require particular formatting styles, he suggests. Such a program could be adapted so that these groups could check huge numbers of documents, written by and for sighted users, for proper style.
It goes both ways
Hodges points to many examples of accommodations developed to help the disabled, such as ramps in airports, that were later widely adopted. Even the commonplace keyboard, the offshoot of the typewriter, has roots in an assistive technology: "In the 19th century an Italian fellow invented the typewriter so his lady consort could write to him. She was blind," says Hodges. Writing by quill and ink was a challenge for the blind, and the tool to solve that problem was later adapted universally. And in the other direction, today's mainstream products—such as the iPhone—now come out-of-the-box with accessibility options for the visually impaired.
Following that beat, Kurniawan has developed Rock Vibe, a module that enables a gamer to play Harmonix's Rock Band video game without looking at musical cues scrolling on the TV screen. Instead, a blind user straps on five vibrating strips: two on the wrists, two on the upper arms, and one on an ankle. When a particular strip vibrates, the virtual rock star hits the corresponding drumhead.
"I'd heard so much about [Rock Band] but had never been able to play it because it's completely visual," says Caitlin Hernandez, a blind UCSC student who tested Rock Vibe. "I thought it was really creative that they thought of incorporating something tactile to make it accessible."
There are many online games for blind people, Hernandez says, but "they're sort of pale imitations of popular video games." Rock Vibe is unique. "I never heard of a [mainstream] game being adapted. Everybody plays Rock Band; it's a huge party game," she says. "I thought it was interesting that they were trying to adapt it as opposed to making something that was similar to it, which is what I feel like the other games on the market for the blind are doing."
Finding the way
Back at the testing site, Brent Gifford's phone has detected a colored pie-shaped landmark.
"Four meters ahead," declares a metallic voice. He walks across the room and reaches the landmark.
The mobile navigator is one of the few technologies Gifford has tested that could succeed, he says. "It has the potential to be useful to the majority of blind people, because cell phones are part of everyday life."
Indeed, smart-phone-based technologies may be the way to go for many assistive tools. They're becoming more common among the visually impaired, and the iPhone's accessibility options are coaxing more blind customers toward Apple. The greatest advantage of developing assistive technologies for smart phones is that the end users probably own one already. New games based on smart phones may reach other users in need, too [see sidebar: iHealth Help].
Manduchi's navigation system would require some changes to infrastructure, such as putting up enough landmark signs in buildings. However, those kinds of mandates are possible under the Americans with Disabilities Act of 1990. Taking a cue from Gifford, another young blind person may one day speak before a U.S. Senate subcommittee, testifying why it's time to truly help the blind make their own way through the world.
Sidebar: iHealth Help
Computer engineer Sri Kurniawan thinks assistive technologies shouldn’t be limited to people with physical or cognitive disabilities. "If somebody is illiterate, they are not traditionally disabled, but it is a disabling condition," she says. Likewise, if people can't afford insurance, they might have a lower quality of life. She thinks human-computer interactions can help these disadvantaged groups.
One of Kurniawan's graduate students at UC Santa Cruz has created a mobile-phone-based game to motivate teenagers to get up and move. "The idea is to focus on teenagers so that they are still at an age where they could change their habits," says Sonia Arteaga, the game's designer.
Illustration: Emily M. Eng
One game features lots of obstacles. Using the iPhone's accelerometer as a pedometer, the game counts the player's steps; when a high schooler takes a certain number of them, the game presents an obstacle—a cloud of mosquitoes, for instance. The player must swat them away by waving her phone around to continue through the game.
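A crude version of that pedometer mechanic can be sketched as counting upward crossings of an acceleration-magnitude threshold, then pacing obstacles by step count. The threshold and steps-per-obstacle values are illustrative assumptions, not numbers from the actual game.

```python
import math

STEP_THRESHOLD = 11.0    # m/s^2; assumed cutoff just above gravity (~9.8)
STEPS_PER_OBSTACLE = 50  # assumed pacing, not from the actual game

def count_steps(samples, threshold=STEP_THRESHOLD):
    """Count steps as upward crossings of an acceleration-magnitude
    threshold -- a crude pedometer built on accelerometer readings."""
    steps, above = 0, False
    for ax, ay, az in samples:
        mag = math.sqrt(ax * ax + ay * ay + az * az)
        if mag > threshold and not above:
            steps += 1
            above = True
        elif mag <= threshold:
            above = False
    return steps

def obstacles_due(total_steps, per_obstacle=STEPS_PER_OBSTACLE):
    """How many obstacles (a mosquito cloud, say) the game should have
    presented after this many steps."""
    return total_steps // per_obstacle
```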
Another game presents a find-the-object picture puzzle each time the user walks a certain number of steps. To collect points and try new puzzles, the user must keep moving.
Arteaga says teenagers who have tried the games like the challenges. They want to try harder to get to levels they couldn't reach the first time around. She hopes to add features that will allow users to do challenges with friends, because the social aspects of gaming ranked high with teenagers in her user studies.
Another of Kurniawan's students, Alexandra Holloway, is working on a different health angle: preparing expecting parents for childbirth.
Illustration: Emily M. Eng
In her game, called "The Prepared Partner," a woman in labor starts the action. The player must try to boost her physical health, perhaps by feeding her a sandwich, or reduce her stress, such as with meditative chants. If her vital stats get too low, a doctor will come to perform a Caesarean.
"The target audience is low-income and low-education women and their partners," Holloway says. That demographic is less likely to attend childbirth classes and more likely to have costly interventions and unnecessary C-sections. "They are also less likely to go out and get childbirth books and sit down and read 500 pages of how to help a woman in labor. But they are likely to play games," she notes.
In an early test, players performed better on a quiz about labor and comforting a woman during contractions after they had played The Prepared Partner. For now, the game is available online, but Holloway plans to create a version for smart phones.
Story © 2011 by Susan Young. For reproduction requests, contact the Science Communication Program office.
Susan Young
B.S. (molecular biology) University of Texas at Austin
Ph.D. (molecular biology) University of California, Berkeley
Internship: Stanford Medical School news office
“Professional students” get a bad rap, but this never diminished my dream of being one. The seed was planted in the burgeoning bookshelves of my childhood, where I happily spent hours reading about horses, flea bites, and static electricity. This fascination with the natural world carried me to graduate school, where I studied the early evolution of animals. While there I found myself covertly reading about other branches of science, feeling guilty that my attention strayed off topic. Three years in, I uncovered the way to justify these mental wanderings and achieve my long-held ambition. I started writing for a campus science magazine, feeding my yen for new discoveries. I realized, then, that my indulgent explorations could be the key to my dream job, one that starts with a serious devotion to learning.
. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .
Sarah Chen
B.S. (Health Sciences) University of California, Santa Cruz
Internship: Division of Acute Care Surgery, LAC + USC Medical Center
My academic background is primarily rooted in biology, and my main interests lie in human physiology. I have a longstanding interest in diagrammatic illustrations that detail medical research and developments on public health. However, my fascinations are not limited to health science—animals, plants, and the unexpected also inspire my work. I am fond of observing repeated designs in nature, and strive to integrate engaging illustrations with scientific news. I enjoy cats, the color red, and X-acto knives.
. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .
Emily M. Eng
B.S. (biology) Santa Clara University
Internship: California Academy of Sciences, San Francisco
Emily M. Eng, a freelance artist, worked across the country as a naturalist and outdoor educator before turning to illustration. She also studied biology, environmental studies, studio art, and religious studies as an undergrad. Emily’s passion to make science accessible drives her illustrations today. She can currently be found painting nudibranchs, designing infographics, and spoiling her dog Bean. Please check out her website for her latest illustrations or email her for her latest stories about Bean dog.