Augmented reality in app for visually impaired
Cydalion surveys the area in front of the device's sensor array and emits a series of tones corresponding to the location and distance of obstacles: different tones for objects to the left, right, or center, high-pitched sounds for head-height barriers, and low-pitched tones for tripping hazards. The tones vary in frequency according to the distance of the objects from the device.
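The article does not publish Cydalion's actual mapping, but the behavior it describes (stereo placement by direction, pitch band by obstacle height, frequency varying with distance) can be sketched as a small function. Everything below, including the zone boundaries, the frequency values, and the 4-meter range, is an illustrative assumption, not Float's implementation:

```python
from dataclasses import dataclass

# Illustrative sketch only: zone angles, frequency bands, and the
# distance scaling are assumptions chosen to mirror the behavior the
# article describes, not Cydalion's published design.

@dataclass
class Obstacle:
    azimuth: float    # degrees; negative = left of device, positive = right
    height: float     # meters above the ground
    distance: float   # meters from the device

def tone_for(obstacle: Obstacle, head_height: float = 1.6, trip_height: float = 0.4):
    """Return (stereo_pan, frequency_hz) for an obstacle warning tone."""
    # Stereo placement: left, center, or right, per the article.
    if obstacle.azimuth < -15:
        pan = -1.0            # fully left
    elif obstacle.azimuth > 15:
        pan = 1.0             # fully right
    else:
        pan = 0.0             # center

    # Pitch band: high for head-height barriers, low for tripping hazards.
    if obstacle.height >= head_height:
        base_hz = 880.0       # high-pitched warning
    elif obstacle.height <= trip_height:
        base_hz = 220.0       # low-pitched warning
    else:
        base_hz = 440.0       # mid-range for torso-level obstacles

    # Closer obstacles get a higher tone within their band, so frequency
    # tracks distance as the article describes.
    proximity = max(0.0, min(1.0, 1.0 - obstacle.distance / 4.0))
    return pan, base_hz * (1.0 + 0.5 * proximity)

# A head-height obstacle 1.2 m away, to the left: left pan, high pitch.
print(tone_for(Obstacle(azimuth=-30, height=1.7, distance=1.2)))
```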
The app’s sounds can be customized, and a display on the screen visually represents warning areas, with options for color-blind individuals. Float recommends using a vest or lanyard to hold the device perpendicular to the ground at chest level, along with bone-conduction headphones that emit tones without blocking ambient sound.
Matt Forcum, Float’s lead designer on the app, said early iterations of the program included spoken feedback, but the verbal warnings proved less intuitive than tones in stereo sound.
“We were really kind of surprised to find that when we took that out completely, people picked up on it really quickly,” Forcum said.
Float partnered with Illinois State University’s College of Education and Student Access and Accommodation Services, as well as the student organization Braille Birds, to test the app and receive feedback.
Tango, a hardware system developed by Google with two camera lenses, an infrared sensor and supporting software, gives apps such as Cydalion the ability to detect distances between the device and objects in the environment in real time as the device moves.
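The Tango SDK exposed depth data through its own (now retired) Java and C APIs; the hedged sketch below uses a generic depth image as a stand-in so the idea of scanning a depth frame for the nearest obstacle per region is runnable. The zone split and all thresholds are assumptions for illustration:

```python
import numpy as np

def nearest_per_zone(depth_m: np.ndarray, max_range: float = 4.0):
    """Split a depth frame (meters, rows x cols) into left/center/right
    columns and upper/lower bands, returning the nearest valid reading
    per zone, or None if nothing is within max_range."""
    rows, cols = depth_m.shape
    thirds = [slice(0, cols // 3),
              slice(cols // 3, 2 * cols // 3),
              slice(2 * cols // 3, cols)]
    bands = {"upper": slice(0, rows // 2), "lower": slice(rows // 2, rows)}

    zones = {}
    for side, cslice in zip(("left", "center", "right"), thirds):
        for band, rslice in bands.items():
            patch = depth_m[rslice, cslice]
            valid = patch[(patch > 0) & (patch < max_range)]
            zones[(side, band)] = float(valid.min()) if valid.size else None
    return zones

# Example: a synthetic 120x160 frame with one near object in the upper left.
frame = np.full((120, 160), 5.0)   # background beyond the 4 m warning range
frame[10:40, 5:30] = 1.1           # obstacle 1.1 m away
print(nearest_per_zone(frame))     # only ("left", "upper") reports a distance
```

Feeding each zone's nearest distance into a tone mapping like the earlier sketch would close the loop from depth sensing to audio feedback.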
“The tools really haven’t changed in 100 years,” Forcum said. “It’s a cane and a dog, essentially.”
See the full story here: http://www.washingtontimes.com/news/2016/dec/4/exchange-augmented-reality-in-app-for-visually-imp/