University of Florida researchers have combined speech recognition software, wearable computers, satellite positioning technology and other emerging technologies to create a navigational aid for the blind.
Composed of a waist-worn computer and headset connected remotely to a map database server, the prototype reportedly delivers verbal directions and responds to spoken commands. It keeps track of the user’s location while giving directions to a destination – and may even warn the user against veering off a pavement or stepping into a road.
‘When we started this project, we were looking for a compelling mobile application of wearable computing that would be not just for fun from a research perspective, but also useful to society,’ said Steve Moore, who designed the system.
Computer engineering Professor Sumi Helal and civil and coastal engineering doctoral student Balaji Ramachandran also helped with the project, which the researchers dubbed DRISHTI, after the Sanskrit word for vision.
While still in its early stages, the system is said to be a promising attempt to address the problem of helping the blind get around in a world designed for sighted people.
Speaking into the microphone, the user tells the system his or her location and desired destination. The system responds with directions based on the user’s starting point, saying, for example, to turn 15 degrees and walk along a pavement for 230 feet. If the user veers off the pavement or travels too far, the system provides a verbal correction. It also may warn against impediments or hazards.
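The kind of turn-and-distance instruction and off-path warning described above can be sketched in a few lines. This is a minimal illustration only, assuming a flat local coordinate grid measured in feet; the function names, the pavement-width threshold and the phrasing are assumptions, not details of the DRISHTI prototype.

```python
import math

def bearing_deg(x1, y1, x2, y2):
    """Compass-style bearing (degrees) from point 1 to point 2 on a local flat grid."""
    return math.degrees(math.atan2(x2 - x1, y2 - y1)) % 360

def instruction(user_x, user_y, user_heading_deg, wp_x, wp_y):
    """Tell the user how far to turn and how far to walk to reach the next waypoint."""
    # Signed turn in the range -180..180: positive means turn right.
    turn = (bearing_deg(user_x, user_y, wp_x, wp_y) - user_heading_deg + 540) % 360 - 180
    dist_ft = math.hypot(wp_x - user_x, wp_y - user_y)
    side = "right" if turn > 0 else "left"
    return f"Turn {abs(turn):.0f} degrees {side} and walk {dist_ft:.0f} feet."

def off_path_warning(dist_from_centreline_ft, half_width_ft=3.0):
    """Warn when the user drifts past the pavement edge (threshold is an assumption)."""
    if dist_from_centreline_ft > half_width_ft:
        return "Warning: you are veering off the pavement."
    return None

# A user facing due north with a waypoint 60 ft east and 222 ft north:
print(instruction(0, 0, 0, 60, 222))  # -> "Turn 15 degrees right and walk 230 feet."
print(off_path_warning(5.0))          # -> "Warning: you are veering off the pavement."
```

A real system would of course derive the heading and position from the GPS receiver rather than take them as arguments, and would smooth noisy fixes before issuing corrections.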
To achieve such contextual real-time directions, the system relies on numerous hardware and software components, both mobile and fixed.
In addition to the headset, a blind person using the system carries a small, commercially available personal computer, which contains voice-recognition and other software. The user also carries a cell phone for wireless communication, an antenna and a backpack containing a Global Positioning System receiver, batteries and other equipment.
Housed in a lab in UF’s computer engineering building, the database server holds a Geographic Information System (GIS) database of the UF campus. Far too immense to fit onto the wearable computer, the database contains the latitudes and longitudes of thousands of points of reference on campus, from pavements to buildings to streets. It also can be easily updated to include construction activities or other temporary landscape changes on campus.
The system matches the user’s location – obtained using Global Positioning System technology – with the information provided by the database server in real time. The voice-recognition software on the wearable computer provides the user interface to the data.
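The real-time matching step amounts to looking up the GPS fix against the database of reference points. The sketch below illustrates one plausible way to do that with a nearest-point search; the sample entries, coordinates and equirectangular distance approximation are all assumptions for illustration, not the actual UF database schema or lookup method.

```python
import math

# Invented sample entries: (feature name, latitude, longitude).
GIS_POINTS = [
    ("Weil Hall entrance",    29.6436, -82.3477),
    ("Stadium Road pavement", 29.6499, -82.3486),
    ("Museum Road crossing",  29.6440, -82.3430),
]

def nearest_feature(lat, lon):
    """Return the name of the GIS feature closest to the given GPS fix.

    Uses an equirectangular approximation, which is adequate over
    campus-scale distances.
    """
    def dist(point):
        _, plat, plon = point
        dlat = plat - lat
        dlon = (plon - lon) * math.cos(math.radians(lat))
        return math.hypot(dlat, dlon)
    return min(GIS_POINTS, key=dist)[0]

print(nearest_feature(29.6437, -82.3475))  # -> Weil Hall entrance
```

At campus scale a linear scan like this is fine; a database holding thousands of reference points would more likely use a spatial index so each fix can be resolved within the update rate of the GPS receiver.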
A demonstration revealed the promise of the system as well as some of its challenges. The system provided specific directions as requested, and communicating with the computer by voice was straightforward despite its limited vocabulary. However, the interaction style – more command and response than conversational – took some practice. Additionally, because the GIS database covers only the UF campus, the current system cannot be used outside the university. But in the future, Moore said, similar GIS databases could be made accessible for use in many other locations.
‘What you would like is to be able to offer this as a service,’ Moore said. ‘You go to a city and say, “OK, I need to be navigated,” and it taps into the GIS database for that city.’
‘If you’re out there all alone with only a cane, you can make a little turn here and there, and first thing you know you have no idea which way is west,’ said Theral Moore, UF professor of maths who is visually impaired. ‘But if you get directions from the voice telling you what direction you’re going, how far off course you are, and/or if you are leaving the middle of the pavement, you just feel much more comfortable.’
Steve Moore hopes to develop the system into a commercial product in the next two years.