An autonomous wheelchair can learn about the locations in a building and then take its occupant to a given place in response to a verbal command.
Because it can simply be told where to go, the user need not steer through every twist and turn of the route; the occupant can sit back and relax while the chair moves from place to place, guided by a map stored in its memory.
‘It’s a system that can learn and adapt to the user,’ says Nicholas Roy, assistant professor of aeronautics and astronautics at Massachusetts Institute of Technology (MIT) and co-developer of the wheelchair.
Unlike other attempts to program wheelchairs or other mobile devices, which rely on an intensive process of manually capturing a detailed map of a building, the MIT system learns about its environment much as a person would: by being taken on a single guided tour, with important places identified along the way.
Also collaborating on the project are Bryan Reimer, a research scientist at MIT’s Agelab, and Seth Teller, professor of computer science and engineering and head of the Robotics, Vision, and Sensor Networks (RVSN) group at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL).
The wheelchair prototype relies on a WiFi system to make its maps and then navigate through them, which requires setting up a network of WiFi nodes around the facility in advance.
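The article does not describe how the WiFi network is used for localization, but a common technique in systems like this is WiFi fingerprinting: during the guided tour, each named place is tagged with the signal strengths observed from the surrounding nodes, and the chair later estimates its location by matching current readings against those stored fingerprints. A minimal sketch of that idea, with hypothetical node names and signal values:

```python
# Hypothetical sketch of WiFi fingerprinting (the article does not
# specify the actual algorithm or node layout). During the guided tour,
# each named place is tagged with the signal strengths (in dBm) seen
# from nearby WiFi nodes; later, the chair estimates its location by
# finding the stored fingerprint closest to the current readings.
import math

# Fingerprints recorded on the tour: place name -> {node_id: signal_dBm}
tour_map = {
    "dining room": {"node_a": -40, "node_b": -70, "node_c": -85},
    "nurses' station": {"node_a": -75, "node_b": -45, "node_c": -60},
    "room 12": {"node_a": -88, "node_b": -62, "node_c": -41},
}

def distance(reading, fingerprint):
    """Euclidean distance between two signal-strength vectors."""
    nodes = set(reading) | set(fingerprint)
    # Treat a node that is out of range as a very weak signal (-100 dBm).
    return math.sqrt(sum(
        (reading.get(n, -100) - fingerprint.get(n, -100)) ** 2
        for n in nodes
    ))

def locate(reading):
    """Return the named place whose fingerprint best matches the reading."""
    return min(tour_map, key=lambda place: distance(reading, tour_map[place]))

current = {"node_a": -42, "node_b": -72, "node_c": -83}
print(locate(current))  # closest stored fingerprint: "dining room"
```

In a deployed system the fingerprints would feed a probabilistic localization filter rather than a single nearest-neighbor lookup, but the principle of matching live signal strengths against tour-time measurements is the same.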
After months of preliminary tests on campus, the researchers have begun trials with patients in a real nursing home, the Boston Home in Dorchester, a facility where all of the nearly 100 residents have partial or substantial loss of muscle control and use wheelchairs.
As the research progresses, Roy says he’d like to add a collision-avoidance system using detectors to prevent the chair from bumping into other wheelchairs, walls or other obstacles.
Teller says he hopes to add mechanical arms to the chairs, to aid the patients further by picking up and manipulating objects – everything from flipping a light switch to picking up a cup and bringing it to the person’s lips.
The research was funded by Nokia and Microsoft.
Nicholas Roy, left, assistant professor of aeronautics and astronautics, and Seth Teller, professor of computer science and engineering, stand next to the robotic wheelchair they co-designed, which can be navigated by vocal command.