Robot’s-eye view helps people with profound motor impairments

People with profound motor impairments could feed themselves and perform routine personal care tasks using an augmented reality interface to operate an assistive humanoid robot.

The web-based interface is said to display a "robot's eye view" of the machine's surroundings, helping users interact with the world through the robot.

The system could help make sophisticated robots more useful to people who do not have experience operating complex robotic systems.

Study participants interacted with the robot interface using standard assistive computer access technologies - such as eye trackers and head trackers - that they were already using to control their PCs.

A paper published in PLOS ONE reported on two studies showing how such "robotic body surrogates" - machines that can perform tasks similar to those of humans - could improve users' quality of life. The work could provide a foundation for developing faster and more capable assistive robots.

"We have taken the first step toward making it possible for someone to purchase an appropriate type of robot, have it in their home and derive real benefit from it," said Phillip Grice, a recent Georgia Institute of Technology PhD graduate who is first author of the paper.
