Non-invasive brain tech controls robotic arm

Researchers at Carnegie Mellon University have developed a non-invasive brain-computer interface (BCI) to control a robotic arm.

BCIs have seen considerable success in recent years, translating brain activity into robotic manoeuvres and showing potential to assist disabled people with everyday tasks. However, measuring brainwaves with the precision needed for fine control has, until now, only been possible by surgically placing implants in the brain, which brings issues of cost and patient risk.

BCIs that use non-invasive external sensing receive ‘dirtier’ signals, and as such cannot be used to exert the same level of control as brain implants. The Carnegie Mellon team, working in collaboration with the University of Minnesota, used novel sensing techniques combined with machine learning to improve the neural decoding of EEG (electroencephalogram) signals. These improvements facilitated the real-time continuous control of a robotic arm in two dimensions, smoothly following a cursor around a screen. The research is published in Science Robotics.
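To give a flavour of what "neural decoding" of EEG for continuous 2-D control involves, the toy sketch below maps EEG-like feature vectors to cursor velocity commands with a ridge-regression decoder. This is purely illustrative and is not the method from the Science Robotics paper: the feature dimensions, the linear decoder, and all variable names here are assumptions.

```python
# Illustrative sketch only: a linear (ridge) decoder from simulated EEG
# band-power features to 2-D velocity commands. Not the published method.
import numpy as np

rng = np.random.default_rng(0)

# Simulated training data: EEG features (e.g. band power over channels)
# paired with 2-D cursor velocities recorded during a pursuit task.
n_samples, n_features = 500, 64
X = rng.normal(size=(n_samples, n_features))             # EEG features
W_true = rng.normal(size=(n_features, 2))                # hidden mapping
Y = X @ W_true + 0.1 * rng.normal(size=(n_samples, 2))   # (vx, vy) targets

# Fit the ridge decoder: W = (X^T X + lam * I)^-1 X^T Y
lam = 1.0
W = np.linalg.solve(X.T @ X + lam * np.eye(n_features), X.T @ Y)

def decode_velocity(features: np.ndarray) -> np.ndarray:
    """Map one EEG feature vector to a 2-D velocity command."""
    return features @ W

# At run time, each incoming feature vector yields a velocity command
# that continuously steers the robotic arm (or an on-screen cursor).
v = decode_velocity(X[0])
print(v.shape)
```

In a real closed-loop system these velocity commands would be smoothed and sent to the arm controller many times per second, which is what makes the control feel continuous rather than step-wise.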

“This work represents an important step in non-invasive brain-computer interfaces, a technology that someday may become a pervasive assistive technology aiding everyone, like smartphones,” said Bin He, head of Carnegie Mellon’s Biomedical Engineering Department.
