Japanese researchers have used a non-invasive technique for decoding brain activity to make a robot mimic the hand movements of a human.
Advanced Telecommunications Research Institute International (ATR) and Honda Research Institute Japan (HRI) have collaboratively developed a new Brain Machine Interface (BMI) for manipulating robots using brain activity signals.
The BMI technology decodes natural brain activity and uses the extracted data for the near real-time operation of a robot, without any surgery to the head or brain. The researchers said this breakthrough opens up greater possibilities for new types of interfaces between machines and the human brain.
The BMI technology is based on an article titled “Decoding the perceptual and subjective contents of the human brain” by Dr. Yukiyasu Kamitani, a researcher at ATR Computational Neuroscience Laboratories. HRI and ATR developed the theory into a system for real-time brain activity decoding and robotic control.
This research shows that MRI-based neural decoding can allow a robot hand to mimic a subject’s finger movements by tracking blood flow in the brain as the subject makes the movement, a signal known as the haemodynamic response. The robot hand mimicked a human making “scissors, paper, stone” shapes. Although there is a time lag of approximately seven seconds between the subject’s movement and the robot’s mimicking movement, the researchers achieved a decoding accuracy of 85 per cent.
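The general idea behind this kind of decoding is pattern classification: each hand shape is assumed to evoke a characteristic pattern of activity across many brain voxels, and a decoder picks the shape whose template best matches a newly measured pattern. The toy sketch below illustrates that idea only; the voxel counts, noise levels, and nearest-template decoder are illustrative assumptions, not the researchers' actual method.

```python
import numpy as np

rng = np.random.default_rng(0)
SHAPES = ["stone", "scissors", "paper"]
N_VOXELS = 50  # illustrative; real fMRI decoding uses far more voxels

# Assume each shape evokes a characteristic activity pattern (a "template").
prototypes = {s: rng.normal(size=N_VOXELS) for s in SHAPES}

def simulate_trial(shape, noise=0.5):
    """Simulate one noisy haemodynamic activity pattern for a hand shape."""
    return prototypes[shape] + rng.normal(scale=noise, size=N_VOXELS)

def decode(pattern):
    """Nearest-template decoder: pick the shape whose template
    correlates best with the measured pattern."""
    scores = {s: np.corrcoef(pattern, p)[0, 1] for s, p in prototypes.items()}
    return max(scores, key=scores.get)

# Estimate decoding accuracy over simulated trials.
trials_per_shape = 100
correct = sum(decode(simulate_trial(s)) == s
              for _ in range(trials_per_shape) for s in SHAPES)
accuracy = correct / (trials_per_shape * len(SHAPES))
print(f"decoding accuracy: {accuracy:.0%}")
```

In a real system the templates would be learned from training data with a statistical classifier rather than fixed in advance, and the seven-second lag reflects the slow haemodynamic response itself, not the classification step.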
This technology is potentially applicable to other types of non-invasive brain measurement, such as the brain’s electric and magnetic fields and brain waves. By using these methods, the researchers expect that the same results could be achieved with a shorter time lag and more compact BMI devices.