Researchers at the University of Pittsburgh have demonstrated that a monkey can feed itself with a robotic arm simply by using signals from its brain, an advance that could enhance prosthetics for people, especially those with spinal cord injuries.
The robotic arm, or neural prosthesis, is about the size of a child’s arm and moves like a natural arm, with a fully mobile shoulder and elbow and a simple gripper that allows the monkey to grasp and hold food while its own arms are restrained.
The arm is wired into the monkey’s brain, intercepting signals through electrodes attached to tiny probes that tap into neuronal pathways in the motor cortex, a region of the brain responsible for voluntary movement. The neurons’ collective activity is fed through an algorithm developed at the University of Pittsburgh and then sent to the arm to tell it which direction to go.
“This is a breakthrough in the development of neural prosthetic devices that will someday lead to devices that could help people who are paralysed or who have lost limbs,” said Andrew Schwartz, Ph.D., professor of neurobiology at the University of Pittsburgh School of Medicine and senior researcher on the project.
According to Dr. Schwartz, a part of the brain that controls movement, the primary motor cortex, contains neurons whose firing rates rise and fall, like the clicks of a Geiger counter, depending on the direction of movement. The direction for which a neuron fires fastest is called its “preferred direction.” Many motor cortical cells change their firing rate for each movement, and this activity from the many neurons is routed through the spinal cord to different muscle groups to generate movement.
It takes thousands of neurons firing in concert to allow even the simplest of movements, and it would be impossible to tap into all of them, so the Pittsburgh team developed an algorithm to fill in the missing neuron signals, allowing them to get a usable signal from a manageable number of electrodes. The algorithm they developed to decode the cortical signals acts like a voting machine, using each cell’s preferred direction as a label and taking a continuous tally of the population throughout the intended movement.
Monkeys were trained to reach for targets, and once the electrodes were in place, the algorithm was calibrated while the animals’ arms were restrained, under the assumption that each animal was intending to reach for the targets.
“Each cell is movement-sensitive and has a preferred direction, and each cell’s preferred direction is like a vote,” said Chance Spalding, a bioengineering graduate student in Dr. Schwartz’s lab who presented the findings. “When all of the votes are added up it gives us the population vector.” These population vectors accurately predict the velocity of normal arm movement, and in the case of this prosthetic, serve as the control signal to convey the monkey’s intention to the prosthetic arm.
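The population-vector idea described above can be sketched in a few lines of code. In this hypothetical illustration (the cell counts, firing rates, and cosine-tuning model are invented for demonstration and are not the Pittsburgh lab’s actual decoder), each simulated neuron “votes” for its preferred direction, weighted by how far its firing rate departs from baseline, and the weighted votes are tallied into a single population vector:

```python
import numpy as np

rng = np.random.default_rng(0)

n_neurons = 50
# Each neuron's preferred direction: a random unit vector in 3-D space.
raw = rng.normal(size=(n_neurons, 3))
preferred = raw / np.linalg.norm(raw, axis=1, keepdims=True)

def population_vector(firing_rates, baseline_rates):
    """Tally the 'votes': sum each cell's preferred direction,
    weighted by how much its firing rate deviates from baseline."""
    weights = firing_rates - baseline_rates
    return (weights[:, None] * preferred).sum(axis=0)

# Simulate firing rates for an intended movement toward `target`,
# using simple cosine tuning: a cell fires faster the closer the
# movement direction is to its preferred direction.
target = np.array([1.0, 0.0, 0.0])
baseline = np.full(n_neurons, 20.0)          # spikes/s at rest (invented)
rates = baseline + 15.0 * (preferred @ target)

decoded = population_vector(rates, baseline)
decoded /= np.linalg.norm(decoded)           # keep only the direction
print(decoded)  # points approximately toward the target direction
```

With enough cells whose preferred directions cover space evenly, the tally converges on the intended movement direction, which is why the population vector can serve as a continuous velocity control signal for the arm.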
Because the software could draw on only a small fraction of the thousands of neurons involved in moving the arm, the monkey did the rest of the work, learning through biofeedback how to refine the arm’s movements by modifying the firing rates of the recorded neurons.
For the task, food was placed at different locations in front of the monkey, and the animal, with its own arms restrained, used the robotic arm to bring the food to its mouth.
“The next step with this device is to add realistic hand and finger movement,” said Meel Velliste, Ph.D., a postdoctoral fellow in the Schwartz lab. “This presents quite a challenge because there are hundreds of different subtle movements we make with our hands and we will need to interpret all of them.”