The human hand is a fiendishly complicated mechanism whose control system is even more obscure. Engineers designing prosthetic hands have struggled for centuries to let users move them naturally. The best technology until now has involved picking up electrical signals from muscles in an amputee's remaining stump and using those to control motors in the wrist, finger and thumb joints of the prosthetic. Although a major step forward from primitive unpowered hands, this is still not an ideal solution, because users have to learn which muscle twitches to use, and the hand's systems also have to be taught what the patterns of twitches are meant to mean.
In tests, the prosthetic hand was used to mimic the movements of an able-bodied volunteer. Image: Lizhi Pan, NC State University
Research from the joint biomedical engineering program at North Carolina State University and the University of North Carolina at Chapel Hill promises to make using a prosthetic hand a much more natural experience. Relying on computer models that mimic the behaviour of natural structures in the human forearm, the researchers, led by Prof He (Helen) Huang, a biomedical engineer, have developed a generic musculoskeletal model that takes the place of an amputee's missing muscles, joints and bones to generate control signals for the prosthetic.
Unlike existing myoelectric control systems, this technology does not rely on machine learning to generate control algorithms. "Every time you change your posture, your neuromuscular signals for generating the same hand/wrist motion change," Huang explained. "So relying solely on machine learning means teaching the device to do the same thing multiple times; once for each different posture, once for when you are sweaty versus when you are not, and so on. Our approach bypasses most of that."
Huang's team recruited six able-bodied volunteers, and placed myoelectric sensors on their forearms. These tracked exactly which neuromuscular signals were sent when they performed various actions with their wrists and hands. These data were then used to create the generic musculoskeletal model.
"When someone loses a hand, their brain is networked as if the hand is still there," Huang said. "So, if someone wants to pick up a glass of water, the brain still sends those signals to the forearm. We use sensors to pick up those signals and then convey that data to a computer, where it is fed into a virtual musculoskeletal model." In tests so far, the model has been able to control a prosthetic arm and wrist "in a coordinated way and in real time, more closely resembling fluid, natural motion," Huang claimed.
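The published paper describes the team's actual model, but the general idea of model-based control can be sketched in miniature: rather than training a classifier on signal patterns, recorded muscle activations drive a simulated joint whose physics produces the control output. The toy example below is purely illustrative; the function name, the antagonist-pair simplification, and every parameter value are assumptions for demonstration, not the researchers' implementation.

```python
# Illustrative sketch (not the team's actual model): estimate a wrist joint
# angle from two antagonist muscle activations using a toy musculoskeletal
# model. All parameters and values here are assumed for demonstration.

def wrist_angle(flexor_act, extensor_act, dt=0.01, steps=100):
    """Integrate a damped joint driven by an antagonist muscle pair.

    flexor_act / extensor_act: normalised EMG activations in [0, 1].
    Returns the joint angle in radians after `steps` time steps.
    """
    MAX_TORQUE = 2.0   # N*m per unit activation (assumed)
    STIFFNESS = 1.0    # passive elastic return toward neutral (assumed)
    DAMPING = 0.5      # viscous damping of the joint (assumed)

    angle, velocity = 0.0, 0.0
    for _ in range(steps):
        # Net torque: muscle drive minus passive elastic and damping terms.
        torque = (MAX_TORQUE * (flexor_act - extensor_act)
                  - STIFFNESS * angle - DAMPING * velocity)
        velocity += torque * dt   # unit joint inertia assumed
        angle += velocity * dt
    return angle

# Flexing harder than extending drives the simulated wrist into flexion
# (a positive angle); the reverse produces extension (a negative angle).
print(round(wrist_angle(0.8, 0.2), 3))
```

Because the physics is generic rather than learned per user, the same model can, in principle, respond sensibly to activation patterns it has never seen, which is the advantage Huang describes over pure machine-learning approaches.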
In early testing, both able-bodied and amputee volunteers were able to use the model-controlled interface to perform all of the hand and wrist motions the team decided to test, despite having very little training. Before beginning full clinical trials, the team is looking for trans-radial amputee volunteers (who have lost an arm between the elbow and the wrist) to test how well the technology copes with activities of daily living.
"To be clear, we are still years away from having this become commercially available for clinical use," Huang said. "And it is difficult to predict potential cost, since our work is focused on the software, and the bulk of cost for amputees would be in the hardware that actually runs the program. However, the model is compatible with available prosthetic devices."
The use of the device is not limited to prosthetics. Huang also suggests that it might be useful in developing computer interfaces for able-bodied people, which might be used for computer gaming or manipulating objects in CAD programs.
The team described the research in a paper in the journal IEEE Transactions on Neural Systems and Rehabilitation Engineering. Among the funders of the research were DARPA and the US National Institute on Disability, Independent Living, and Rehabilitation Research.
See the hand mimic the movements of an able-bodied volunteer in this video, courtesy of Lizhi Pan, NC State: