New gesture recognition strategy advances prosthetic hand technology

Researchers from the Beijing Institute of Technology and The University of Electro-Communications in Tokyo have developed a new method to improve the functionality of bionic prosthetic hands.

The novel method recognises hand gestures in prosthetic devices using electromyography (EMG), a technique that records the electrical activity produced by skeletal muscles.

Specifically, the researchers used a new strategy called ‘Virtual-Dimension Increase of EMG,’ which allows the EMG system to interpret the user's intent more accurately without the need for additional physical sensors.

Their approach increased the number of virtual EMG signal channels, enhancing the system's ability to discern subtle differences in muscle activity and thus predict a wider range of hand gestures.
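The article does not spell out how the virtual channels are constructed, but the idea of enriching a fixed set of physical EMG electrodes with derived signals can be illustrated with a minimal sketch. Here, purely as an assumption for illustration, each virtual channel is the difference between a pair of real channels; the function name and the choice of pairwise differences are hypothetical, not the paper's method.

```python
import numpy as np

def add_virtual_channels(emg):
    """Append 'virtual' channels to a raw EMG window.

    emg: array of shape (n_samples, n_channels) from physical electrodes.
    Each virtual channel here is the difference of a pair of real
    channels, so n real channels yield n + n*(n-1)/2 total channels
    without adding any physical sensors.
    """
    n_samples, n_ch = emg.shape
    virtual = [emg[:, i] - emg[:, j]
               for i in range(n_ch) for j in range(i + 1, n_ch)]
    return np.column_stack([emg] + virtual)

# Example: 4 real channels -> 4 + C(4,2) = 10 total channels
window = np.random.randn(200, 4)   # a 200-sample EMG window
augmented = add_virtual_channels(window)
print(augmented.shape)             # (200, 10)
```

A classifier fed the augmented array sees a higher-dimensional feature space, which is the sense in which the "virtual-dimension increase" can expose subtler differences in muscle activity.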

The researchers noted that electromyography has long been a cornerstone of intelligent bionic prostheses, translating muscle activity into data that can drive the movements of a prosthetic hand. However, traditional systems capture more data by adding physical sensors, which can complicate the device without necessarily improving performance.

“By virtually increasing the number of EMG channels, we can enrich the motion intention information extracted, avoiding the pitfalls of system overcomplexity and maintaining user comfort,” lead researcher Yuxuan Wang, from the School of Mechatronical Engineering, Beijing Institute of Technology, said in a statement.

Through multiple rounds of testing, the method showed a ‘notable improvement’ in gesture recognition accuracy. The researchers hope it will make prosthetic hands not only more responsive but also more accessible, since it reduces the need for extensive sensor arrays, which can be bulky and expensive.

The research team also introduced a new quantitative measure called ‘separability of feature vectors’ (SFV), which can predict classification success before actual gesture recognition takes place. According to the team, this metric is crucial for assessing the potential of different gesture recognition setups and ensuring that the prosthetic hand is finely tuned to the user’s individual needs.
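The article does not give the SFV formula, but a score that predicts classification success from feature vectors alone can be sketched with a standard Fisher-style criterion: between-class scatter divided by within-class scatter. This is an illustrative stand-in under that assumption, not the paper's actual SFV definition.

```python
import numpy as np

def separability(features, labels):
    """Fisher-style separability: between-class scatter over
    within-class scatter. Higher values suggest gesture classes
    will be easier to distinguish, before any classifier is trained.
    """
    overall_mean = features.mean(axis=0)
    between, within = 0.0, 0.0
    for c in np.unique(labels):
        x = features[labels == c]
        mean_c = x.mean(axis=0)
        between += len(x) * np.sum((mean_c - overall_mean) ** 2)
        within += np.sum((x - mean_c) ** 2)
    return between / within

# Well-separated gesture clusters score higher than overlapping ones
rng = np.random.default_rng(0)
far = np.vstack([rng.normal(0, 1, (50, 8)), rng.normal(5, 1, (50, 8))])
near = np.vstack([rng.normal(0, 1, (50, 8)), rng.normal(0.5, 1, (50, 8))])
y = np.array([0] * 50 + [1] * 50)
print(separability(far, y) > separability(near, y))   # True
```

A metric of this kind lets a setup (electrode placement, channel expansion, feature choice) be ranked cheaply before committing to full training, which matches the role the team describes for SFV.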

Looking ahead, the researchers hope that, as their method develops, it could lead to more intuitive and accessible interfaces for a variety of devices, making everyday tasks easier for those with limb differences – as well as in wider applications such as rehabilitation and robotics.

The research paper is published in the journal Cyborg and Bionic Systems.