Wrist movements and sensors enable robotic hand to grasp and hold onto objects

Wrist movements and tactile ‘skin’ sensors are helping a robotic hand to grasp – and not drop – a range of objects.

Researchers have designed a low-cost, energy-efficient robotic hand that can grasp a range of objects – and not drop them – using just the movement of its wrist and the feeling in its ‘skin’.

Researchers from the University of Cambridge designed the soft, 3D-printed robotic hand, which cannot independently move its fingers but can still perform a range of complex movements.

According to the team, the robot hand was trained to grasp different objects and was able to predict whether it would drop them by using the information provided by tactile sensors placed on its ‘skin’. This type of passive movement makes the robot far easier to control and far more energy-efficient than robots with fully motorised fingers. The team’s results are detailed in Advanced Intelligent Systems.
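As a rough illustration of the drop-prediction idea, the Python sketch below trains a simple classifier to flag grasps likely to fail from tactile readings. All feature names, data and the choice of classifier are assumptions made for illustration; the paper's actual sensing hardware and learning pipeline are not reproduced here.

```python
# Minimal sketch: predicting grasp failure from tactile readings.
# All sensor features and data here are hypothetical, not from the paper.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Pretend each grasp attempt yields a feature vector summarising the
# tactile 'skin' signals over a short window: [mean pressure,
# pressure variance, rate of change].
n_attempts = 200
features = rng.normal(size=(n_attempts, 3))
# Synthetic labelling rule: fast-changing, low-pressure contact -> drop.
labels = (features[:, 2] - features[:, 0] > 0.5).astype(int)  # 1 = will drop

clf = LogisticRegression().fit(features, labels)

# At grasp time, the hand could query the model with live readings and
# adapt (e.g. retry the grasp) whenever a drop is predicted.
live_reading = np.array([[0.1, 0.3, 0.9]])
print("predicted drop probability:", clf.predict_proba(live_reading)[0, 1])
```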

Humans instinctively know how much force to use when picking up a fragile item like an egg, but for a robot this is a challenge: too much force, and the egg could shatter; too little, and the robot could drop it. In addition, a fully actuated robot hand, with motors for each joint in each finger, requires a significant amount of energy. In Professor Fumiya Iida’s Bio-Inspired Robotics Laboratory in Cambridge’s Department of Engineering, researchers have been developing potential solutions to both problems.

“In earlier experiments, our lab has shown that it’s possible to get a significant range of motion in a robot hand just by moving the wrist,” said co-author Dr Thomas George-Thuruthel, who is now based at University College London (UCL) East. “We wanted to see whether a robot hand based on passive movement could not only grasp objects, but would be able to predict whether it was going to drop the objects or not, and adapt accordingly.”

The team carried out more than 1,200 tests with the robot hand, observing its ability to grasp small objects without dropping them. The robot was initially trained using small 3D-printed plastic balls, which it grasped using a pre-defined action obtained through human demonstrations.
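One way to picture a pre-defined action obtained from demonstration is as a recorded wrist trajectory replayed open-loop. The sketch below shows this idea in Python; `WristInterface`, its `move_to` method and the pose values are hypothetical placeholders, not an API or data from the study.

```python
# Minimal sketch of a pre-defined grasp action: a wrist trajectory
# recorded once from a human demonstration and replayed verbatim.
import time

class WristInterface:
    """Hypothetical stand-in for the robot's wrist controller."""
    def move_to(self, pose):
        # A real implementation would command the wrist actuators here.
        print(f"wrist -> roll={pose[0]:.2f}, pitch={pose[1]:.2f}, yaw={pose[2]:.2f}")

# Trajectory captured during a human demonstration: a sequence of
# (roll, pitch, yaw) wrist poses sampled at fixed intervals.
demonstrated_trajectory = [
    (0.00, 0.00, 0.0),
    (0.10, -0.20, 0.0),
    (0.25, -0.35, 0.1),   # wrist rolls over the object...
    (0.30, -0.30, 0.1),   # ...and the passive fingers wrap around it
]

def replay(wrist, trajectory, dt=0.5):
    """Replay a demonstrated wrist trajectory open-loop."""
    for pose in trajectory:
        wrist.move_to(pose)
        time.sleep(dt)

replay(WristInterface(), demonstrated_trajectory)
```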

“This kind of hand has a bit of springiness to it: it can pick things up by itself without any actuation of the fingers,” said first author Dr Kieran Gilday, who is now based at EPFL in Lausanne, Switzerland. “The tactile sensors give the robot a sense of how well the grip is going, so it knows when it’s starting to slip. This helps it to predict when things will fail.”
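To make the slip-sensing idea concrete, here is a minimal Python sketch that flags a grasp as slipping when the contact pressure falls faster than a threshold. The signal shape, sampling rate and threshold are illustrative assumptions, not values from the paper.

```python
# Minimal sketch of slip detection: flag a grasp as slipping when the
# tactile pressure signal drops quickly. All numbers are assumptions.
import numpy as np

def detect_slip(pressure_trace, dt=0.01, slip_rate=-5.0):
    """Return the first time (seconds) the contact pressure falls
    faster than `slip_rate` (units per second), or None if stable."""
    rates = np.diff(pressure_trace) / dt
    slipping = np.where(rates < slip_rate)[0]
    return None if slipping.size == 0 else slipping[0] * dt

# A synthetic trace: steady contact, then the object starts to slide.
t = np.arange(0, 1, 0.01)
trace = np.where(t < 0.6, 1.0, 1.0 - 8.0 * (t - 0.6))
print("slip detected at t =", detect_slip(trace))
```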


The robot used trial and error to learn what kind of grip would be successful. After finishing the training with the balls, it then attempted to grasp different objects including a peach, a computer mouse and a roll of bubble wrap. In these tests, the hand was able to grasp 11 of 14 objects.
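A trial-and-error phase of this kind can be sketched as a simple loop: attempt a grasp, record the outcome, and keep the best-performing grip parameters. In the Python sketch below, `attempt_grasp` and the single wrist-angle parameter are hypothetical simplifications of whatever the real robot varied.

```python
# Minimal sketch of trial-and-error grip learning: sweep a grip
# parameter, record success rates, and keep the best setting.
import random

def attempt_grasp(wrist_angle):
    """Hypothetical trial: success is more likely near some sweet spot."""
    return random.random() < max(0.0, 1.0 - abs(wrist_angle - 0.3) * 2)

results = {}
for angle in [0.0, 0.1, 0.2, 0.3, 0.4, 0.5]:
    successes = sum(attempt_grasp(angle) for _ in range(50))
    results[angle] = successes / 50

best = max(results, key=results.get)
print(f"best wrist angle: {best} (success rate {results[best]:.0%})")
```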

“The robot learns that a combination of a particular motion and a particular set of sensor data will lead to failure, which makes it a customisable solution,” Gilday said in a statement. “The hand is very simple, but it can pick up a lot of objects with the same strategy.”

“We can get lots of good information and a high degree of control without any actuators, so that when we do add them, we’ll get more complex behaviour in a more efficient package,” said Iida.

Beyond the energy it requires, a fully actuated robotic hand also poses a complex control problem. The passive design of the Cambridge hand, which relies on a small number of sensors, is easier to control, provides a wide range of motion, and streamlines the learning process.

In future, the system could be expanded in several ways, such as by adding computer vision capabilities, or teaching the robot to exploit its environment, which would enable it to grasp a wider range of objects.

This work was funded by UK Research and Innovation (UKRI), and Arm Ltd. Fumiya Iida is a Fellow of Corpus Christi College, Cambridge.