The wrist-worn device, created at the Massachusetts Institute of Technology (MIT), is said to operate like two extra fingers adjacent to the little finger and thumb.
A novel control algorithm enables it to move in sync with the wearer’s fingers to grasp objects of various shapes and sizes. Wearing the robot, a user could use one hand to hold the base of a bottle while twisting off its cap.
‘This is a completely intuitive and natural way to move your robotic fingers,’ said Harry Asada, the Ford Professor of Engineering in MIT’s Department of Mechanical Engineering. ‘You do not need to command the robot, but simply move your fingers naturally. Then the robotic fingers react and assist your fingers.’
In a statement, Asada said that with some training people might come to perceive the robotic fingers as part of their body.
He hopes that the two-fingered robot may assist people with limited dexterity in performing routine household tasks, such as opening jars and lifting heavy objects. He and graduate student Faye Wu presented a paper on the robot this week at the Robotics: Science and Systems conference in Berkeley, California.
The robot, which the researchers have dubbed ‘supernumerary robotic fingers’, consists of actuators linked together to exert forces as strong as those of human fingers during a grasping motion.
To develop an algorithm to coordinate the robotic fingers with a human hand, the researchers first looked to the physiology of hand gestures, learning that a hand’s five fingers are highly coordinated. While a hand may reach out and grab an orange differently from, say, a mug, just two general patterns of motion underlie grasping: bringing the fingers together, and twisting them inwards. A grasp of any object can be explained through a combination of these two patterns.
The researchers hypothesized that a similar biomechanical synergy may exist not only among the five human fingers, but across all seven, the five human fingers plus the two robotic ones. To test the hypothesis, Wu wore a glove outfitted with multiple position-recording sensors, with the robot attached to her wrist via a light brace.
Wu then grasped a range of objects and manually positioned the robotic fingers to support the object. She recorded hand and robotic joint angles multiple times with various objects, then analysed the data, finding that every grasp could be explained by a combination of two or three general patterns among all seven fingers.
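The kind of analysis described here, finding a small number of patterns that explain many recorded joint angles, is what principal component analysis does. The sketch below is a hypothetical illustration, not the researchers’ actual pipeline: it simulates grasp recordings driven by two underlying synergy patterns (the vectors and sample counts are invented), then shows that PCA recovers the fact that two components explain nearly all the variance.

```python
import numpy as np

# Hypothetical illustration of synergy analysis: each row is one recorded
# grasp, each column a joint angle (here one angle per finger, 5 human
# fingers + 2 robotic fingers = 7; the real system records more joints).
rng = np.random.default_rng(0)

# Simulate grasps driven by two underlying "synergy" patterns, echoing the
# two patterns described in the article (invented numbers, for illustration).
synergy_close = np.array([1.0, 0.9, 0.8, 0.7, 0.6, 0.9, 0.8])   # fingers close in
synergy_twist = np.array([0.2, -0.3, 0.4, -0.5, 0.3, -0.2, 0.4])  # fingers twist in
weights = rng.normal(size=(40, 2))                # 40 grasps, 2 weights each
angles = weights @ np.vstack([synergy_close, synergy_twist])
angles += 0.01 * rng.normal(size=angles.shape)    # small sensor noise

# Principal component analysis via SVD: how many patterns explain the data?
centered = angles - angles.mean(axis=0)
_, s, _ = np.linalg.svd(centered, full_matrices=False)
explained = (s ** 2) / (s ** 2).sum()             # variance explained per component
print(np.round(explained, 3))                     # first two components dominate
```

Because the simulated data were built from exactly two patterns plus a little noise, the first two variance ratios account for almost everything, mirroring the finding that two or three patterns explained every grasp.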
The researchers used this information to develop a control algorithm to correlate the postures of the two robotic fingers with those of the five human fingers. Asada explained that the algorithm essentially ‘teaches’ the robot to assume a certain posture that the human expects the robot to take.
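One simple way to realize a posture correlation like the one described, purely as an assumed sketch rather than the team’s actual algorithm, is to fit a linear map from human finger angles to robotic finger angles on the demonstration data, then use that map at run time. All dimensions and values below are invented for illustration.

```python
import numpy as np

# Hypothetical sketch: learn a linear map from 5 human finger angles to the
# 2 robotic finger angles using least squares on recorded demonstrations.
rng = np.random.default_rng(1)

human = rng.uniform(0.0, 1.5, size=(60, 5))       # demo human joint angles (rad)
true_map = rng.normal(size=(5, 2))                # unknown human-to-robot relation
robot = human @ true_map + 0.01 * rng.normal(size=(60, 2))  # taught robot postures

# Fit W so that robot ≈ human @ W.
W, *_ = np.linalg.lstsq(human, robot, rcond=None)

# At run time, the wearer's current hand posture predicts the robot posture,
# so the robot assumes the pose the human expects without explicit commands.
new_hand = rng.uniform(0.0, 1.5, size=5)
predicted_robot_pose = new_hand @ W
print(np.round(predicted_robot_pose, 3))
```

With enough demonstrations and low sensor noise, the fitted map closely recovers the underlying relationship, which is why the robot can ‘assume the posture the human expects’ from the hand state alone.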
For now, the robot mimics the grasping of a hand, closing in and spreading apart in response to a human’s fingers. Wu would like to take the robot one step further, controlling not just position but also force.
‘Right now we’re looking at posture, but it’s not the whole story,’ Wu said. ‘There are other things that make a good, stable grasp. With an object that looks small but is heavy, or is slippery, the posture would be the same, but the force would be different, so how would it adapt to that? That’s the next thing we’ll look at.’
Wu also notes that certain gestures may differ slightly from person to person, and ultimately, a robotic aid may have to account for personal grasping preferences. To that end, she envisions developing a library of human and robotic gesture correlations. As a user works with the robot, it could learn to adapt to match his or her preferences, discarding others from the library.