MIT robotic hand identifies objects in single grasp

A team at MIT has developed a robotic hand that uses high-resolution touch sensing to accurately identify an object after gripping it just once.

MIT researchers developed a soft-rigid robotic finger that incorporates powerful sensors along its entire length, enabling them to produce a robotic hand that could accurately identify objects after only one grasp - Courtesy of the researchers

Unlike other designs, MIT’s robotic finger is built with a rigid skeleton encased in a soft outer layer containing high-resolution sensors under its transparent ‘skin.’ The sensors use a camera and LEDs to gather visual information about an object’s shape and provide continuous sensing along the length of the finger.

Using this design, the researchers built a three-fingered robotic hand that could identify objects after one grasp with about 85 per cent accuracy. The rigid skeleton makes the fingers strong enough to pick up heavy items, while the soft skin enables them to securely grasp a pliable item without crushing it.

“Having both soft and rigid elements is very important in any hand, but so is being able to perform great sensing over a really large area, especially if we want to consider doing very complicated manipulation tasks like what our own hands can do. Our goal with this work was to combine all the things that make our human hands so good into a robotic finger that can do tasks other robotic fingers can’t currently do,” said mechanical engineering graduate student Sandra Liu, co-lead author of a research paper on the robotic finger.

The robotic finger comprises a rigid, 3D-printed endoskeleton placed in a mould and encased in a transparent silicone skin. The researchers designed the mould with a curved shape so the robotic fingers are slightly curved when at rest, like human fingers.

“Silicone will wrinkle when it bends, so we thought that if we have the finger moulded in this curved position, when you curve it more to grasp an object, you won’t induce as many wrinkles. Wrinkles are good in some ways - they can help the finger slide along surfaces very smoothly and easily - but we didn’t want wrinkles that we couldn’t control,” Liu said in a statement.

The endoskeleton of each finger contains a pair of GelSight touch sensors, embedded into the top and middle sections, underneath the transparent skin. The sensors are placed so the range of the cameras overlaps slightly, giving the finger continuous sensing along its entire length.

According to MIT, the GelSight sensor is composed of a camera and three coloured LEDs; when the finger grasps an object, the camera captures images as the coloured LEDs illuminate the skin from the inside.

Using the illuminated contours that appear in the soft skin, an algorithm works backwards to map the contours of the grasped object’s surface. The researchers trained a machine-learning model to identify objects using raw camera image data.
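GelSight-style sensors are typically calibrated so that the colour produced by the differently positioned LEDs at each pixel can be mapped back to the local surface slope, which is then integrated into a depth map. The sketch below illustrates that general idea in Python; the linear RGB-to-gradient model and the Frankot-Chellappa integration step are illustrative assumptions, not details reported by the MIT team.

```python
import numpy as np

def rgb_to_gradients(img, A):
    """Map each pixel's RGB value to surface gradients (p, q).

    A real GelSight pipeline uses a calibrated lookup table; here it
    is approximated by a linear model A (3 -> 2), an illustrative
    assumption rather than the researchers' actual method.
    """
    h, w, _ = img.shape
    grads = img.reshape(-1, 3) @ A           # shape (h*w, 2)
    return grads[:, 0].reshape(h, w), grads[:, 1].reshape(h, w)

def integrate_gradients(p, q):
    """Frankot-Chellappa integration: recover the depth map whose
    x/y gradients best match (p, q), solved in the Fourier domain."""
    h, w = p.shape
    wx = np.fft.fftfreq(w) * 2 * np.pi
    wy = np.fft.fftfreq(h) * 2 * np.pi
    WX, WY = np.meshgrid(wx, wy)
    denom = WX**2 + WY**2
    denom[0, 0] = 1.0                        # avoid divide-by-zero at DC
    Z = (-1j * WX * np.fft.fft2(p) - 1j * WY * np.fft.fft2(q)) / denom
    Z[0, 0] = 0.0                            # depth is recovered up to an offset
    return np.real(np.fft.ifft2(Z))

# Toy usage: a random calibration matrix and a random tactile image.
A = np.random.randn(3, 2) * 0.01
img = np.random.rand(240, 320, 3).astype(np.float32)
p, q = rgb_to_gradients(img, A)
depth = integrate_gradients(p, q)
```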

The researchers encountered challenges during fabrication, such as the silicone skin peeling off over time. This was overcome by adding small curves along the hinges between the joints in the endoskeleton.

When the finger bends, the bending of the silicone is distributed along the tiny curves, which reduces stress and prevents peeling. They also added creases to the joints so the silicone is not squashed as much when the finger bends.

Once they had perfected the design, the researchers built a robotic hand using two fingers arranged in a Y pattern with a third finger as an opposing thumb. The hand captures six images when it grasps an object and sends those images to a machine-learning algorithm which uses them as inputs to identify the object.

Because the hand has tactile sensing covering all its fingers, it can gather rich tactile data from a single grasp.
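The article does not describe the model itself, so the following is a minimal, hypothetical PyTorch sketch of how the six tactile images from one grasp (two sensors on each of three fingers) might be stacked and classified; the GraspClassifier name, layer sizes and image resolution are assumptions for illustration, not the MIT architecture.

```python
import torch
import torch.nn as nn

class GraspClassifier(nn.Module):
    """Small CNN that classifies an object from the six tactile RGB
    images captured in a single grasp, stacked into an 18-channel
    input. A hypothetical stand-in for the researchers' model."""

    def __init__(self, num_classes):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(18, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),          # pool to one value per channel
        )
        self.head = nn.Linear(64, num_classes)

    def forward(self, x):                     # x: (batch, 18, H, W)
        return self.head(self.features(x).flatten(1))

# Toy usage: one grasp of six RGB images at 240x320 pixels.
images = torch.rand(1, 6, 3, 240, 320)
model = GraspClassifier(num_classes=10)
logits = model(images.flatten(1, 2))          # merge image and colour dims
predicted = logits.argmax(dim=1)
```

Stacking the images as input channels is just one straightforward way to let a single network see the whole grasp at once; per-image encoders with a fusion step would be an equally plausible design.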

The research will be presented this week at the 2023 RoboSoft Conference in Singapore.