Flexible sensor skin gives robots a sense of dexterity

Robots could soon handle objects with the same dexterity as humans thanks to a flexible sensor skin developed by engineers from the University of Washington and UCLA.

[Image] Bio-inspired sensor skin wraps around a finger or any other part of a robot to help convey touch (credit: UCLA Engineering)

The skin can be stretched over any part of a robot's body, or over a prosthetic, to accurately convey information about the shear forces and vibrations that are critical to grasping and manipulating objects.

The bio-inspired robot sensor skin mimics the way a human finger experiences tension and compression as it slides along a surface or distinguishes among textures. It measures this tactile information with precision similar to that of human skin and is described in a paper published in Sensors and Actuators A: Physical.

“Robotic and prosthetic hands are really based on visual cues right now – such as, ‘Can I see my hand wrapped around this object?’ or ‘Is it touching this wire?’ But that’s obviously incomplete information,” said senior author Jonathan Posner, a UW professor of mechanical engineering and of chemical engineering.

“If a robot is going to dismantle an improvised explosive device, it needs to know whether its hand is sliding along a wire or pulling on it. To hold on to a medical instrument, it needs to know if the object is slipping. This all requires the ability to sense shear force, which no other sensor skin has been able to do well,” Posner said.

Some robots use fully instrumented fingers, but that sense of "touch" is limited to that appendage. Another approach is to wrap a robot appendage in a sensor skin, which provides better design flexibility, but such skins have yet to provide a full range of tactile information.

“Traditionally, tactile sensor designs have focused on sensing individual modalities: normal forces, shear forces or vibration exclusively. However, dexterous manipulation is a dynamic process that requires a multimodal approach. The fact that our latest skin prototype incorporates all three modalities creates many new possibilities for machine learning-based approaches for advancing robot capabilities,” said co-author Veronica Santos, a UCLA associate professor of mechanical and aerospace engineering.

The new stretchable electronic skin, which was manufactured at the UW’s Washington Nanofabrication Facility, is made from silicone rubber that is embedded with serpentine channels filled with electrically conductive liquid metal that won’t crack or fatigue when the skin is stretched.

When the skin is placed around a robot finger or end effector, these microfluidic channels are strategically placed on either side of where a human fingernail would be. When humans slide a finger across a surface, one side of the nailbed bulges out while the other side becomes taut under tension. The same thing happens with the robot or prosthetic finger because the microfluidic channels on one side of the nailbed compress while the ones on the other side stretch out.

When the channel geometry changes, so does the electrical resistance of the liquid metal inside: stretching makes a channel longer and narrower, raising its resistance, while compression does the opposite. The research team measures these differences in resistance and correlates them with the shear forces and vibrations that the robot finger is experiencing.
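The relationship described above can be sketched in a few lines of code. This is a hypothetical illustration, not the authors' implementation: it assumes the textbook resistance formula R = ρL/A for a conductive channel, made-up channel dimensions, and a simple differential readout of two channels on opposite sides of the "nailbed" whose sign indicates shear direction.

```python
# Hypothetical sketch (not the paper's code): inferring shear direction from
# the differential resistance of two liquid-metal microfluidic channels.
# Assumes R = rho * L / A, with illustrative dimensions and resistivity.

def channel_resistance(rho, length, area):
    """Resistance of a liquid-metal channel: R = rho * L / A."""
    return rho * length / area

def shear_signal(r_side_a, r_side_b, r_rest):
    """Differential signal, normalized by the at-rest resistance.
    Positive when side A stretches (resistance up) and side B
    compresses (resistance down); the sign encodes shear direction."""
    return (r_side_a - r_side_b) / r_rest

# Illustrative values: eGaIn-like resistivity, 5 cm channel, small cross-section.
rho, length, area = 29.4e-8, 0.05, 1e-8
r_rest = channel_resistance(rho, length, area)

# Sliding the finger one way stretches side A ~1% (longer, thinner)
# and compresses side B ~1% (shorter, wider).
r_a = channel_resistance(rho, length * 1.01, area / 1.01)
r_b = channel_resistance(rho, length * 0.99, area / 0.99)

print(shear_signal(r_a, r_b, r_rest) > 0)  # positive: shear toward side B
```

In practice the skin's readout would also track the high-frequency component of these resistance signals to capture vibration, the third modality the prototype senses.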

“Our electronic skin bulges to one side just like the human finger does, and the sensors that measure the shear forces are physically located where the nailbed would be, which results in a sensor that performs similarly to human fingers,” said lead author Jianzhu Yin.