Described in the new journal Science Robotics, the prosthesis relies on light to sense curvature, elongation and force. The optical waveguide consists of a core through which light shines, and surrounding cladding that houses an LED and a photodiode. As the hand bends, light escapes from the core, and the photodiode detects that variable loss, allowing the robot to sense what it is touching.
"If no light were lost when we bend the prosthesis, we wouldn't get any information about the state of the sensor," said Robert Shepherd, assistant professor of mechanical and aerospace engineering at Cornell. "The amount of loss is dependent on how it's bent."
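The principle Shepherd describes can be sketched in a few lines: more bending means more light escaping the core, so the photodiode reads less power, and a calibration curve maps that reading back to a bend angle. The calibration values and the `bend_angle` function below are hypothetical, for illustration only; they are not from the paper.

```python
import bisect

# Hypothetical calibration points: (photodiode output in volts, bend angle
# in degrees). More bending -> more light loss -> lower voltage, so the
# list is sorted by voltage with angle decreasing as voltage rises.
CALIBRATION = [(0.40, 90.0), (0.70, 60.0), (1.10, 30.0), (1.50, 0.0)]

def bend_angle(voltage: float) -> float:
    """Estimate bend angle from a photodiode reading by linear
    interpolation between calibration points."""
    volts = [v for v, _ in CALIBRATION]
    angles = [a for _, a in CALIBRATION]
    # Clamp readings outside the calibrated range.
    if voltage <= volts[0]:
        return angles[0]
    if voltage >= volts[-1]:
        return angles[-1]
    # Find the bracketing calibration points and interpolate.
    i = bisect.bisect_left(volts, voltage)
    v0, v1 = volts[i - 1], volts[i]
    a0, a1 = angles[i - 1], angles[i]
    t = (voltage - v0) / (v1 - v0)
    return a0 + t * (a1 - a0)
```

A reading of 0.9 V, halfway between the 0.7 V and 1.1 V calibration points, would be interpolated to a 45-degree bend under these made-up values.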
The team, working out of Shepherd’s Organic Robotics Lab, used a four-step soft lithography process to produce the core and its cladding. Optical waveguides have been used for sensing since the 1970s, but 3D printing has helped dramatically simplify the process, allowing elastomeric sensors to be incorporated into the manufacture of soft robotics.
"Most robots today have sensors on the outside of the body that detect things from the surface," said doctoral student Huichan Zhao, lead author of the paper.
"Our sensors are integrated within the body, so they can actually detect forces being transmitted through the thickness of the robot, a lot like we and all organisms do when we feel pain, for example."
The robotic hand was used to carry out several tasks, including grasping objects and probing them for shape and texture. In another test, the prosthesis identified the ripest of three tomatoes by gauging their softness.
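The tomato test reduces to a simple comparison: press each fruit the same amount, read the internal force sensor, and judge the one that pushes back least, i.e. the softest, to be ripest. The sketch below illustrates that logic; the function name and the force values are hypothetical, not taken from the study.

```python
def ripest(readings: dict[str, float]) -> str:
    """Given force readings (in newtons) taken at equal indentation depth,
    return the label of the softest tomato: the one producing the
    lowest resisting force."""
    return min(readings, key=readings.get)

# Hypothetical sensor readings at the same indentation depth:
forces = {"tomato_a": 2.8, "tomato_b": 1.1, "tomato_c": 2.3}
# tomato_b resists least, so it is judged the ripest.
```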