Harvard team embeds touch sensitivity into soft robotics

Soft robotic grippers that can detect pressure, temperature, touch and movement could have applications in surgery and materials handling

The three grippers can detect a range of stimuli as the ionic-liquid sensors printed inside them change shape Image: Harvard SEAS

Building on pioneering work by Jennifer Lewis, Professor of Biologically Inspired Engineering at the John A. Paulson School of Engineering and Applied Sciences (SEAS), the Harvard team used a technique known as embedded 3D printing to incorporate an organic ionic liquid-based conductive ink into the elastic polymer materials that make up soft robots.

The robot made by the team consisted of three grippers, with multiple contact sensors embedded into their structure. In a paper in Advanced Materials, the researchers detail how they tested the grippers’ ability to sense inflation pressure, curvature, contact and temperature.

"To date, most integrated sensor/actuator systems used in soft robotics have been quite rudimentary," said Michael Wehner, co-author of the paper and a former postdoctoral fellow at SEAS, now at the University of California, Santa Cruz. "By directly printing ionic liquid sensors within these soft systems, we open new avenues to device design and fabrication that will ultimately allow true closed loop control of soft robots."

The research represents a major advance because most sensors for detecting these stimuli are rigid constructions that cannot match the fluid movements of an elastomeric robot. The embedded sensors work by measuring changes in the electrical resistance of the ionic-liquid channels as they deform with the surrounding elastomer.
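The underlying physics can be illustrated with a short sketch. This is not code or data from the paper: it models a hypothetical conductive channel of resistivity `rho`, length `L` and cross-section `A`, with resistance R = ρL/A. If the fluid's volume V = L·A is conserved as the elastomer stretches, then R = ρL²/V, so the resistance grows with the square of the stretch, which is the kind of measurable change a readout circuit can map back to strain or pressure.

```python
def channel_resistance(rho, length, volume):
    """Resistance of an incompressible conductive channel.

    With volume V = L * A conserved under stretch,
    R = rho * L / A = rho * L**2 / V.
    """
    return rho * length**2 / volume

# Hypothetical channel: rho = 1.0 ohm*m, L = 0.01 m, A = 1e-6 m^2
rho, L0, A0 = 1.0, 0.01, 1e-6
V = L0 * A0                       # conserved fluid volume
R0 = channel_resistance(rho, L0, V)

for strain in (0.0, 0.1, 0.5):
    L = L0 * (1 + strain)         # stretched channel length
    ratio = channel_resistance(rho, L, V) / R0
    print(f"strain {strain:.0%}: R/R0 = {ratio:.2f}")
# R/R0 = (1 + strain)**2, i.e. 1.00, 1.21, 2.25
```

Because the relationship is monotonic, a controller can invert the measured resistance ratio to recover the strain, which is what makes closed-loop control of a soft actuator plausible.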

"Our research represents a foundational advance in soft robotics," said Ryan Truby, first author of the paper and a recent Ph.D. graduate at SEAS. "Our manufacturing platform enables complex sensing motifs to be easily integrated into soft robotic systems." Moreover, he added, it enables the integration of soft sensing and actuation within one system. Another co-author, Prof Robert Wood, said that the research would enable a step change in how robots are conceived and designed, "moving away from sequential processes and creating complex and monolithic robots with embedded sensors and actuators."

The next phase of the work is to use machine learning to train the new devices to grasp objects of varying size, shape, surface texture and temperature. The team believes that such robots will find applications in sectors where the sense of touch is invaluable, such as picking and handling soft, delicate fruit, and in robotic surgery on internal organs.
