Cornell University researchers claim to have created a ‘skin’ sensor that could give robotic systems a human-like sense of touch.
The team, led by Rob Shepherd, associate professor of Mechanical and Aerospace Engineering, published their paper in Science and are now hoping to commercialise the technology for physical therapy and sports medicine. The paper’s co-authors Hedan Bai and Shuo Li are currently working with Cornell’s Centre for Technology Licensing to patent the technology.
The fibre-optic sensor is said to combine low-cost LEDs and dyes, resulting in a stretchable sensor that could detect deformations such as pressure, bending and strain.
“Right now, sensing is done mostly by vision,” explained Shepherd. “We hardly ever measure touch in real life. This skin is a way to allow ourselves and machines to measure tactile interactions in the way that we currently use the cameras in our phones. It’s using vision to touch. This is the most convenient and practical way to do it in a scalable way.”
Bai is said to have drawn inspiration from silica-based fibre-optic sensors and developed a stretchable lightguide for multimodal sensing (SLIMS). The long tube contains a pair of polyurethane elastomeric cores. One core is transparent; the other is filled with absorbing dyes at multiple locations and connects to an LED. Each core is coupled with a red-green-blue sensor chip to register geometric changes in the optical path of light.
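The dual-core design described above could, in principle, be decoded quite simply: intensity loss in the transparent core tracks overall stretch, while the dyed core's colour shift hints at where a bend occurred. The sketch below illustrates that idea; the function name, the three-zone dye layout and the formulas are illustrative assumptions, not the published SLIMS algorithm.

```python
# Hypothetical sketch of decoding a dual-core stretchable lightguide.
# The calibration model and zone mapping are assumptions for illustration.

def decode_deformation(clear_rgb, dyed_rgb, baseline_clear, baseline_dyed):
    """Estimate overall strain and a coarse bend location from two cores.

    clear_rgb / dyed_rgb: (r, g, b) readings from each core's RGB chip.
    baseline_*: readings taken with the sensor at rest.
    """
    # Stretching lengthens the optical path, so total intensity in the
    # transparent core drops roughly uniformly with elongation.
    strain = max(0.0, 1.0 - sum(clear_rgb) / sum(baseline_clear))

    # In the dyed core, each dye patch absorbs a different colour, so the
    # channel with the largest *relative* drop suggests where the bend is.
    drops = [(base - now) / base
             for now, base in zip(dyed_rgb, baseline_dyed)]
    location = ("near LED", "middle", "far end")[drops.index(max(drops))]
    return strain, location

strain, location = decode_deformation(
    clear_rgb=(80, 80, 80), dyed_rgb=(90, 60, 95),
    baseline_clear=(100, 100, 100), baseline_dyed=(100, 100, 100),
)
```

Comparing both cores against a rest-state baseline is what lets a single sensor separate *how much* it deformed from *where*, which is the multimodal trick the researchers describe.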
The researchers designed a 3D-printed glove with a SLIMS sensor running along each finger. They explained that the glove is powered by a lithium battery and equipped with Bluetooth so that it can transmit data to basic software, designed by Bai, that reconstructs the glove’s movements and deformations in real time.
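On the host side, reconstruction software like Bai's would receive a stream of per-finger readings over Bluetooth and map them to bend angles for rendering. The sketch below shows one plausible shape for that step; the packet layout, the `parse_packet` helper and the linear calibration are assumptions for illustration, not the glove's actual protocol.

```python
# Hypothetical sketch of the glove's data path: one Bluetooth notification
# carries one 16-bit reading per finger, which the host maps to a bend angle.
import struct

FINGERS = ("thumb", "index", "middle", "ring", "little")

def parse_packet(payload: bytes) -> dict:
    """Unpack five little-endian uint16 readings into bend angles (degrees)."""
    raw = struct.unpack("<5H", payload)
    # Assumed linear calibration: 0 counts = straight, 65535 = fully curled.
    return {name: (value / 65535) * 90.0 for name, value in zip(FINGERS, raw)}

# Example: a synthetic packet with the fingers progressively more curled.
angles = parse_packet(struct.pack("<5H", 0, 16384, 32768, 49152, 65535))
```

In a real application this parser would sit inside a Bluetooth notification callback, feeding each decoded frame to the visualisation that redraws the hand pose.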
The team are also exploring how SLIMS sensors could enhance virtual and augmented reality experiences.
“Let’s say you want to have an augmented reality simulation that teaches you how to fix your car or change a tyre,” Shepherd said. “If you had a glove or something that could measure pressure, as well as motion, that augmented reality visualisation could say: ‘Turn and then stop, so you don’t over-tighten your lug nuts.’ There’s nothing out there that does that right now, but this is an avenue to do it.”