The team at Carnegie Mellon University claims to have created a system small and light enough to attach to the bottom of virtual reality (VR) goggles, using airborne ultrasound waves to create sensations in the mouth.
Vivian Shen, a second-year PhD student in the university’s Robotics Institute, gave the potential use case example of a VR world with a drinking fountain, explaining that users would be able to feel a stream of water across their lips when leaning down to the VR ‘fountain’ in an immersive experience.
Working in the Future Interfaces Group (FIG), Shen collaborated with Craig Shultz, a post-doctoral fellow in the Human-Computer Interaction Institute (HCII), using the system to create such haptic effects as raindrops, mud splatter and crawling bugs. Shen and Shultz developed the system with Chris Harrison, associate professor in the HCII and director of the FIG lab.
Though the mouth is known for its sensitivity, researchers have had difficulty rendering haptic effects on it, Shen said, as VR users don’t like to put things in or cover their mouths. She noted that whilst a recent effort employed a tiny robotic arm that could flick a feather across the lips or spray them with water, this would not be practical for widespread use.
Ultrasound, however, has been used by researchers to deliver sensations to the hands, enabling them to create haptic effects such as virtual buttons that users can perceive themselves pushing.
Because ultrasound waves can travel through the air for short distances, they seemed a possible solution for mouth-based haptics, Shen said.
Most people are familiar with medical ultrasound imaging and probes, though these devices are not known for vibrating or stimulating the skin. The team explained that these acoustic waves, which have frequencies above the range of human hearing, can be used to create sensations by focusing them into a small area.
This effect is achieved by using multiple ultrasound-generating modules, or transducers. Like any sort of wave, the ultrasound waves produced by one transducer can interfere with those of other transducers — constructively, to amplify the waves, or destructively, to nullify them.
“If you time the firing of the transducers just right you can get them all to constructively interfere at one point in space,” said Shen. In this case, the team targeted those points of peak amplitude on the lips, teeth and tongue. Subtly modulating the ultrasonic output also heightened the effect, according to the team.
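The timing trick Shen describes can be sketched in a few lines of code. The idea is that each transducer's wavefront takes a different amount of time to reach the focal point, so transducers farther from the target fire first and closer ones wait, letting all the wavefronts arrive, and constructively interfere, at the same instant. The array geometry, focal point and function below are illustrative assumptions, not the CMU team's actual firmware:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 °C

def focus_delays(transducers, focal_point):
    """Per-transducer firing delays (in seconds) chosen so that all
    wavefronts arrive at focal_point simultaneously, producing
    constructive interference (a point of peak amplitude) there.

    transducers: list of (x, y, z) positions in metres
    focal_point: (x, y, z) position in metres
    """
    distances = [math.dist(t, focal_point) for t in transducers]
    farthest = max(distances)
    # The farthest transducer fires immediately (zero delay); closer
    # ones wait by the extra travel time their wavefronts would save.
    return [(farthest - d) / SPEED_OF_SOUND for d in distances]

# Hypothetical example: an 8-element linear array spaced 1 cm apart,
# focusing on a point 5 cm in front of the array's centre.
array = [((i - 3.5) * 0.01, 0.0, 0.0) for i in range(8)]
target = (0.0, 0.0, 0.05)
delays = focus_delays(array, target)
```

Steering the focal point, say, sweeping it across the lips to render a swipe, is then just a matter of recomputing the delays for a new target each frame.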
Shen pointed out that such sensations are largely limited to the hands and mouth, as areas such as the forearms and torso lack the density of mechanoreceptors needed to perceive them.
According to the researchers, the CMU device is a phased array of 64 tiny transducers. The flat, half-moon shaped array is attached to the bottom of VR goggles so it rests just above the mouth.
Haptic effects consist of point impulses, swipes and persistent vibrations targeted on the mouth and synchronised with visual images. A variety of effects were evaluated using 16 volunteers, with all subjects reporting that mouth haptics enhanced their VR experience, the team said.
Mouth-specific effects, such as brushing teeth or feeling a bug walk across the lips, were reported as the most successful. Others, such as the feel of walking through cobwebs, were reported as interesting but ‘less powerful’, as users expected to feel sensations over a larger part of the body than just the mouth.
“Our phased array strikes the balance between being really expressive and being affordable,” Shen said. Further work could add new haptic effects to the catalogue and make the device smaller and lighter.