The aerohaptics system pairs volumetric display technology with a Leap Motion sensor and precisely controlled jets of air to create the sensation of touch on users’ hands, fingers and wrists. Researchers hope the EPSRC-funded project could lead to advances in remote interaction, such as in teleconferencing and surgical applications.
Developed by the university’s Bendable Electronics and Sensing Technologies (BEST) research group, the system is based around a pseudo-holographic display which uses glass and mirrors to make a 2D image appear to hover in space — a modern variation on a 19th century illusion technique known as Pepper’s Ghost.
Published in Advanced Intelligent Systems, the team’s paper describes how they used the system to create a realistic sensation of bouncing a basketball. Pairing a computer-generated 3D image of a basketball with the Leap Motion sensor, the system varies the direction and force of the airflow to create aerohaptic feedback.
Users can reportedly ‘feel’ the rounded shape of the ball roll from their fingertips when it is bounced, and the slap in their palm when it returns, thanks to modulated feedback based on the ball’s virtual surface. Users can also push the virtual ball with varying force and sense the change in how hard the bounce feels in their palm, the team said.
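The paper does not publish the control code, but the modulation described above — tracking the hand and varying jet direction and force against the ball’s virtual surface — can be sketched roughly as follows. All names here (the function, `ball_center`, `ball_radius`, the jet command it returns) are hypothetical, for illustration only:

```python
import math

def aerohaptic_feedback(hand_pos, ball_center, ball_radius, max_force=1.0):
    """Map a tracked hand position to an air-jet direction and force.

    hand_pos / ball_center: (x, y, z) tuples, e.g. from a hand-tracking
    sensor such as the Leap Motion. Returns (direction, force): a unit
    vector pointing from the ball's centre towards the hand, and a force
    that ramps up as the hand presses into the virtual surface.
    """
    dx, dy, dz = (h - b for h, b in zip(hand_pos, ball_center))
    dist = math.sqrt(dx * dx + dy * dy + dz * dz)
    if dist >= ball_radius or dist == 0:
        return None, 0.0  # hand outside the virtual ball: jet stays idle
    # Jet pushes outward along the surface normal at the contact point
    direction = (dx / dist, dy / dist, dz / dist)
    # Deeper penetration -> stronger jet, approximating surface stiffness
    force = max_force * (1.0 - dist / ball_radius)
    return direction, force
```

Run in a loop over sensor frames, a controller like this would steer the nozzle along `direction` and scale the airflow by `force`, so a hand pushing harder into the virtual ball feels a correspondingly stronger jet.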
Leader of the BEST group, Prof Ravinder Dahiya, who developed the system, told The Engineer that the current prototype cannot give the user a sense of weight, but that it could be modified to do so.
This would require repositioning the nozzles, for example to the topside; in this prototype, the aerohaptic nozzle was left at the bottom of the system to demonstrate the overall working, he explained.
Dahiya added that whilst haptic feedback and volumetric display technologies have advanced in recent years, current haptic tech still relies on wearable or handheld peripherals that can be costly and complex, a barrier to widespread adoption.
“Aerohaptics creates a convincing sensation of physical interaction on users’ hands at a relatively low cost,” he said. “We are already looking into adding additional functionality to the system, such as adding temperature control to the airflow to deepen the sensation of interacting with hot or cool objects.”
Dahiya added that the system could form the basis for creating convincing, interactive 3D renderings of real people for teleconferences, and could help surgeons to perform procedures in virtual spaces or even command robots to perform surgeries.
The system could, for example, fulfil needs in these applications that could not be met by other technologies such as VR, he said.
"Current VR devices do not provide natural haptic feedback. The user needs to wear them, and if it is a group of users participating in a project then everyone needs to wear a VR device. Our prototype does not require users to wear any such additional equipment. Multiple users can stand around the volumetric display and engage with virtual objects as we normally do in real life.
"For example, multiple clinicians could view, feel and discuss the features of hard tumour cells. They could also involve the patient in the discussion before surgery. This would give the patient more information and offer them additional confidence in the procedure."