Researchers create screen you can feel without touching it

Computer users could soon be able to “feel” what is happening on a screen without actually touching it, thanks to a breakthrough at Bristol University.

The research team behind “UltraHaptics” technology, which uses ultrasound to create vibrations that can be felt at precise points in mid-air, has now developed the system into an interface for a computer display.

The engineers at the Bristol Interaction and Graphics group have demonstrated how to pass ultrasound signals through a screen and create multiple tactile points at once, each of which can represent a different button or function on the display.

They hope this will make it easier to control devices entirely with hand gestures, and even to operate them without looking at the display, for example a car driver using their sat nav without taking their eyes off the road.

‘Even if you provide [haptic] feedback on a touch screen you have to fumble around pressing all the buttons, whereas with our system you can wave your hand vaguely in the air and you’ll get the feeling on the hand,’ said Tom Carter, a PhD student working on the technology.

‘We can give different points of feeling at the same time that feel different so you can assign a meaning to them. So the “play” button feels different from the “volume” button and you can feel them just by waving your hand.’

The Bristol group has been working on the technology for several years and initially developed it from car parking sensors that use ultrasound waves to detect obstacles.

By using an array to combine multiple ultrasonic beams into a concentrated point in the air, the researchers were able to create vibrations powerful enough to be felt by the skin.
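The focusing step described above can be sketched as a phased-array delay calculation: each transducer is phase-shifted so that its wave arrives at the focal point in step with all the others. This is a minimal illustration only; the array geometry, 40 kHz carrier frequency and function names are assumptions, not the Bristol team's implementation.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at room temperature
CARRIER_HZ = 40_000.0   # 40 kHz is a common airborne ultrasound frequency

def phase_delays(transducers, focal_point):
    """Phase offset (radians) for each transducer so that all waves
    arrive in phase at the focal point, reinforcing into a felt vibration.

    `transducers` and `focal_point` are (x, y, z) positions in metres.
    """
    dists = [math.dist(t, focal_point) for t in transducers]
    longest = max(dists)
    wavelength = SPEED_OF_SOUND / CARRIER_HZ  # roughly 8.6 mm at 40 kHz
    # Delay the nearer transducers so every wavefront coincides at the focus.
    return [(2 * math.pi * (longest - d) / wavelength) % (2 * math.pi)
            for d in dists]

# A small linear array, 1 cm element spacing, focusing 20 cm above it.
array = [(x * 0.01, 0.0, 0.0) for x in range(-2, 2)]
phases = phase_delays(array, (0.0, 0.0, 0.2))
```

Steering the focal point, or creating several at once, is then a matter of recomputing these phases for each target position.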

‘Touchless haptics is probably the Holy Grail,’ said Dr Geoff Merrett of Southampton University, who was not involved in the research but has experience developing haptic technology. ‘If you look towards games consoles, everything is going away from holding things.’

Researchers from the University of Tokyo first demonstrated the haptic ultrasound principle in 2008, and earlier this year a team at Disney Research in the US unveiled a system that pumped out vortices of air to create a touchless haptic sensation.

But the Bristol team has now managed to combine its UltraHaptics system with a computer display, using a screen with tiny perforations that allow the ultrasound to pass through while still reflecting light.

An algorithm enables the ultrasound array to create different vibration patterns at different tactile points. Once the user has found a point, a Leap Motion gesture sensor detects their hand movement, allowing them to interact with the screen without touching it.

‘We can produce low frequency pulsing sensations, which we’ve been told feel like “dry rain”, the same force as rain falling on your hand at frequencies of 4Hz or so,’ said Carter. ‘And if you move up to higher frequencies of 125-250Hz then it’s more like a foamy, spongy feeling.’
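The skin cannot feel the 40 kHz ultrasound carrier itself; what Carter describes is a low-frequency amplitude modulation imposed on it, in the range the skin does sense. A hedged sketch of that idea, with the carrier frequency, sample rate and function names as assumptions rather than the team's actual signal chain:

```python
import math

CARRIER_HZ = 40_000.0    # ultrasound carrier, imperceptible on its own
SAMPLE_RATE = 400_000.0  # samples per second, for illustration only

def modulated_sample(t, mod_hz):
    """A 40 kHz carrier amplitude-modulated at `mod_hz`, the rate the skin feels."""
    envelope = 0.5 * (1 + math.sin(2 * math.pi * mod_hz * t))  # swings 0..1
    return envelope * math.sin(2 * math.pi * CARRIER_HZ * t)

# 0.1 s of the "dry rain" 4 Hz pulsing versus the "foamy" 200 Hz buzz.
n_samples = int(0.1 * SAMPLE_RATE)
dry_rain = [modulated_sample(n / SAMPLE_RATE, 4.0) for n in range(n_samples)]
foamy = [modulated_sample(n / SAMPLE_RATE, 200.0) for n in range(n_samples)]
```

Assigning a different modulation frequency to each focal point is what would let a ‘play’ button feel different from a ‘volume’ button.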


By creating two tactile points next to one another, the system also allows users to make a pinching movement to grab icons on the screen, although the researchers have so far found that the two points must be at least 3cm apart for users to reliably distinguish between them.

Having run basic user trials, the Bristol team is now exploring how the technology could be used. They have produced a maps application that uses the haptic feedback to provide the user with extra information as they pass their hand over the screen, for example using higher vibration frequencies to indicate areas of higher population density.

Southampton University’s Geoff Merrett has developed haptic technology for medical rehabilitation devices for stroke patients and said the UltraHaptics system could have applications in this area.

‘The more realistic you can make a sense the better. The fact that you don’t need to wear things would be a massive benefit for a stroke patient because often putting something on the hand is the biggest problem. Their hand tends to seize up and so to get the kind of devices we were looking at onto the fingers was quite difficult.

‘That could be a problem in this situation because you can’t stimulate the fingertip with the ultrasound system unless it can see the fingertip. But I think as we move towards less contact devices the better [it will be].’

The Bristol researchers will present a paper on their work at this week’s ACM Symposium on User Interface Software and Technology at St Andrews University.