Mobile phones could be given touch-sensitive screens without using expensive, rare materials thanks to a new acoustic recognition system.
TouchDevice, developed by a Cambridge University PhD student, uses the mobile’s existing microphone and a set of algorithms to work out where the user has touched the screen.
The system’s inventor, Jens Christensen, recently won an ICT Pioneer award from the EPSRC and then beat the other award winners in a Dragons’ Den-style business pitch to a panel of academics and industry experts.
The software can be retrofitted to any existing phone with a microphone, but isn’t yet sophisticated enough to replicate all the functions of modern touchscreen smartphones, such as dragging items or double-tapping.
‘Our aim is to implement an icon-based navigation system,’ Christensen told The Engineer. ‘One of the big benefits is that you’re able to make the entire surface of the phone touch sensitive rather than just the screen.’
This means that simple operations such as answering or rejecting a call can be done more easily, for example when the phone is in a pocket, though admittedly also by accident.
The technology could also be applied to other objects, such as tables or even walls, turning them into switches.
Existing capacitive touchscreens are made with indium tin oxide, which relies on the scarce metal indium and an expensive vacuum deposition process. By removing that cost, Christensen hopes TouchDevice will enable more people in developing countries to own a smartphone.
‘There were slightly more than 300 million smartphones sold last year and more than one billion non-smartphones,’ he said. ‘It’s still a huge market and it was essentially our main goal to offer touchscreen ability on feature phones for emerging markets.’
TouchDevice records acoustic impulses from the user interacting with the screen and uses them to infer where it has been touched. Currently it can distinguish touch points spaced 1 to 1.5cm apart, but has the potential for finer resolution.
The software needs only a single microphone, making it simpler than other sound-based touch systems that rely on multiple sensors, although it must be trained individually for each phone on which it is installed.
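The article does not disclose how TouchDevice’s algorithms actually work, but the combination of a single microphone and per-device training suggests a template-matching approach: tap known screen positions during training, then classify a new tap by its closest-matching acoustic signature. The sketch below illustrates that general idea only; the class and function names are hypothetical and are not from TouchDevice.

```python
# Illustrative sketch only: single-microphone touch localization via
# template matching. TouchDevice's real algorithm is not public; this
# simply shows one plausible shape for a trained per-phone model that
# maps an acoustic impulse to the nearest known touch position.

def normalize(signal):
    """Scale a signal to zero mean and unit energy so that templates
    recorded at different volumes remain comparable."""
    n = len(signal)
    mean = sum(signal) / n
    centered = [s - mean for s in signal]
    energy = sum(c * c for c in centered) ** 0.5 or 1.0
    return [c / energy for c in centered]

def similarity(a, b):
    """Correlation between two equal-length normalized signals."""
    return sum(x * y for x, y in zip(a, b))

class TouchModel:
    """Per-device model trained by tapping known screen positions."""

    def __init__(self):
        self.templates = {}  # (x_cm, y_cm) -> normalized impulse

    def train(self, position, impulse):
        self.templates[position] = normalize(impulse)

    def locate(self, impulse):
        """Return the trained position whose template best matches."""
        probe = normalize(impulse)
        return max(self.templates,
                   key=lambda p: similarity(self.templates[p], probe))

# Usage: train on two positions 1.5cm apart, then classify a noisy tap.
model = TouchModel()
model.train((0.0, 0.0), [0.9, 0.4, -0.3, 0.1])
model.train((1.5, 0.0), [0.1, -0.5, 0.8, 0.2])
tap = [0.85, 0.38, -0.25, 0.05]  # resembles the first template
print(model.locate(tap))         # → (0.0, 0.0)
```

A real system would work on raw audio frames rather than four-sample toy vectors, and the need to train each handset individually (noted above) corresponds here to populating `templates` per device.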
The system could potentially replace mechanical pressure-based buttons with ones that produce a specific clicking sound that is then picked up by the microphone.
Christensen said he was talking to tier-one manufacturers about implementing the technology, which could be rolled out within three years, but is also creating an app for existing high-specification devices using Google’s Android operating system.
‘The general challenge is understanding the variability that occurs when a user taps the phone,’ he said. ‘We can separate different touches but we need to focus on where they occur. Things such as temperature can also affect it.’