Predictive touch keeps eyes on the road and fingers off screens

Motorists will soon be able to adjust temperature or entertainment settings with ‘predictive touch’, a contactless touchscreen system that improves driver safety and could reduce the spread of viruses.

Predictive touch (Image: JLR)

Developed by Jaguar Land Rover and Cambridge University, the patented predictive touch technology is said to use artificial intelligence and sensors to predict a user’s intended target on the touchscreen, such as a sat nav control, without the need to touch a button.


Lab tests and on-road trials showed the predictive touch technology could reduce a driver’s touchscreen interaction effort and time by up to 50 per cent, as well as limiting the spread of bacteria and viruses in a post-COVID-19 world.

According to Jaguar Land Rover, artificial intelligence determines the item the user intends to select on the screen early in the pointing task, speeding up the interaction. The company adds that a gesture tracker uses vision-based or radio frequency-based sensors to combine contextual information such as user profile, interface design and environmental conditions with data available from other sensors, such as an eye-gaze tracker, to infer the user’s intent in real time.
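The details of Jaguar Land Rover's algorithm are not public, but the general idea described above — scoring on-screen targets against a partial pointing trajectory, optionally weighted by eye-gaze data — can be sketched in a few lines of Python. The function names, scoring scheme and gaze weighting below are illustrative assumptions, not the patented method:

```python
import math

def predict_target(trajectory, targets, gaze=None, gaze_weight=0.3):
    """Guess which on-screen target a partial pointing gesture is heading for.

    trajectory: list of (x, y) fingertip positions, earliest first
    targets:    dict mapping target name -> (x, y) screen position
    gaze:       optional (x, y) eye-gaze point used as extra evidence
    """
    (x0, y0), (x1, y1) = trajectory[0], trajectory[-1]
    dx, dy = x1 - x0, y1 - y0              # overall direction of motion
    motion_norm = math.hypot(dx, dy) or 1.0

    scores = {}
    for name, (tx, ty) in targets.items():
        vx, vy = tx - x1, ty - y1          # vector from fingertip to target
        v_norm = math.hypot(vx, vy) or 1.0
        # Cosine similarity: 1.0 when the finger is moving straight at the target
        score = (dx * vx + dy * vy) / (motion_norm * v_norm)
        if gaze is not None:
            # Boost targets near the gaze point (hypothetical weighting)
            score += gaze_weight / (1.0 + math.hypot(tx - gaze[0], ty - gaze[1]))
        scores[name] = score
    return max(scores, key=scores.get)

# A fingertip moving rightward should resolve to the right-hand target
print(predict_target([(0, 0), (10, 1)],
                     {"nav": (100, 10), "music": (5, 100)}))
```

A production system would replace this geometric heuristic with a trained model and fuse the contextual signals the company mentions (user profile, interface layout, driving conditions), but the structure — predict the target early in the pointing task, then select it without physical contact — is the same.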

In a statement, Lee Skrypchuk, Human Machine Interface Technical Specialist at Jaguar Land Rover, said: “As countries around the world exit lockdown, we notice how many everyday consumer transactions are conducted using touchscreens: railway or cinema tickets, ATMs, airport check-ins and supermarket self-service checkouts, as well as many industrial and manufacturing applications. Predictive touch technology eliminates the need to touch an interactive display and could therefore reduce the risk of spreading bacteria or viruses on surfaces.

“The technology also offers us the chance to make vehicles safer by reducing the cognitive load on drivers and increasing the amount of time they can spend focused on the road ahead.”

This software-based solution for contactless interactions is said to be at a high technology readiness level and can be integrated into existing touchscreens and interactive displays, provided the correct sensor data is available to support the machine learning algorithm.

Project leader Prof Simon Godsill from Cambridge University’s Department of Engineering said: “Touchscreens and other interactive displays are something most people use multiple times per day, but they can be difficult to use while in motion, whether that’s driving a car or changing the music on your phone while you’re running. We also know that certain pathogens can be transmitted via surfaces, so this technology could help reduce the risk of that type of transmission.”