Changing perceptions: the control and display technologies shaping the car of the future

Advanced control and display technologies that monitor our moods and augment our view of the road are poised to reshape our relationship with the car. Jon Excell reports

From the rise of autonomy to the arrival of the mass-market electric car, the technology that drives and controls our vehicles is undergoing a period of profound change.

While once they were little more than elegant chunks of mechanical engineering designed to get us from A to B, our cars are rapidly evolving into sensor-rich consumer devices: bristling with intelligence and connected to the world around them.

HUDs that project directly onto the windscreen are a key area of development. Image: JLR

But as they become smarter and more sophisticated, what does this mean for the drivers and passengers? How will we communicate with our vehicles and make sense of the vast amounts of information they gather? And how will their intelligence be used to improve and enhance our experience rather than plunge us into a thick data-fog of distractions?

Addressing these challenges is one of the car industry’s most pressing and exciting areas of research. And a host of advanced control and display technologies – from gesture control systems that create the illusion of touch in mid-air to mind-reading steering wheels able to monitor a driver’s alertness – promise to fundamentally reshape our relationship with the car.

A key area of development is in the field of head-up displays, or HUDs: transparent screens that place information directly in a driver’s line of sight and reduce the need to look away from the road.

HUDs have been available on some production vehicles for a number of years, primarily in the form of so-called combiner devices that pop out of the dashboard and overlay the driver’s view with basic information such as speed and navigation cues.

But the next generation of the technology promises a new level of sophistication, with much larger augmented reality (AR) displays enhancing the driver’s view of the road with a rich array of navigation prompts, safety cues and even the kind of infrared night-vision capabilities more commonly associated with the military.

Cameras inside the car track the position of the driver’s head and eyes

In a striking illustration of the way in which the worlds of automotive and consumer electronics are moving closer together, advanced HUDs were much in evidence at this year’s CES show in Las Vegas, where a range of car-makers and automotive suppliers demonstrated different visions for the future. Among them, electronics giant Panasonic unveiled an AR system that dispenses with the combiner screen and instead projects images directly onto the inside of the windscreen.

Demonstrated on board a modified Renault Twizy, the system is claimed to be one of the largest and most sophisticated automotive HUDs yet developed. Processing information gathered by eight cameras mounted on the outside of the car, the technology augments the driver’s view with information from up to 10m in front of the car, and – with a field-of-view of 12° to the horizontal and 5° to the vertical – is claimed to cover a larger portion of the driver’s observable world than any other automotive HUD. Cameras inside the car track the position of the driver’s head and eyes to ensure that this augmented view is always placed precisely within line of sight.
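To make that re-projection concrete, the sketch below shows the basic geometry involved: intersect the line from the tracked eye position to the real-world object with the display plane, and only draw the graphic if it falls inside the quoted 12° x 5° field of view. This is a minimal illustration assuming a simple pinhole-style model and a flat screen; the function names and numbers are invented, not drawn from Panasonic's implementation.

```python
import numpy as np

FOV_H_DEG = 12.0  # horizontal field of view quoted for the Panasonic system
FOV_V_DEG = 5.0   # vertical field of view

def overlay_position(eye_pos, world_point, screen_z):
    """Place a graphic on a flat display plane at depth screen_z.

    eye_pos, world_point: (x, y, z) in metres, z pointing out of the car.
    Returns (x, y) on the plane, or None if the object sits outside the
    display's field of view. Re-running this as the tracked eye position
    moves is what keeps the overlay on the driver's line of sight.
    """
    eye = np.asarray(eye_pos, float)
    ray = np.asarray(world_point, float) - eye
    if ray[2] <= 0:
        return None  # object is behind the display plane

    # Reject objects outside the display's angular coverage.
    azimuth = np.degrees(np.arctan2(ray[0], ray[2]))
    elevation = np.degrees(np.arctan2(ray[1], ray[2]))
    if abs(azimuth) > FOV_H_DEG / 2 or abs(elevation) > FOV_V_DEG / 2:
        return None

    # Intersect the eye->object ray with the plane z = screen_z.
    t = (screen_z - eye[2]) / ray[2]
    hit = eye + t * ray
    return hit[0], hit[1]

# The rear of a vehicle 10 m ahead, with the driver's eye tracked slightly
# left of the centreline:
print(overlay_position(eye_pos=(-0.4, 1.2, 0.0),
                       world_point=(0.5, 0.9, 10.0),
                       screen_z=0.8))
```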

Bosch's concept for an ultrasonic haptic display

It is an innovation that Fabien Roth, general manager at Panasonic Automotive & Industrial Europe, believes could open up a range of new application areas for HUDs.

He said: “You have traditional information such as speed, different warnings and navigation, but you can also use it for safety – for instance with lane departure systems – and you can even provide contextual information. For example, you could indicate with an arrow where a restaurant or petrol station is and, because the car is connected, you could provide info saying: ‘There’s one table free at 8 o’clock. Do you want me to book it?’”

Although Panasonic’s system is very much a proof-of-concept, there is clearly a high level of interest across the industry in this kind of technology. Lee Skrypchuk, a human-machine interface (HMI) technical specialist with Jaguar Land Rover (JLR), told The Engineer that HUD is a major part of the UK firm’s strategic roadmap. “We’ve talked about enhancing the view of the driver and view of the occupant to give them more vision of the road or to augment information from the real world,” he said.

A key JLR initiative in this space is its so-called virtual windscreen project, which, along with displaying AR information on the windscreen, is also exploring the use of screens in the car’s roof pillars to eliminate blind spots and give the driver a 360° view around the vehicle. The firm has also been exploring a concept dubbed Transparent Bonnet, which uses cameras in the grille to stream data to a HUD and create a see-through view of the terrain through the bonnet.
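JLR has not published implementation details for Transparent Bonnet, but at its core a see-through bonnet view is a perspective warp: the grille camera's picture of the ground is remapped so it lines up with the patch of the scene the bonnet hides. A minimal OpenCV sketch of that one operation, with every coordinate invented for illustration:

```python
import cv2
import numpy as np

# Stand-in for a grille camera frame; a real system would stream live video.
frame = np.zeros((720, 1280, 3), np.uint8)

# Four image points bounding the ground area hidden by the bonnet...
src = np.float32([[200, 600], [1080, 600], [1280, 720], [0, 720]])
# ...and where that area should sit in the display (the bonnet region).
# A production system would calibrate these per vehicle and, with head
# tracking, per driver.
dst = np.float32([[300, 500], [980, 500], [1100, 720], [180, 720]])

homography = cv2.getPerspectiveTransform(src, dst)
hud_patch = cv2.warpPerspective(frame, homography, (1280, 720))

# The warped patch would then be blended into the HUD at partial opacity, so
# the terrain appears 'through' the bonnet rather than replacing it outright.
```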

In the longer term, Skrypchuk anticipates the development of windscreen-wide HUDs that augment a driver’s entire view. However, there are significant challenges to achieving this, he said, not least the wide variety of shapes and sizes of windscreen.

“The windscreen is a complex beast and starting off with an image at reasonable size and resolution and manipulating it up to windscreen size is quite difficult.”

JLR’s Mind Sense concept uses steering-wheel sensors to monitor brainwaves

But HUDs are not the only type of advanced display system under development. For example, BMW used CES to demonstrate a holographic display that floated in mid-air directly next to the steering wheel. This HoloActive Touch system is designed to complement BMW’s existing gesture control technology by enabling users to interact with the display through hand gestures.

Even more intriguingly, a number of firms are looking at pairing gesture control systems with a form of haptic or tactile feedback that enables users to actually feel virtual switches and buttons.

In a striking example of this, engineers at Bosch recently demonstrated a concept car equipped with an ultrasonic haptic system that guided the driver’s hand to the right place to perform a gesture command. Based on technology developed by University of Bristol spinout firm Ultrahaptics Ltd, the system uses ultrasound to project sensations through the air. For the Bosch application, it was used to create the sensation of a forcefield around the gesture interaction area, but the technology can be employed to produce a range of more complex tactile sensations.

If you combined our technology with a gesture control system, you could press buttons or operate controls in the air and get immediate tactile feedback

Explaining how the underpinning technology works, Ultrahaptics CTO Tom Carter said: “The hardware itself is made up of a small collection of ultrasound speakers. We trigger each of the speakers with very slight time differences between them so that the sound waves travel through the air and all arrive at the same point at the same time. At the point at which they all overlap, you get one very localised point of high pressure and at that point there’s just enough force to slightly displace the surface of the skin.”
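Carter’s description is classic phased-array focusing: fire the farthest emitters first, so that every wavefront arrives at the focal point at the same instant and the pressure peaks add up at one spot. The sketch below computes those per-speaker trigger delays; the array layout and numbers are illustrative assumptions, not Ultrahaptics’ firmware.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def trigger_delays(speaker_positions, focal_point):
    """Per-speaker trigger delays (seconds) so all wavefronts arrive together.

    Speakers farther from the focal point fire first (zero delay); the
    nearest speaker fires last, so every wave reaches the focal point at
    the same instant.
    """
    pos = np.asarray(speaker_positions, float)
    dists = np.linalg.norm(pos - np.asarray(focal_point, float), axis=1)
    times_of_flight = dists / SPEED_OF_SOUND
    return times_of_flight.max() - times_of_flight

# A 4x4 grid of emitters at 1 cm pitch, focusing 20 cm above the array centre:
grid = [(x * 0.01, y * 0.01, 0.0) for x in range(4) for y in range(4)]
delays = trigger_delays(grid, focal_point=(0.015, 0.015, 0.20))
print(delays * 1e6)  # microsecond-scale differences across the array
```

Modulating the focused output then produces the skin-displacing vibrations Carter describes; moving the focal point from frame to frame is what lets the system trace shapes such as Bosch’s “forcefield” in mid-air.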

By controlling that effect, he explained, the system is able to create different frequency vibrations on the skin that can replicate a range of sensations. And the Bosch application is just a taster of what carmakers may use the technology for in the future.

“If you combined our technology with a gesture control system, you could press buttons or operate controls in the air and get immediate tactile feedback as if they were actually there,” added Carter.

In the meantime, Jürgen Cordes, Bosch project manager for multimedia, is excited about the technology’s potential to reduce distractions and help keep the driver’s eyes on the road. He told The Engineer the firm was hoping to work with OEM partners on taking the technology closer to production.

While the current vision is to use these kinds of control system to operate non-critical features, JLR’s Skrypchuk believes that, in the longer term, they could play a more direct role in controlling the vehicle.

Panasonic’s AR windscreen HUD boasts one of the widest fields of view yet

“Once we move into autonomy, we’ll start to see some really unique methods of controlling the car,” he said, “not necessarily where you’re steering or braking but maybe indicating to the vehicle that you want it to carry out a manoeuvre, or pointing towards a certain direction that you want the car to go in.”

But while they’re compelling propositions, don’t these technologies risk overwhelming and confusing the driver?

A key part of the technology development process is ensuring that this doesn’t happen. For example, at JLR the engineering team works closely with human factors specialists and psychologists to ensure that the interfaces are truly user friendly.

“We spend a reasonable amount of time creating simulators of the technologies so we can test them with a range of participants to see how different age groups, different cognitive abilities, different ethnic backgrounds can affect the performance of the system,” said Skrypchuk.

“A large amount of the testing will look at the effects on driving activity of the technologies that we’re putting in the car. When information goes into the user’s mind, what does that invoke as a response? Does it cause issues such as cognitive tunnelling or high levels of workload?”

BMW’s HoloActive Touch system

Another factor in ensuring these technologies work in perfect harmony with the driver is coupling them with systems that monitor a driver’s behaviour and state of mind.

VW, for instance, recently announced that it was working with computing giant NVIDIA on the development of an artificially intelligent ‘cockpit’ that monitors a driver’s behaviour and uses AI to anticipate his or her needs. Toyota recently demonstrated similar concepts on board its Concept-i vehicle.

Meanwhile, JLR is looking at a range of systems for monitoring everything from eye movement and heart rate to skin resistivity and even, through a project known as Mind Sense, a driver’s brainwaves.

As previously reported in The Engineer, this latter initiative is exploring the use of steering-wheel sensors to monitor brain activity and gauge a driver’s level of alertness.

The technology works by monitoring the presence of theta waves – a distinct form of brainwave that is prevalent during daydreaming. JLR believes that, by detecting a spike in these signals, the system will be able to detect when a driver’s concentration is waning and will trigger some form of alert to raise the driver’s awareness.
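JLR has not published its detection algorithm, but the description above suggests a familiar signal chain: band-pass the sensor signal to isolate theta activity (conventionally around 4-8 Hz), then flag a sustained rise in its power against the driver’s own baseline. A hedged sketch of that chain, with all thresholds and window lengths invented for illustration:

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 256  # sample rate in Hz, typical for EEG-style sensors

def theta_power(eeg, fs=FS):
    """Mean power of the 4-8 Hz (theta) component of a 1-D signal."""
    b, a = butter(4, [4.0, 8.0], btype="bandpass", fs=fs)
    theta = filtfilt(b, a, eeg)
    return float(np.mean(theta ** 2))

def attention_waning(window, baseline_power, ratio=2.0):
    """True if theta power in this window spikes well above baseline."""
    return theta_power(window) > ratio * baseline_power

# Usage: calibrate a baseline while the driver is demonstrably alert, then
# evaluate rolling windows (here, 2 s of samples) against it and trigger an
# alert when the check fires. Random noise stands in for real sensor data.
rng = np.random.default_rng(0)
baseline = theta_power(rng.standard_normal(FS * 10))
print(attention_waning(rng.standard_normal(FS * 2), baseline))
```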

As well as easing the driver’s moment-to-moment interaction with all of this technology, these kinds of system can also feed into the design process itself.

“Not only can it help in understanding whether they’re cognisant enough to take control of an autonomous car,” said Skrypchuk, “but it can tell us how they’re finding the interface experience, and we can use that information to feed into our design process. For instance, if we find a particular interface is demanding for the user, we can investigate ways of making it less demanding.”

Where people are perhaps less practised at driving, these technologies can be there to support them

It’s all fascinating stuff. However, given that the long-term vision for the automotive sector is the fully autonomous car, will the passenger of the future in fact need any help? Could this golden era of HMI innovation actually end up being relatively short-lived?

Skrypchuk doesn’t think so. “The day where you’re able to get in a car, push a button and travel from A to B is some way off,” he said, “but even in those situations the technologies are still relevant. How they’re used will change and you won’t have to worry so much about distractions and the demand they’re placing upon the user.”

He added that the technologies may also become a key feature when car passengers of the future feel like having a good old-fashioned drive.

“Some people may still want to drive,” he said, “and in those situations, where people are perhaps less practised at driving, these technologies can be there to support them when they want to take control of the vehicle.”