Advanced control and display technologies that monitor our moods and augment our view of the road are poised to reshape our relationship with the car. Jon Excell reports
From the rise of autonomy to the arrival of the mass-market electric car, the technology that drives and controls our vehicles is undergoing a period of profound change.
While once they were little more than elegant chunks of mechanical engineering designed to get us from A to B, our cars are rapidly evolving into sensor-rich consumer devices: bristling with intelligence and connected to the world around them.

But as they become smarter and more sophisticated, what does this mean for the drivers and passengers? How will we communicate with our vehicles and make sense of the vast amounts of information they gather? And how will their intelligence be used to improve and enhance our experience rather than plunge us into a thick data-fog of distractions?
Addressing these challenges is one of the car industry’s most pressing and exciting areas of research. And a host of advanced control and display technologies – from gesture control systems that create the illusion of touch in mid-air to mind-reading steering wheels able to monitor a driver’s alertness – promise to fundamentally reshape our relationship with the car.
A key area of development is in the field of head-up displays, or HUDs: transparent screens that place information directly in a driver’s line of sight and reduce the need to look away from the road.
HUDs have been available on some production vehicles for a number of years, primarily in the form of so-called combiner devices that pop out of the dashboard and overlay the driver’s view with basic information such as speed and navigation cues.
But the next generation of the technology promises a new level of sophistication, with much larger augmented reality (AR) displays enhancing the driver’s view of the road with a rich array of navigation prompts, safety cues and even the kind of infrared night-vision capabilities more commonly associated with the military.
Cameras inside the car track the position of the driver’s head and eyes
In a striking illustration of the way in which the worlds of automotive and consumer electronics are moving closer together, advanced HUDs were much in evidence at this year’s CES show in Las Vegas, where a range of car-makers and automotive suppliers demonstrated different visions for the future. Among them, electronics giant Panasonic unveiled an AR system that dispenses with the combiner screen and instead projects images directly onto the inside of the windscreen.
Demonstrated on board a modified Renault Twizy, the system is claimed to be one of the largest and most sophisticated automotive HUDs yet developed. Processing information gathered by eight cameras mounted on the outside of the car, the technology augments the driver’s view with information from up to 10m in front of the car, and – with a field-of-view of 12° to the horizontal and 5° to the vertical – is claimed to cover a larger portion of the driver’s observable world than any other automotive HUD. Cameras inside the car track the position of the driver’s head and eyes to ensure that this augmented view is always placed precisely within line of sight.
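The coverage claim is simple trigonometry. As a rough sketch (assuming the quoted 12° and 5° figures are full angles rather than half-angles, which Panasonic does not specify), the augmented region at 10m in front of the car works out to roughly 2.1m wide by 0.9m high:

```python
import math

def hud_coverage(distance_m: float, fov_h_deg: float, fov_v_deg: float):
    """Width and height of the augmented region at a given distance,
    for a HUD with the stated full horizontal/vertical fields of view."""
    width = 2 * distance_m * math.tan(math.radians(fov_h_deg / 2))
    height = 2 * distance_m * math.tan(math.radians(fov_v_deg / 2))
    return width, height

# Panasonic's quoted figures: 12 deg horizontal, 5 deg vertical, 10 m ahead
w, h = hud_coverage(10.0, 12.0, 5.0)
print(f"{w:.2f} m x {h:.2f} m")  # prints: 2.10 m x 0.87 m
```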

It is an innovation that Fabien Roth, general manager at Panasonic Automotive & Industrial Europe, believes could open up a range of new application areas for HUDs.
He said: “You have traditional information such as speed, different warnings and navigation, but you can also use it for safety – for instance with lane departure systems – and you can even provide contextual information. For example, you could indicate with an arrow where a restaurant or petrol station is and, because the car is connected, you could provide info saying: ‘There’s one table free at 8 o’clock. Do you want me to book it?’”
Although Panasonic’s system is very much a proof-of-concept, there is clearly a high level of interest across the industry in this kind of technology. Lee Skrypchuk, a human-machine interface (HMI) technical specialist with Jaguar Land Rover (JLR), told The Engineer that HUD is a major part of the UK firm’s strategic roadmap. “We’ve talked about enhancing the view of the driver and view of the occupant to give them more vision of the road or to augment information from the real world,” he said.
A key JLR initiative in this space is its so-called virtual windscreen project, which, along with displaying AR information on the windscreen, is also exploring the use of screens in the car’s roof pillars to eliminate blind spots and give the driver a 360° view around the vehicle. The firm has also been exploring a concept dubbed Transparent Bonnet, which uses cameras in the grille to stream data to a HUD and create a see-through view of the terrain beneath the bonnet.
In the longer term, Skrypchuk anticipates the development of windscreen-wide HUDs that augment a driver’s entire view. However, there are significant challenges to achieving this, he said, not least the wide variety of shapes and sizes of windscreen.
“The windscreen is a complex beast, and starting off with an image at a reasonable size and resolution and manipulating it up to windscreen size is quite difficult.”

But HUDs are not the only type of advanced display system under development. For example, BMW used CES to demonstrate a holographic display that floated in mid-air directly next to the steering wheel. This HoloActive Touch system is designed to complement BMW’s existing gesture control technology by enabling users to interact with the display through hand gestures.
Even more intriguingly, a number of firms are looking at pairing gesture control systems with a form of haptic or tactile feedback that enables users to actually feel virtual switches and buttons.
In a striking example of this, engineers at Bosch recently demonstrated a concept car equipped with an ultrasonic haptic system that guided the driver’s hand to the right place to perform a gesture command. Based on technology developed by University of Bristol spinout firm Ultrahaptics Ltd, the system uses ultrasound to project sensations through the air. For the Bosch application, it was used to create the sensation of a forcefield around the gesture interaction area, but the technology can be employed to produce a range of more complex tactile sensations.
If you combined our technology with a gesture control system, you could press buttons or operate controls in the air and get immediate tactile feedback
Explaining how the underpinning technology works, Ultrahaptics CTO Tom Carter said: “The hardware itself is made up of a small collection of ultrasound speakers. We trigger each of the speakers with very slight time differences between them so that the sound waves travel through the air and all arrive at the same point at the same time. At the point at which they all overlap, you get one very localised point of high pressure and at that point there’s just enough force to slightly displace the surface of the skin.”
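Carter’s description is essentially phased-array focusing: each speaker is fired with a delay proportional to how much nearer it sits to the focal point, so every wavefront arrives at the same instant. A minimal sketch of that timing calculation (the four-speaker layout and focus position are invented for illustration, not Ultrahaptics’ actual geometry):

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 C

def firing_delays(speaker_xy, focus_xyz):
    """Per-speaker trigger delays (seconds) so that ultrasound from a flat
    array of speakers arrives at the focal point simultaneously.
    The speaker furthest from the focus fires first (zero delay)."""
    fx, fy, fz = focus_xyz
    dists = [math.sqrt((x - fx) ** 2 + (y - fy) ** 2 + fz ** 2)
             for x, y in speaker_xy]
    longest = max(dists)
    return [(longest - d) / SPEED_OF_SOUND for d in dists]

# Hypothetical 4-speaker strip at 1 cm pitch, focus 20 cm above the array
speakers = [(i * 0.01, 0.0) for i in range(-2, 2)]
delays = firing_delays(speakers, (0.0, 0.0, 0.20))
```

Sweeping the focal point, and modulating the carrier at the point of overlap, is what lets the system paint different vibration frequencies onto the skin.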
By controlling that effect, he explained, the system is able to create different frequency vibrations on the skin that can replicate a range of sensations. And the Bosch application is just a taster of what carmakers may use the technology for in the future.
“If you combined our technology with a gesture control system, you could press buttons or operate controls in the air and get immediate tactile feedback as if they were actually there,” added Carter.
In the meantime, Jürgen Cordes, Bosch project manager for multimedia, is excited about the technology’s potential to reduce distractions and help keep the driver’s eyes on the road. He told The Engineer the firm was hoping to work with OEM partners on taking the technology closer to production.
While the current vision is to use these kinds of control system to operate non-critical features, JLR’s Skrypchuk believes that, in the longer term, they could play a more direct role in controlling the vehicle.

“Once we move into autonomy, we’ll start to see some really unique methods of controlling the car,” he said, “not necessarily where you’re steering or braking but maybe indicating to the vehicle that you want it to carry out a manoeuvre, or pointing towards a certain direction that you want the car to go in.”
But while they’re compelling propositions, don’t these technologies risk overwhelming and confusing the driver?
A key part of the technology development process is ensuring that this doesn’t happen. For example, at JLR the engineering team works closely with human factors specialists and psychologists to ensure that the interfaces are truly user friendly.
“We spend a reasonable amount of time creating simulators of the technologies so we can test them with a range of participants to see how different age groups, different cognitive abilities, different ethnic backgrounds can affect the performance of the system,” said Skrypchuk.
“A large amount of the testing will look at the effects on driving activity of the technologies that we’re putting in the car. When information goes into the user’s mind, what does that invoke as a response? Does it cause issues such as cognitive tunnelling or high levels of workload?”

Another factor in ensuring these technologies work in perfect harmony with the driver is coupling them with systems that monitor a driver’s behaviour and state of mind.
VW, for instance, recently announced that it was working with computing giant NVIDIA on the development of an artificially intelligent ‘cockpit’ that monitors a driver’s behaviour and uses AI to anticipate his or her needs. Toyota recently demonstrated similar concepts on board its Concept-i vehicle.
Meanwhile, JLR is looking at a range of systems for monitoring everything from eye movement and heart rate to skin resistivity and even, through a project known as Mind Sense, a driver’s brainwaves.
As previously reported in The Engineer, this latter initiative is exploring the use of steering-wheel sensors to monitor brain activity and gauge a driver’s level of alertness.
The technology works by monitoring the presence of theta waves – a distinct form of brainwave that is prevalent during daydreaming. JLR believes that, by detecting a spike in these signals, the system will be able to detect when a driver’s concentration is waning and will trigger some form of alert to raise the driver’s awareness.
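JLR has not published its detection algorithm, but the basic idea of flagging a spike against a running baseline can be sketched as follows (the window length, threshold ratio and sample data are invented for illustration):

```python
def drowsiness_alert(theta_power, window=30, spike_ratio=1.5):
    """Flag samples where theta-band power exceeds spike_ratio times the
    running average of the previous `window` samples -- a crude stand-in
    for the kind of alertness trigger described above."""
    alerts = []
    for i, p in enumerate(theta_power):
        history = theta_power[max(0, i - window):i]
        if history:
            baseline = sum(history) / len(history)
            alerts.append(p > spike_ratio * baseline)
        else:
            alerts.append(False)  # no baseline yet
    return alerts

# Steady signal with a sudden theta spike at the end
theta = [1.0] * 40 + [2.0]
alerts = drowsiness_alert(theta)
```

In a real system the theta power would come from band-filtered EEG picked up by the steering-wheel sensors, and the alert would trigger haptic or audio feedback.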
As well as easing the live interaction of the driver with all of this technology, these kinds of system can also feed into the design activity.
“Not only can it help in understanding whether they’re cognisant enough to take control of an autonomous car,” said Skrypchuk, “but it can tell us how they’re finding the interface experience, and we can use that information to feed into our design process. For instance, if we find a particular interface is demanding for the user, we can investigate ways of making it less demanding.”
Where people are perhaps less practised at driving, these technologies can be there to support them
It’s all fascinating stuff. However, given that the long-term vision for the automotive sector is the fully autonomous car, will the passenger of the future in fact need any help? Could this golden era of HMI innovation actually end up being relatively short-lived?
Skrypchuk doesn’t think so. “The day where you’re able to get in a car, push a button and travel from A to B is some way off,” he said, “but even in those situations the technologies are still relevant. How they’re used will change and you won’t have to worry so much about distractions and the demand they’re placing upon the user.”
He added that the technologies may also become a key feature when car passengers of the future feel like having a good old-fashioned drive.
“Some people may still want to drive,” he said, “and in those situations, where people are perhaps less practised at driving, these technologies can be there to support them when they want to take control of the vehicle.”
Right in front of me I’d prefer the satnav, instead of those round instruments. Instead of constant speed I’d like to have minimum safety distance control. For overtaking: is there enough space in front of me, and in front of the guy behind, for me to change lanes?
The rain sensor could be used to increase the safety distance automatically.
I have never understood why no concept cars have gone for a ‘sidestick’ control approach similar to aircraft. We now have fully powered steering and brakes, and cars capable of autonomy. The proof of concept for the sidestick is well established in the computer game industry from the 80s onwards, with many of us now ‘middle-aged’ gamers having played racing games on our 8-bit computers using the ubiquitous joystick. So simple even an 8-year-old can do it. More importantly, it gets rid of the big lump of metal that forms the steering wheel, column and pedal box, which are so good at causing injury as they deform into the cabin and body under crash conditions. This has to be the biggest selling point for such an interface, and surely if it’s good enough for pilots (who baulked at fly-by-wire and no physical feel of the controls) it will be amply sufficient for us drivers. A really clever designer could even make the stick removable as the ultimate anti-theft device (i.e. no controls!).
Food for thought… So many concept cars keep within ‘standard’ design, and design houses don’t allow designers to go ‘outside the box’ (and all the pictures in the above article show a conventional control). In this case the benefits I can see far outweigh the negatives. Our steering and braking systems (in fact, most on-board car systems now) are all controlled by the computer to an extent anyway, with SIFs (safety integrated functions) becoming the norm in automotive manufacture. We now have the technology (helped by fully autonomous vehicles) to make this a reality.
Come on someone, let’s see a real concept car!
Head-up displays, great (I myself advocated them for F1 back in the Sixties). But today’s fad for touchscreen displays and ‘infotainment’ is as distracting as using a mobile phone when driving. Generally speaking, vehicle designers seem to have forgotten the meaning of ergonomics.
As long as it is remembered that mobile internet coverage is not universal, and neither are functional and safe tarmac roads. Also, people often want or need to go to places that are not directly accessible by road!
Whatever of these gets done, please keep some cars available with straightforward, i.e. present-style, controls, like a basic 2002! It NEEDS a handbrake: no ambiguity. And, as said above, a touchscreen is as fiddly as a screen-type telephone or GPS; who allowed these in the middle of the car, not even in front of the driver, and with alternative displays? Please Keep It Simple, Stupid.
No comparison with aeroplanes, which operate in three dimensions.
GPS needs an extra person in the car to work it, unless the journey is just a straightforward “A to B” with no variations.
In the late 70s, early 80s my client Gentex was working on head-up displays for aircraft.
There had been reported cases where fighter aircraft had been lost whilst the pilot was looking down at traditional displays. But be careful that you do NOT overload drivers with too much data. We simple textile machinery technologists have been able to monitor, in real time, the many parameters which define the functioning of all 1,200 spindles on a frame: an avalanche of data that no individual could possibly absorb, let alone act upon. Surely (as we found out and introduced) it is only the exception(s) from the established norms that are of interest and value.
Like good management (and so I am told by orchestral conductors) it is the exceptions which require attention, not what happens normally.
Someone above mentioned ergonomics. I always chuckle to myself when I hear that term mentioned in relation to car displays which have multiple menus etc. and cause the driver to go ‘head in car’. Ergonomics in the car industry appears to have a different meaning to the rest of the world.
Fewer, more positive switches please, with different textures or shapes that allow me to tell by feel which switch I have and what position it’s in without looking away from the road. A rotary knob with no index or start and end helps no one, especially if it’s linked to an in-car menu or display and that’s the only way of telling what you’ve selected.
My old radio had two two-tier knobs and three buttons: volume/on-off with the frequency switch on the lower tier, tuning up/down with tone on the lower tier, and three presets. All could be done by touch alone after a very short familiarisation, with no real need to look at the dial. I now have two knobs (volume and menu settings), something in the order of 20 buttons (excluding those on the steering wheel), and a display to tell me some fairly essential and other useless information. It is now a case of getting the passenger to operate anything other than the basic presets if in motion, as I don’t wish to look away from the road for any longer, due to everyone else battling their in-car electronics and not concentrating on the road.
I worked at Ford in the early 90s, and at the time push buttons on radios were all the rage (getting away from progressive knobs with full-counterclockwise power off/on). So Ford used an up button and a down button for volume. If you were listening to the radio at high volume and wanted to turn the volume down quickly, you had to press the down button many times quickly, or the mute button (another button). After hearing customers complain, they eventually went back to knobs. It comes down to whether all these touchscreens add benefit, or whether they are really just nice novelties. Nice novelties will die and benefits will live on.
TBH, ‘stick feel’ has for a long time on many aircraft been artificial, due to hydraulic controls rather than direct rod or cable. Implementing it in a joystick would not be hard (indeed, I think I had such a thing on my computer in the 00s); it was more the concept that the pilots didn’t like at first.
Why oh why, is the question I must ask. JUST BECAUSE WE CAN, seems to be the driving force for all these gadgets and aids.
The servicing of modern cars is already too expensive and problematic (you have to pay for them to use the diagnostics, which should be part of the business overhead anyway). All of this will increase the price of the vehicle, reduce its reliability, distract the driver, etc.
KISS, Keep it simple, HUD yes, toys no.
Please can we have a master disable control to remove all the unwanted attributes that may unfortunately come along?
While you are all looking at the gamut of gizmos inside your average motor, who is looking out of the windows?
Take the bus, and leave the roads to us bikers.