The technology aims to improve road safety by ‘seeing through’ objects to alert drivers to potential hazards without distracting them. The system uses LiDAR (light detection and ranging) data to create ultra-HD holographic representations of road objects, which are beamed directly to the driver’s eyes instead of the 2D windscreen projections used in most head-up displays.
“Head-up displays are being incorporated into connected vehicles, and usually project information such as speed or fuel levels directly onto the windscreen in front of the driver, who must keep their eyes on the road,” said lead author and Cambridge engineering PhD candidate Jana Skirnewskaja. “However, we wanted to go a step further by representing real objects as in panoramic 3D projections.”
According to the team, early tests show that the images appear in the driver’s field of view according to their actual position, creating an augmented-reality view of the road. Researchers believe this could help drivers see through visual obstructions, for instance when a large tree or truck is hiding a road sign. The study has been published in Optics Express.
LiDAR, a remote sensing method which uses a laser pulse to measure the distance between the scanner and an object, was used to scan a busy street on the UCL campus in Central London. Co-author and geographer Phil Wilkes carried out the scanning using a technique called terrestrial laser scanning, sending out millions of pulses from multiple positions. The resulting point cloud data was then combined to build a 3D model of the street.
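The distance measurement behind LiDAR is simple in principle: a laser pulse travels to the object and back, and the round-trip time gives the range. A minimal sketch (the function name and timing value are illustrative, not from the study):

```python
# Hypothetical sketch: LiDAR ranges a point by timing a laser pulse's round trip.
C = 299_792_458.0  # speed of light in vacuum, m/s

def lidar_distance(round_trip_time_s: float) -> float:
    """Distance to the reflecting object: the pulse travels out and back,
    so the one-way distance is half the total path length."""
    return C * round_trip_time_s / 2.0

# A pulse returning after roughly 66.7 nanoseconds corresponds to about 10 m.
print(round(lidar_distance(66.71e-9), 2))
```

Repeating this measurement millions of times from several scanner positions, as described above, yields the point cloud from which the 3D model is built.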
When the model was completed, researchers transformed objects on the street into holographic projections. The LiDAR data, in the form of point clouds, was processed by separation algorithms to identify and extract the target objects. Another algorithm converted them into computer-generated diffraction patterns, which were then fed into the optical set-up to project 3D holographic objects into the driver’s field of view.
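One common way to compute such a diffraction pattern is to rasterise the extracted points onto a target image and take the phase of its inverse Fourier transform, giving a phase-only hologram. The sketch below illustrates that general approach only; the function names are hypothetical and the study's actual algorithms are not detailed here:

```python
import numpy as np

def points_to_image(points_xy, size=64):
    """Rasterise normalised (x, y) points in [0, 1)^2 onto a target amplitude image."""
    img = np.zeros((size, size))
    for x, y in points_xy:
        img[int(y * size), int(x * size)] = 1.0
    return img

def phase_hologram(target):
    """Phase-only diffraction pattern whose far-field reconstruction
    approximates the target image (a basic Fourier-hologram scheme)."""
    field = np.fft.ifft2(np.fft.ifftshift(target))
    return np.angle(field)  # keep only the phase, as on a phase-only modulator

# Two extracted object points, e.g. the corners of a road sign.
pts = [(0.25, 0.5), (0.75, 0.5)]
holo = phase_hologram(points_to_image(pts))
print(holo.shape)  # one phase value per hologram pixel
```

In a real display, a pattern like `holo` would be written to a spatial light modulator so that laser light diffracting off it reconstructs the object in the viewer's eye.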
Wilkes explained that although the data was captured from a stationary platform, it is similar to the data produced by sensors that will be fitted to the next generation of autonomous vehicles. To this end, researchers are working to miniaturise the optical components so that they can fit into a car, with vehicle tests on public roads in Cambridge to follow.
In future, the team hopes to personalise the layout of the head-up displays and create an algorithm capable of projecting several layers of different objects, which could be freely arranged in the driver’s field of vision. For example, in the first layer a traffic sign further away could be projected at a smaller size, while in the second layer a warning sign at a closer distance could be displayed at a larger size.
“Every individual may have different preferences for their display options,” said Skirnewskaja. “For instance, the driver’s vital health signs could be projected in a desired location of the head-up display. Panoramic holographic projections could be a valuable addition to existing safety measures by showing road objects in real time.”