Sensor-guided, driverless cars could be a step closer following a UK project to investigate the technologies needed for autonomous vehicles.

The project, Auto Taxi, has developed and tested safety-critical sensors and data-fusion systems to allow cars to gather and process complex information about the road ahead, including the presence of other vehicles.

During the first phase of the two-year project the team tested a combination of state-of-the-art sensors already used in some cars — lane departure warning video, active cruise control radar, lidar (which uses light to measure speed and distance) and ultrasonic sensors.

These sensors were then enhanced and two new prototype devices added: a stereo video system developed at Warwick University and an LED rangefinder.

The trials began with simple tests on a figure-of-eight track, before moving on to more complex scenarios designed to simulate realistic driving conditions, such as vehicles approaching from side roads and parking manoeuvres.

Although completely autonomous cars remain a long way off, in the near term technologies developed in the project could help to make conventional cars safer.

The LED device, a short-range optical sensor, showed particular promise during the tests, said Dr Alastair Buchanan, project manager for Auto Taxi at technology consultancy TRW Conekt.

The optical system, although at an early stage of development, could be fitted to cars as a parking aid or a pre-crash sensor. The stereo video system could be used to detect obstacles on the road ahead, said Buchanan.

The technologies developed in the project could also be used for driverless taxis in environments such as airports and shopping centres. The team used the Urban Light Transport (Ultra) driverless taxis, being developed by Advanced Transport Systems, as a testbed for the sensors. These computer-controlled vehicles travel along a dedicated guideway, and the company has built a test track in Cardiff which was used to carry out the trials.

If autonomous vehicles are to become a reality, two sensors will need to be used for each sensing task, to provide aircraft-style redundancy, said Buchanan. ‘We have to have two sensors looking at the same thing, but they can’t fail in the same way. For example, optical sensors are badly affected by rain, so if it rains and all your sensors are optical, you have got a problem.’
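Buchanan's diverse-redundancy principle can be sketched in code. The following is a minimal illustration only, with hypothetical names and values rather than anything from the Auto Taxi project: each sensing task is covered by two sensors of different modalities, so a single condition such as rain cannot blind both at once.

```python
# Illustrative sketch of diverse sensor redundancy (hypothetical names,
# not from the Auto Taxi project): pair sensors of different modalities
# so they cannot fail in the same way.

RAIN_DEGRADED = {"optical"}  # modalities known to degrade in rain


def fuse_range(sensor_a, sensor_b, raining=False):
    """Return a fused range estimate in metres, ignoring any sensor
    whose modality is degraded by the current conditions."""
    readings = [
        s["range_m"] for s in (sensor_a, sensor_b)
        if not (raining and s["modality"] in RAIN_DEGRADED)
    ]
    if not readings:
        raise RuntimeError("no trustworthy sensor available")
    return sum(readings) / len(readings)


lidar = {"modality": "optical", "range_m": 24.8}
radar = {"modality": "radio", "range_m": 25.2}

fuse_range(lidar, radar)                 # dry: average of both sensors
fuse_range(lidar, radar, raining=True)   # rain: optical ignored, radar only
```

The point of the sketch is the pairing rule, not the fusion arithmetic: because the two sensors share no failure mode, one valid reading survives any single environmental fault.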

To this end, aerospace researchers at Bristol University were brought in to develop a fault-tolerant data-fusion system capable of self-diagnosis, while another project partner, Praxis High Integrity Systems, created a safety case for the sensor technology.
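One common way such a self-diagnosing fusion system can work, shown here as a hypothetical sketch rather than a description of the Bristol design, is to cross-check sensors against each other: with three or more sensors observing the same quantity, a reading far from the median can be flagged as a suspected fault and excluded from the fused estimate.

```python
# Hypothetical self-diagnosing fusion sketch (not the Bristol system):
# flag readings that disagree with the median and fuse only the rest.
import statistics


def fuse_with_diagnosis(readings, tolerance=1.0):
    """Fuse range readings (metres); return (estimate, suspect_indices)."""
    median = statistics.median(readings)
    suspects = [i for i, r in enumerate(readings)
                if abs(r - median) > tolerance]
    trusted = [r for i, r in enumerate(readings) if i not in suspects]
    return sum(trusted) / len(trusted), suspects


# Third sensor has drifted badly; it is flagged and excluded.
estimate, faulty = fuse_with_diagnosis([25.1, 24.9, 3.2])
```

Median-based voting is a deliberately simple stand-in for the kind of fault tolerance the article describes; the key property is that the system identifies which sensor to distrust rather than silently averaging in a bad reading.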