Bad weather dataset could aid autonomous vehicles

A dataset that could help autonomous vehicles see in rain, fog and snow has been created by researchers in Scotland who spent two years chasing bad weather.  

The RADIATE project, led by Heriot-Watt University, has published a new dataset comprising three hours of radar images and 200,000 tagged 'road actors', including bicycles, cars, pedestrians and traffic signs.
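To illustrate how a labelled dataset of this kind might be consumed, the minimal Python sketch below tallies object classes across recorded sequences. The directory layout, file names and JSON fields are assumptions made for illustration, not the project's published format.

```python
import json
from collections import Counter
from pathlib import Path

# Hypothetical layout: one annotations.json per recorded sequence,
# each holding a list of labelled 'road actors'. The root folder,
# file name and "class_name" field are assumptions for illustration.
DATASET_ROOT = Path("radiate_data")

def count_road_actors(root: Path) -> Counter:
    """Tally labelled object classes across all sequences."""
    counts = Counter()
    for ann_file in root.glob("*/annotations.json"):
        with ann_file.open() as f:
            annotations = json.load(f)
        for obj in annotations:  # e.g. {"class_name": "car", ...}
            counts[obj["class_name"]] += 1
    return counts

if __name__ == "__main__":
    for cls, n in count_road_actors(DATASET_ROOT).most_common():
        print(f"{cls}: {n}")
```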

To date, almost all publicly available labelled driving data has been captured on bright, clear days, so no public data has existed to help develop autonomous vehicles that can operate safely in adverse weather. Existing datasets have also relied primarily on optical sensors, which perform poorly in bad weather.

According to Heriot-Watt, Professor Andrew Wallace and Dr Sen Wang have been collecting the data since 2019, after equipping a vehicle with light detection and ranging (LiDAR) sensors, radar, stereo cameras and geopositioning devices.

They drove the vehicle around Edinburgh and the Scottish Highlands to capture urban and rural roads at all times of day and night, purposefully chasing bad weather.

In a statement, Professor Wallace said: “Datasets are essential for developing and benchmarking perception systems for autonomous vehicles.

“We’re many years from driverless cars being on the streets, but autonomous vehicles are already being used in controlled circumstances or piloting areas. 

“We’ve shown that radar can help autonomous vehicles to navigate, map and interpret their environment in bad weather, when vision and LiDAR can fail."

The team said that labelling objects spotted on the roads provides another step forward for researchers and manufacturers.

Dr Sen Wang said: “When a car pulls out in front of you, you try to predict what it will do - will it swerve, will it take off? That’s what autonomous vehicles will have to do, and now we have a database that can put them on that path, even in bad weather.”

The team is based at Heriot-Watt’s Institute of Sensors, Signals and Systems, which has developed classical and deep learning approaches to interpreting sensory data.

They said their ultimate goal is to improve the perception capability of autonomous vehicles.

Wallace said: “We need to improve the resolution of the radar, which is naturally fuzzy. If we can combine hi-res optical images with the weather-penetrating capability of enhanced radar [then] that takes us closer to autonomous vehicles being able to see and map better, and ultimately navigate more safely.”

The dataset can be viewed online.