Phantom images trick autonomous cars into braking

Researchers in Israel have used phantom images projected on a road to cause the autopilot on an autonomous vehicle to mistakenly apply its brakes.

In a new research paper, Phantom of the ADAS, published on IACR.org, the researchers from Ben-Gurion University of the Negev's (BGU) Cyber Security Research Center demonstrated that autopilots and advanced driver-assistance systems (ADASs) in semi-autonomous or fully autonomous cars register depthless projections of objects (phantoms) as real objects. They showed how attackers can exploit this to manipulate the vehicle and potentially harm the driver or passengers, without any special expertise, using only a commercial drone and an inexpensive image projector.

According to BGU, fully and semi-autonomous cars are being deployed globally, but the vehicular communication systems that connect a car with other cars, pedestrians and surrounding infrastructure are lagging behind. The researchers said the lack of such systems creates a "validation gap" that prevents autonomous vehicles from validating their virtual perception with a third party, leaving them to rely solely on internal sensors.

In addition to causing the autopilot to apply the brakes, the researchers demonstrated that they can mislead the ADAS into believing phantom traffic signs are real when they are projected for just 125 milliseconds within advertisements on digital billboards. They also showed how fake lane markings projected onto a road by a projector-equipped drone can guide the autopilot into the opposite lane and potentially into oncoming traffic.
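To put the 125-millisecond figure in context, even a flash that brief spans several camera frames, which is roughly why an embedded phantom can register with the ADAS while being easy for a human to miss. The frame rates below are illustrative assumptions, not figures from the paper:

```python
# Rough arithmetic: how many frames a 125 ms phantom flash spans
# at assumed, illustrative camera frame rates.
FLASH_MS = 125

for fps in (30, 60):  # assumed frame rates, not taken from the research
    frames = FLASH_MS / 1000 * fps
    print(f"At {fps} fps, a {FLASH_MS} ms flash covers ~{frames:.1f} frames")
```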

In the Ben-Gurion University of the Negev research, the Tesla autopilot considers the phantom image (left) as a real person, while (right) the Mobileye 630 PRO system considers the image projected on a tree as a real road sign (Image: Cyber@bgu)

"This type of attack is currently not being taken into consideration by the automobile industry. These are not bugs or poor coding errors but fundamental flaws in object detectors that are not trained to distinguish between real and fake objects and use feature matching to detect visual objects," said Ben Nassi, lead author and a PhD student of Prof Yuval Elovici in BGU's Department of Software and Information Systems Engineering and Cyber Security Research Center.

In practice, depthless objects projected onto a road are treated as real even though the vehicle's depth sensors can differentiate between 2D and 3D objects. The BGU researchers believe this is the result of a safety policy that causes the car to consider a visually detected 2D object to be real.
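A minimal sketch of how such a policy could behave is below. The sensor fields, threshold and function names are hypothetical and not drawn from any vendor's implementation; the point is only that a policy which acts on a confident camera detection, even without a matching depth return, will also accept a depthless projection:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    """Hypothetical fused detection from camera and depth sensing."""
    camera_confidence: float   # object-detector score from the camera
    has_depth_return: bool     # whether radar/lidar sees a solid object there

def treat_as_real(det: Detection, camera_threshold: float = 0.8) -> bool:
    """Illustrative safety policy: a confident camera detection is acted on
    even when the depth sensors see nothing solid. A projected phantom
    produces exactly this combination of readings."""
    return det.camera_confidence >= camera_threshold

# A depthless projection: the camera is confident, the depth sensors are not.
phantom = Detection(camera_confidence=0.93, has_depth_return=False)
print(treat_as_real(phantom))  # True -> the car brakes for a projection
```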

The researchers are developing a neural network model that analyses a detected object's context, surface and reflected light, and that is capable of detecting phantoms with high accuracy.
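A hedged sketch of that idea is shown below: a small multi-branch convolutional classifier that takes separate crops reflecting the object's context, surface texture and lighting, and fuses them into a single real-versus-phantom score. The architecture, input sizes and branch names are assumptions made for illustration; this is not the researchers' published model:

```python
import torch
import torch.nn as nn

class PhantomDetectorSketch(nn.Module):
    """Illustrative multi-branch classifier: one small CNN per cue
    (context, surface, reflected light), fused into a single
    real-vs-phantom probability. All sizes are arbitrary assumptions."""

    def __init__(self):
        super().__init__()
        def branch():
            return nn.Sequential(
                nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
                nn.MaxPool2d(2),
                nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            )
        self.context = branch()   # wide crop around the detected object
        self.surface = branch()   # tight crop of the object's surface
        self.light = branch()     # crop emphasising brightness/reflection
        self.head = nn.Sequential(
            nn.Linear(32 * 3, 64), nn.ReLU(),
            nn.Linear(64, 1),      # logit: real (high) vs phantom (low)
        )

    def forward(self, context, surface, light):
        feats = torch.cat(
            [self.context(context), self.surface(surface), self.light(light)],
            dim=1,
        )
        return torch.sigmoid(self.head(feats))

# Dummy forward pass with random 64x64 crops, batch of 2.
model = PhantomDetectorSketch()
crop = lambda: torch.randn(2, 3, 64, 64)
print(model(crop(), crop(), crop()).shape)  # torch.Size([2, 1])
```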