Fatal Tesla crash triggers investigation

The US National Highway Traffic Safety Administration is launching a preliminary investigation into the Autopilot function on the Tesla Model S following a fatal accident in Florida.


Autopilot, introduced to the Model S in October 2015, forms part of Tesla’s incremental rollout of autonomous driving technology.

According to the company, the hardware consists of a forward radar, a forward-looking camera, 12 long-range ultrasonic sensors positioned to sense 16 feet around the car in every direction at all speeds, and a high-precision digitally-controlled electric assist braking system.

Tesla said that the accident occurred on a divided highway with Autopilot engaged when a tractor-trailer drove across the highway perpendicular to the Model S.

In a statement, the company said: “Neither Autopilot nor the driver noticed the white side of the tractor-trailer against a brightly lit sky, so the brake was not applied.

“The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield of the Model S.”

The company stressed that the fatality is the first to be confirmed in over 130 million miles where Autopilot was activated, and that numerous measures are in place to ensure that drivers are ready to resume control of their vehicle.

“The system also makes frequent checks to ensure that the driver’s hands remain on the wheel and provides visual and audible alerts if hands-on is not detected,” the company said. “It then gradually slows down the car until hands-on is detected again.”
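
Tesla has not published how this monitoring logic is implemented, but the behaviour it describes (a timer on hands-off driving, a visual and audible alert, and a gradual slowdown until hands-on is detected again) can be sketched in outline. The Python snippet below is a hypothetical illustration only: the thresholds, deceleration rate and names such as update_monitor are assumptions, not Tesla’s code.

```python
"""Hypothetical sketch of a hands-on-wheel monitoring loop.

Not Tesla code: timing thresholds, the slowdown rate and all names
here are illustrative assumptions based on the behaviour described
in the company's statement.
"""

from dataclasses import dataclass


@dataclass
class MonitorState:
    seconds_hands_off: float = 0.0   # time since hands were last detected
    alert_issued: bool = False       # whether the visual/audible alert is active


def update_monitor(state: MonitorState, hands_on: bool, speed_kph: float,
                   dt: float, alert_after_s: float = 10.0,
                   decel_kph_per_s: float = 2.0) -> tuple[MonitorState, float]:
    """Advance the monitor by one control-loop step of length dt seconds.

    Returns the updated state and the (possibly reduced) target speed.
    """
    if hands_on:
        # Hands detected: reset the timer and clear any alert.
        return MonitorState(), speed_kph

    state.seconds_hands_off += dt
    if state.seconds_hands_off >= alert_after_s:
        # Alert would be raised here; while it persists, gradually
        # reduce the target speed until hands-on is detected again.
        state.alert_issued = True
        speed_kph = max(0.0, speed_kph - decel_kph_per_s * dt)
    return state, speed_kph
```

In a real vehicle such an update would run inside the control loop at a fixed rate, with the hands_on signal supplied by the steering wheel sensing the company refers to.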

Prof William Harwin, Professor of Cybernetics at Reading University, said: “It would appear that the accident resulted from a problem in the performance of the sensors rather than the autonomy of the vehicle. As such, this is a more well understood problem and there may be a relatively easy solution in terms of improving sensor integration, or including additional sensors that are less dependent on light.

“However, the bottom line is that cars are likely to be safer with these automatic features, and ultimately with vehicles that can drive autonomously.”

Speaking at an event on the ethics of robotics at the Royal Academy of Engineering this week, Prof Alan Winfield, head of swarm robotics at Bristol Robotics Laboratory, said that the automotive industry could learn from the aerospace sector when dealing with situations like this.

“I think autonomous vehicles are going to need something like the Civil Aviation Authority to oversee them,” he said. “The CAA’s crash investigation unit has robust methodology which is highly effective, and it helps people have confidence in air travel. That model could do the same for autonomous cars.”