Tesla Crash: Don’t Slam the Brakes on Autopilot
Tuesday, August 9, 2016

After 130 million miles driven without a fatality, Tesla Autopilot’s perfect track record ended tragically on May 7 with the first fatal crash of a car using Autopilot. Given the infrequency of fatal crashes involving autonomous vehicles, why are commentators suggesting that the auto industry “put the brakes” on this technology?

The answer is unclear, especially given the facts here. Autopilot has a better safety record than human drivers. In the United States overall, drivers cause roughly one fatality every 93 million miles; this was Autopilot’s first fatal accident in more than 130 million miles driven.
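Put in per-mile terms, using the figures above (a back-of-the-envelope comparison based on a single Autopilot fatality, so indicative rather than conclusive):

\[
\underbrace{\frac{1}{93 \times 10^{6}\ \text{miles}}}_{\text{all U.S. drivers}} \approx 1.1 \times 10^{-8}\ \frac{\text{fatalities}}{\text{mile}}
\qquad > \qquad
\underbrace{\frac{1}{130 \times 10^{6}\ \text{miles}}}_{\text{Autopilot to date}} \approx 0.77 \times 10^{-8}\ \frac{\text{fatalities}}{\text{mile}}
\]

By that crude measure, Autopilot’s observed fatality rate is roughly 30 percent lower than the national average.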

Moreover, the May 7 accident involved two human drivers. The crash was a relatively standard side-impact (T-bone) collision. A tractor-trailer traveling westbound turned left across the eastbound lanes toward a side street, passing in front of the eastbound Tesla driven by the victim, which struck the trailer. Based on the limited available data, it appears that both human drivers were at fault under Florida law: the tractor-trailer driver for failing to yield to the oncoming Tesla, and the Tesla driver for failing to spot the trailer or avoid it.

In fact, NHTSA may classify this crash as one caused by driver error, which NHTSA data identify as the critical reason in 94 percent of all crashes. Under the NHTSA system for classifying crashes, the “critical reason” for a crash is “the last event in the crash causal chain.”

In this crash’s causal chain, there were three distinct errors. First, the tractor-trailer driver made an illegal left turn in front of oncoming traffic. Second, the Tesla Autopilot didn’t detect the collision risk posed by the turning truck. Finally, the Tesla driver didn’t intervene to prevent the accident. Because it was the last event in the causal chain, this final human error would be the “critical reason” for the accident. Although Autopilot failed to detect the risk and intervene properly, crashes are almost always caused by a chain of failures, and human errors were significant causes of this one. A related NTSB investigation has yet to assign blame for the crash, but the facts outlined in its July 26 preliminary report are consistent with this chain of events.

The presence of human error here also undercuts claims that autonomous vehicle technology shouldn’t be used because it “isn’t safe yet.” In product liability disputes, courts frequently apply a risk-utility test to determine whether a product’s design or warning is defective. Under this test, a court would likely ask whether the accident costs imposed by autonomous vehicle technology exceed the cost of redesigning the product plus any loss of utility from the redesigned product.
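One rough way to formalize that balancing borrows the structure of Judge Learned Hand’s B < P·L formula from United States v. Carroll Towing; the symbols below are illustrative shorthand, not language from any court’s test:

\[
\text{design defective} \quad\Longleftrightarrow\quad C_{\text{redesign}} + \Delta U \;<\; P \cdot L
\]

Here \(C_{\text{redesign}}\) is the cost of a safer alternative design, \(\Delta U\) the utility lost by adopting it, \(P\) the probability of harm under the current design, and \(L\) the magnitude of that harm. On this sketch, the design is defective only if a feasible safer redesign, counting the utility it sacrifices, costs less than the expected accident harm it would prevent.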

The question, therefore, is not whether autonomous cars are risk-free. Nothing is. The question is whether the technology’s utility outweighs its risk, and so far autonomous car technology shows enormous utility without an inordinate amount of risk.
