The Dangers of Driverless Cars
Wednesday, May 5, 2021

The concept of driverless cars is here to stay. America is competing in a global race to make driverless cars the norm, and as predicted, nearly all major car manufacturers now offer vehicles with varying levels of autonomy. Today, more people seem to want driverless cars, there is currently little legislation controlling the industry, and the global autonomous vehicle market is projected to reach $556.67 billion by 2026. What’s not to love?

While the National Highway Traffic Safety Administration (NHTSA) has defined six levels of driving automation for driver-assistance technology, most consumers are unaware of the distinctions. With the current lack of industry standards and legislation, automakers tend to blur the lines in their marketing.

Fully self-driving cars (aka autonomous vehicles), or Level 5 AVs, are designed for travel without a human operator, using a combination of sophisticated AI software, LiDAR, and RADAR sensing technology. And technology continues to develop in the hope of making “driverless” cars better and safer.

But how does this play out in real life? Are these vehicles really safer than a human driver who is fully involved and in control?

Are Driverless Cars Safer?

Despite claims to the contrary, self-driving cars currently have a higher rate of accidents than human-driven cars, though the injuries tend to be less severe. On average, there are 9.1 self-driving car accidents per million miles driven, compared with 4.1 crashes per million miles for regular vehicles.

Let’s look at some of the dangers inherent with driverless cars.

False Sense of Security

These cars are often marketed as “driverless,” so is it any wonder that human drivers act more like passive passengers when they operate them? None of these cars is entirely self-driving, so labeling them “driverless” is misleading at best. The vast majority of accidents involving self-driving cars appear to have resulted from the human driver being distracted, the same failure that causes crashes in cars with no automation at all.

Yes, drivers are supposed to be alert and ready to take over control at a moment’s notice, but how likely is that when the driverless car was purchased to be, well, driverless?

The most recent fatality involving a driverless Tesla occurred in Texas on Saturday, April 17, when the vehicle crashed, killing both occupants, and continued to burn for four hours. According to the Washington Post, the accident is under investigation by the National Transportation Safety Board (NTSB), and police reported that no one was driving the vehicle.

Danger of Fire

Lithium-ion (Li-ion) batteries are well known to be highly combustible. Burning lithium creates a metal fire with temperatures that can reach 3,632 degrees Fahrenheit (2,000 degrees Celsius), and attempting to douse the fire with water could lead to a hydrogen gas explosion.

According to the National Transportation Safety Board, if a collision damages a battery, there is a risk of “uncontrolled increases in temperature and pressure, known as thermal runaway…” This can cause an explosion of toxic gases, the release of projectiles, and fire, presenting an additional danger to emergency responders.

Incidents

  1. The April 17 Tesla crash mentioned above resulted in a fire that lasted four hours and required over 30,000 gallons of water to extinguish. A vehicle fire is normally brought under control in minutes, according to the Washington Post.

  2. In 2018, a 2012 Tesla Model S appeared to spontaneously catch fire while being driven in West Hollywood, CA. There were no injuries in this incident, but notably, no collision sparked the fire.

  3. In 2018, a 2014 Tesla Model S crashed in Fort Lauderdale, FL, and burned for more than an hour, requiring hundreds of gallons of water to reduce the battery to hot embers. Two people died in this incident, and a third was seriously injured.

  4. In 2017, a driver lost control of a 2016 Tesla Model X SUV and crashed into the attached garage of a house. The battery caught fire, and the flames spread to the building. Firefighters put out the initial fire, but the battery flared up again in a “blowtorch manner,” and it took several hours to finally bring the blaze under control.

Imperfect Technology

A 2020 AAA study found that vehicles equipped with active driving assistance systems experienced some type of issue, on average, every eight miles in real-world driving. The study also found that active driving assistance systems, which combine vehicle acceleration with braking and steering, often disengage with little notice, requiring the driver to resume control immediately. It’s easy to see how this scenario can end in disaster if the driver is distracted even momentarily or relying too heavily on the system’s capabilities.

In 2016, an 18-wheeler crossed a Florida highway while a Tesla, traveling at full speed, attempted to drive through it. The Tesla driver died as a result of the injuries he received. The car’s Autopilot feature failed to brake because it could not distinguish the white side of the truck against the brightly lit sky. The National Highway Traffic Safety Administration determined that the driver was at fault: he should have had an opportunity to brake before the collision but was likely distracted.

As we reported in our November 7, 2019 blog post, “As Cars Grow More Autonomous, Safety Remains an Issue,” a man died in a crash caused by an Autopilot navigational error. Autopilot is the self-driving function in Tesla cars. The victim had sought repairs for the malfunction from the dealer several times.

Cyber Attacks

The threat from hackers during operation is a real one. In 2015, hackers remotely took over a Jeep, forcing it to a stop on a St. Louis highway while it was traveling at 70 mph. The hackers were able to access the car’s braking and steering through the onboard entertainment system.

The article reporting the incident goes on to explain that this was an “unplanned planned” exercise, meaning it was part of a test scenario, but the driver did not know precisely how or when the takeover would occur. Nevertheless, the danger he was put in and the panic he experienced served their purpose. Unfortunately, hackers are clever and can choose to apply their skills in ways that are harmful and even deadly.

Complex, Real-Life Driving Conditions

In his book, “Normal Accidents: Living with High-Risk Technologies,” Charles Perrow points out that building in more warnings and safeguards, the standard engineering approach to improving safety, fails “because systems complexity makes failures inevitable.” Instead, “adding to complexity may help create new categories of accidents.” It is a good point, especially when one considers real-life driving conditions.

Split-second decisions, rapidly changing weather, the ability to look another driver in the eye at an intersection: these are real-life conditions best left to an engaged driver. Technology can undoubtedly be enormously helpful; in some instances, some of the new automotive assist technologies can be lifesaving when properly used. But driving is complicated; roads, lanes, and conditions vary, and the same actions aren’t always best under all circumstances.

Lack of Self-Driving Regulations 

Automakers, industry advocacy groups, and corporations are urging Congressional leaders to enact legislation allowing for “greater deployment of autonomous vehicles” while also calling for “rigorous safety standards” for the new driverless technology. At the moment, some regulation governing self-driving vehicles exists, and a growing number of states are at least considering legislation related to autonomous vehicles.

However, there is a long way to go on that front. In the meantime, car manufacturers, including Tesla, are free to bring their driverless cars to market with very little restraint.

A January 15, 2021 article in GovTech noted that the Trump Administration, in a push favored by the NHTSA, issued rules allowing manufacturers of fully self-driving vehicles to “skip certain federal crash safety requirements” in vehicles that are not designed to carry people.

More rules and legislation, including exemptions from regulations designed for vehicles with human drivers, will likely follow in an effort to speed up the process of getting more self-driving vehicles on the roads. Not everyone is happy with that, however; safety advocates warn that rules are needed to protect consumers.

Are We Moving Too Fast?

Jason Levine, Executive Director of the Center for Auto Safety, expressed concern that the NHTSA is too focused on “enabling the rapid deployment of self-driving vehicles by amending rules written for cars with drivers.” He also noted that “recognizing the unique characteristics of autonomous technology may be the fastest way to authorize the deployment of autonomous vehicles, but it is not a consumer safety-driven approach.”

Safety advocates have also recently criticized the NHTSA for implementing only voluntary guidelines for self-driving vehicle manufacturers. That means manufacturers are not required to participate in a reporting system that tracks how vehicles in development perform in the safety tests recommended by federal regulators. Critics argue that these assessments should be mandatory and that companies should be required to be transparent.

The United States is projected to have 4.5 million self-driving cars on the roads by 2035. Let’s hope that the automobile companies put consumer safety over profit and that the agencies that exist to protect us do their jobs.

As technology and legislation involving self-driving vehicles become more and more complex, so will legal cases. If you or your loved one has been involved in a crash involving a self-driving car, you need an attorney who understands the legal, technical, and legislative complexities.

 
