
The real reason we get freaked out by self-driving car accidents

National Transportation Safety Board (NTSB) investigators examine a self-driving Uber vehicle involved in a fatal accident in Tempe, Arizona, U.S., March 20, 2018. National Transportation Safety Board/Handout via REUTERS

  • Self-driving cars have the potential to save lives.
  • But as the technology is being tested in public, the companies leading the space need to do a better job of responding when one of their vehicles is involved in an accident.
  • Tesla's response to accidents involving Autopilot — blaming the victim and citing statistics — makes it harder for the public to place trust in the technology.

It's hard to dispute the upside to autonomous vehicles.

Fewer accidents. Fewer deaths and injuries. No more worries about speeding or people driving under the influence of drugs or alcohol. Increased accessibility to affordable transportation in communities that need it most.

So far, the data show that an autonomous future is full of benefits with very few drawbacks. Humans are flawed creatures and make mistakes behind the wheel. And when they do, people can die. Self-driving technology has the potential to save tens of thousands of lives each year in the US alone.

But in the meantime, the companies testing autonomous and semi-autonomous vehicles — the Waymos, Ubers, and Teslas of the world — are setting themselves up for greater scrutiny than traditional automakers with each accident they're involved in.


It's not because their robotic vehicles aren't technically safer than human-operated vehicles. They almost certainly are. It's because when there is an accident involving a self-driving or semi-autonomous vehicle, especially one where there's a death or injury involved, there's an added level of discomfort. It's technology making — or at least contributing to — the accident. It's easy and understandable to blame a human for a car accident. It's not as easy to understand when a car powered by a bunch of algorithms and AI is to blame.

In these cases, it's a company's product that's contributing to death, injury, or property damage. And when a company's product is involved, it's up to the company to take responsibility, not shift the blame back to its own customers.

Tesla is the biggest culprit here.

The handful of accidents involving Tesla's semi-autonomous system, Autopilot, have happened because the driver wasn't using it properly. (Drivers have to keep their hands on the steering wheel while Autopilot is engaged in case they have to take over, for example.) But the problem with Autopilot is that it blends autonomous driving with human driving, which sets drivers up for misuse and error. And as we've seen in a few cases, that misuse can result in an injury or even death.


Tesla's response to each of these accidents has been to blame everyone but itself. The company routinely points out how the driver misused Autopilot and cites statistics showing that Autopilot-equipped cars are far safer than regular cars.

Here's part of the statement Tesla gave after one of its customers, Walter Huang, died while using Autopilot in his Tesla Model X in April:

"We empathize with Mr. Huang's family, who are understandably facing loss and grief, but the false impression that Autopilot is unsafe will cause harm to others on the road. NHTSA found that even the early version of Tesla Autopilot resulted in 40% fewer crashes and it has improved substantially since then. The reason that other families are not on TV is because their loved ones are still alive."

Tesla's rebuke may be technically and factually correct, but it's wrong in spirit and lacks empathy. When one of your users dies, it's not time to blame the victim or the media for covering the incident. It's time to talk about your plan for preventing it from happening again.


But Tesla's stance seems to be to fight back against its critics, naysayers, and the media covering each of these accidents.

Here's Tesla CEO Elon Musk's tweet from earlier this week about an accident involving an Autopilot-equipped Tesla that resulted in a broken ankle:

[Embedded tweet: Elon Musk/Twitter]

I'll take a stab at answering Musk's implied question.

The outsized media coverage of these accidents happens because there's an extra level of creepiness when humans aren't involved, or in Tesla's case, aren't supposed to be involved. There's going to be outrage when a corporation could be at fault for creating a flawed and dangerous product, especially when those products are effectively being beta tested in public, where human lives may be at risk.


It's easy to understand when a human driver causes an accident by driving while drunk. We're still exploring the ramifications of a traffic accident caused by a robot.

Even a recent accident involving a self-driving car from Google's sister company Waymo attracted a lot of attention, even though the Waymo vehicle clearly wasn't at fault. (A video from the Waymo vehicle's dashboard camera showed a driver swerving over a median and hitting the Waymo vehicle head on.)

Uber took a better approach recently. After one of its self-driving vehicles hit and killed a woman in March, Uber pulled all of its self-driving vehicles off the road until it could study the problem and figure out what to do next. Maybe Uber didn't need to make such an extreme move, but it was a demonstration that the company was taking responsibility for the accident instead of blaming the victim and citing a bunch of statistics. 

The problem isn't the fundamental technology behind self-driving cars. It's the attitude of the companies operating those vehicles and their failure to come to terms with the public's unease about, and limited knowledge of, this growing trend. They're not beta testing a new version of iOS or a new Snapchat filter. They're testing vehicles carrying human beings on roads where other human beings drive, walk, and cycle.


And the communication from the leaders in the space should reflect that new reality.

This column does not necessarily reflect the opinion of Insider.
