There were more than 35,000 vehicle-related deaths in the US last year (and nearly 2,000 more in Canada). As a result, 43 companies are currently investing in self-driving car technology, with more on the way. Given that self-driving cars can’t text or eat a hamburger or get drunk and be “fiiiiiiinnnneeee, you guys,” the math tells us they’ll crash less often than we do.
However, this doesn’t mean people are going to be comfortable handing over complete control to a computer. We’re nothing if not skeptical when it comes to new technology. I mean, push-button phones were invented in the early 60s and it wasn’t until the late 80s that rotary phones even started to phase out.
Nevertheless, self-driving cars are coming, and here’s the uncomfortable truth we’ll all have to face: Not long after the launch of self-driving cars in Canada, there will be an accident. And it will be bad. People will die.
What happens then?
Car Crashes Will Feel New Again
Let’s not kid ourselves. No matter how on-board people are with self-driving cars (because sleeping on the way to work sounds incredible), the first major self-driving car accident is going to freak people out.
“There will be a horrific crash, not long after the vehicles are introduced, because automobiles crash a lot,” said David Groves, a senior policy researcher at the RAND Corporation, a policy think-tank researching the effects of self-driving cars. “We are so numb and tolerant of the crashes that occur by the thousands all around us every year. But the first autonomous vehicle crash is going to be extremely novel.”
As crazy as it sounds, we’re desensitized to car crashes at this point. Hundreds of thousands of them happen in North America every year. They’re dramatized in the movies we watch. Vin Diesel is worth $160 million because of them.
But with all of those car crashes, we’ve had people to blame. It’s not the car’s fault, it’s us. That’s what makes the crashes easy to ignore. The first self-driving car accident, though, is going to feel different because we’ll be in uncharted waters and people will be scared. It’s not going to be our fault (at least not directly), so the blame will have to go to the car itself. Or the people who manufactured it. We’re going to experience an unnerving lack of control.
This is where we’re going to have to remind ourselves to be rational.
Remember How Terrible People Are
If a self-driving car does something inexplicable and needlessly kills someone, that’s a legitimate cause for concern. If it all of a sudden veers off an overpass, then the technology is clearly at fault and we shouldn’t be putting our trust in it. But by the time self-driving cars hit the market, most of the malfunctions will have been ironed out (at least that’s the expectation). At that point, the numbers will tell the story.
If, for example, humans cause 36,843 car deaths a year, and driverless cars cause 36,842, it makes sense to deploy them en masse. If self-driving cars are even a single death better, there’s almost no argument to be made against them. And it’s almost certain that putting self-driving cars on the road will lead to fewer deaths than human drivers.
Remember that number from the start of this article? 35,000? That’s just the number of car-related deaths last year. When you consider the number of accidents, period, the figure climbs by orders of magnitude. People get distracted, their reflexes are too slow, they can be drunk or angry or sleepy. This is why self-driving cars are coming like a freight train. It’s unstoppable because we’re really, really bad at this. According to the US Department of Transportation, 94 percent of fatal crashes are the result of human error.
Let’s Not Kid Ourselves
To be completely honest with you, the thought of a self-driving car accident terrifies me. I hate the idea of losing my own agency, of yielding control to something that could make a mistake I don’t think I would have made. But this is why I have to consider the aggregate. If they’re better in the long run, for everyone involved, then that’s the way it needs to be.
But that doesn’t mean I’m not considering all the ways this whole thing could go terribly wrong.
Consider this: You’re hanging out in your new self-driving car, reading a book or eating bacon and eggs or whatever it is you want to do with your now-30-minutes-of-free-time. A group of people decides to cross the road when they shouldn’t. The car makes a split-second decision to save as many lives as possible, and swerves into a wall, killing you.
Even though surveys show that people would rather sacrifice themselves than others in the event of a car crash, they are not comfortable with this scenario if a self-driving car makes that decision for them.
What do you do with this possibility? This eventuality. Even though the scenario is exceedingly rare, it is bound to happen. Almost everything does, eventually.
A serious conversation will need to happen around the concept of ethics when it comes to self-driving cars. This isn’t just about the logistics of the technology. That’s the easy part.
Self-Driving Cars Are Not Good Enough … Yet
Self-driving cars will literally be programmed to kill. That sounds insane but it’s true.
But here’s the rub: so are you. Your brain (every bit as much a machine as that car you’ll be riding in), makes calculations and split-second decisions and there are scenarios where your brain would rather you die than someone else.
Let’s be clear: self-driving cars are not good enough yet. Forty-three companies are currently developing and testing them (in San Francisco, Tempe, and Pittsburgh, among other cities), but as of November 2017, they’re not safe.
In a report from 2016, Waymo’s cars averaged more than 5,000 miles between disengagements (moments when a human had to take over), while Mercedes-Benz averaged less than two miles. On average, self-driving cars are about as good as a bad human driver.
It’s clear they’re on their way, but wide-scale deployment hasn’t happened and it’s unclear when exactly it will happen. What is clear is that the numbers will have to be published, and pushed, constantly. Even if self-driving cars save thousands of lives, the people involved in self-driving accidents won’t care all that much.
Fifty-six percent of Americans surveyed told the Pew Research Center that they would not want to take even one ride in a driverless vehicle. I understand that: the technology sucks right now. But it’s technology, and it will grow in leaps and bounds, faster than you can even imagine.
No one has the answers yet. Which is fine because these are still early days.
However, self-driving cars have the potential to make accidents all but a thing of the past. Even if they don’t, that’s still okay.
If self-driving cars get into 10,000 accidents in the first year they’re available (or 40,000 for that matter), who cares? If it’s less than what people would have done, then we’re in the black. Even with imperfect self-driving cars, RAND estimates that tens of thousands of lives could be saved. “Even though we can’t predict the future, we found it’s really hard to imagine a future where waiting for perfection doesn’t lead to really big opportunity costs in terms of fatalities,” Groves said.
Translation: waiting for perfect driverless cars means 40,000 North American deaths every year.
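The opportunity-cost arithmetic behind that translation can be sketched in a few lines. The ~37,000 annual North American deaths come from the figures at the top of this article; the 10 percent safety improvement and the 15-year wait for “perfect” cars are purely illustrative assumptions, not RAND’s actual model:

```python
# Back-of-the-envelope sketch of the opportunity-cost argument.
# The ~37,000 annual deaths are from the article (~35,000 US + ~2,000 Canada);
# the 10% improvement and 15-year wait are hypothetical, illustrative inputs.

HUMAN_DEATHS_PER_YEAR = 37_000

def deaths_over(years: int, improvement: float) -> int:
    """Total road deaths over `years` if cars are `improvement` (0.0-1.0) safer than humans."""
    return round(years * HUMAN_DEATHS_PER_YEAR * (1 - improvement))

YEARS_UNTIL_PERFECT = 15  # hypothetical wait for near-perfect self-driving cars

deploy_now = deaths_over(YEARS_UNTIL_PERFECT, 0.10)  # imperfect cars, 10% better than us
wait = deaths_over(YEARS_UNTIL_PERFECT, 0.0)         # humans keep driving in the meantime

print(f"Deploying imperfect self-driving cars now: {deploy_now:,} deaths")
print(f"Waiting for perfection:                    {wait:,} deaths")
print(f"Lives saved by not waiting:                {wait - deploy_now:,}")
```

Even with these deliberately modest assumptions, the gap runs into the tens of thousands of lives, which is the shape of RAND’s point: any improvement at all, multiplied by years of waiting, dwarfs the cost of imperfection.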
Self-driving cars are coming. Buckle up.