Tesla has recalled 2 million cars, nearly all of its vehicles sold in the U.S. since 2012, because of issues with its self-driving features. Safety regulators have investigated nearly a thousand crashes involving Tesla's autopilot system, which can fully take over steering, braking and acceleration. William Brangham discussed the recall with Faiz Siddiqui of The Washington Post.
The self-driving safety concerns that led to Tesla’s recall of 2 million cars
Notice: Transcripts are machine and human generated and lightly edited for accuracy. They may contain errors.
William Brangham:
Tesla has recalled two million cars, nearly every Tesla sold in the U.S. since 2012, because of issues with their self-driving features.
Over the last two years, U.S. safety regulators have investigated nearly 1,000 crashes involving Tesla's autopilot system, which can fully take over steering, braking and acceleration.
A Washington Post report found about 40 of these crashes were fatal, including eight where autopilot was engaged on roads where it was not supposed to be used at all.
Washington Post reporter Faiz Siddiqui was one of the journalists who did this analysis.
Faiz Siddiqui, thank you so much for being here. So there's this enormous recall by Tesla.
Can you tell us a little bit about what regulators and your own reporting found was the problem with this self-driving autopilot feature with Teslas?
Faiz Siddiqui, Technology Reporter, The Washington Post:
So what regulators have found here is that Tesla autopilot is essentially able to activate in locations where it is not designed to be used.
This was the subject of a Washington Post report over the weekend. We found there were eight deadly or serious crashes in which autopilot was activated on a road that was not the type of highway, on-ramp to off-ramp (think of an interstate), on which it was designed to be used.
So there are these locations of what the regulators call foreseeable misuse, where someone could abuse the software by activating it, say, on a long and winding road with a lot of intersections, or any kind of residential city surface street, when it is really intended, think of cruise control, for an interstate highway, or at least a highway with clearly demarcated lane lines, exits, a center divider or what have you.
William Brangham:
You focus in particular on one 2019 crash in Florida involving a man named Jeremy Banner, who used autopilot on one of these roads that you were describing as not appropriate for autopilot.
Can you explain what happened in that case?
Faiz Siddiqui:
Yes.
So in the Jeremy Banner crash, this was very similar to a 2016 crash involving a gentleman named Joshua Brown. In the Banner crash, a car was traveling down a U.S. highway when it barreled into and under a semitruck. Banner was killed. The truck was obviously pulling out of a side street as Banner's car was driving along.
And the car actually didn't come to a stop until hundreds of feet later. So this was one of the early instances of autopilot activated in a location where you're wondering, why is this working somewhere a truck could be pulling into? That does not seem like a controlled access highway.
And it also raised this question of, is autopilot struggling to see semitrucks pulling into the middle of the road?
William Brangham:
I mean, Tesla in its defense says every time a driver accesses autopilot, they get this warning that you're supposed to keep your hands on the steering wheel and stay alert to jump in if needed.
But there's really no way to enforce that a driver does that, right?
Faiz Siddiqui:
There is a way. I don't know if there's a regulatory will.
NHTSA has said it's an issue that is complex and resource-intensive. The regulator would essentially have to impose a sort of restriction that it doesn't want to be seen as imposing, as opposed to having voluntary compliance, which is where Tesla is saying, hey, it is your responsibility to activate autopilot in a location in which it is safe. We are not going to overburden you with restrictions, because the so-called magic of the feature is that it makes it feel like the car is driving itself.
Of course, it is not a self-driving car. So it's more of a question of regulatory will and, I suppose, what sorts of restrictions the company is willing to impose.
William Brangham:
I mean, some critics of the government have argued that to allow these 4,000-to-5,000-pound vehicles to have these features is a failure of regulatory oversight.
Do those critics believe that this current step is appropriate or sufficient?
Faiz Siddiqui:
I have spoken with a lot of people in this space, and I would say that the critical analysis of this, and especially the current recall, is that, OK, let's ensure that this is not lip service to the idea that Tesla does something. Let's ensure that Tesla actually does restrict the software to the conditions for which it has been designed.
So one potential vein of criticism here is that NHTSA has now issued a recall. It is a voluntary recall. The ball is now in Tesla's court to impose those restrictions, which might come in the form of notifications and warnings, but they aren't going to fully restrict the software from operating in these locations.
William Brangham:
All right, Faiz Siddiqui of The Washington Post, thank you so much.
Faiz Siddiqui:
Thank you.