I wonder if we get skewed opinions about autopiloted cars because every story about them blows up, while stories about people accidentally killing themselves in ordinary cars don't.
It's not even a question of anecdotal evidence. I am 100% sure that the rate of deaths per 1000 autopilot-driven miles is at least an order of magnitude lower than the rate of deaths per 1000 human-driven miles. Tesla has that info and occasionally publishes it... people just like to get freaked out when there is an accident. Sure, someone died doing something dumb with autopilot... but what, 50 people died doing something dumb in a normal car in the time it took them to put out the fire? Why are we even having this conversation? It's ridiculous.
Easy to google "tesla self-driving car death" - but I think the OP was referring to the over-publicized deaths pushed by media searching for clicks. In reality, self-driving cars are already much safer than your average driver.
There'll be many people posting anecdotes of Tesla crashes, and in each case, maybe autopilot was a factor, maybe not. But, in reality, autopilot is just safer than the average driver. Anecdotes of autopilot failures create an emotional response, but nobody talks about deaths due to people driving poorly, since they're such a common occurrence. So you also have an asymmetry there.
I'd love to hear a stronger anti-self-driving argument. The mainstream anti-self-driving arguments are, I find, weak. Bring it on, HN!
There are some spectacular failures that are very scary because the car does something that a person would never do, unless unconscious, like this one: https://www.youtube.com/watch?v=LfmAG4dk-rU
You don't hear much about it, probably because Tesla fanboys are plentiful and rabid, so people avoid talking about it online.
The defence is usually stats about human drivers crashing more often, and it makes sense until you dive deeper into the numbers, because those stats are usually apples vs. oranges. It feels like they have some playbook of statistics to slap down whenever someone says something negative. If that doesn't cut it, they say the victim should have followed the manual that says "your attention should always be on the road", then proceed to post a video about how, thanks to the latest update, they can sleep-drive to work and attach a banana to the steering wheel to disable the attention safeguards.
If someone asks how this qualifies as an autopilot, there are usually two ways it gets handled:
1) Autopilot is just a brand name; the self-driving software is in beta, so the victim should have been paying attention at all times.
2) Autopilot is like the one on planes, so only fools think it is autonomous; therefore it was working as intended, and they should have been using it like an airline autopilot. Crash due to user error.
I'm actually a fan of Musk and Tesla, but I feel like the community engagement is very unhealthy and lacks scrutiny because of his "online army".
I was expecting autopilot to be seriously dangerous, but the data doesn't back that up. Out of 234 deaths from accidents involving Teslas, 12 people have died while Tesla's autopilot was known to be in use or to have recently been in use. http://www.tesladeaths.com Autopilot might be slightly worse than human drivers depending on how you slice the data, but that's about it.
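To see how much the verdict hinges on the slicing, here's a back-of-the-envelope sketch. The 234/12 split is the one cited above from tesladeaths.com; the share of Tesla miles driven on autopilot is a made-up assumption, and the answer flips depending on what you plug in.

```python
# Rough check of the numbers above. Only the 234/12 split is from the
# cited source; the autopilot mileage share is an ASSUMPTION.

total_tesla_deaths = 234      # all deaths in accidents involving Teslas
autopilot_deaths = 12         # deaths with autopilot (recently) in use
autopilot_mile_share = 0.10   # assumed fraction of Tesla miles on autopilot

death_share = autopilot_deaths / total_tesla_deaths  # ~5.1% of deaths

# If autopilot were exactly as safe as the fleet average, its share of
# deaths would match its share of miles; a ratio > 1 means it looks worse.
risk_vs_fleet_average = death_share / autopilot_mile_share
print(f"autopilot share of deaths: {death_share:.1%}")
print(f"risk relative to the all-Tesla average: {risk_vs_fleet_average:.2f}")
# ~0.51 at a 10% mileage share, but ~1.28 if only 4% of miles are on
# autopilot -- hence "depending on how you slice the data".
```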
I don't think it's a regulation problem as much as an acceptance by regulators that driving is inherently dangerous and autopilot isn't dramatically worse. It's not even obvious whether, on net, more people would have died if Tesla had never released autopilot.
But is it safer than the BMW system that does the exact same thing?
I also seriously doubt it's much safer than driving a normal car in the same situation. Musk threw out a bullshit comparison with general vehicle fatality stats. But Autopilot is used in situations that are safer than average travel: highway driving in good weather. Most deaths happen at intersections, not on highways.
How many of those 15 deaths are in a car that costs as much as a Tesla, and can therefore be expected to have the same safety features? How many are in 25-year-old cars? If you claim that driving a Tesla on autopilot on a freeway is safer than driving a 30-year-old $2000 clunker, I won't challenge that. If you say it's safer than a $50,000 Audi, that's highly debatable.
Not only that, but how many times would autopilot have caused a fatal accident if the human driver hadn’t intervened? Tons of YouTube examples would suggest quite a few.
I thought the cars crashing into a parked police car and a parked fire engine, the driver who was decapitated, and the car driving straight into a road divider at full speed were better proof that people are misled by the term "autopilot."
Tesla likes to play both sides: it is "autopilot" when it works, but when it doesn't, the driver has literally under six seconds (according to Tesla) to avert a fatal accident.
The thing is, a Tesla crashing while running on autopilot is more newsworthy than a Tesla crashing while being driven. We should not let anecdotal evidence drive our fears.
You have to compare the death rate while using autopilot to the death rate of people driving Teslas without autopilot. Musk tried to compare it against the universe of all drivers (Teslas, kids driving crappy cars, etc.), which was a completely false comparison.
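A sketch of the like-for-like comparison this implies: per-mile death rates within the Tesla fleet only. Every input here is a hypothetical placeholder (Tesla has the real figures); the point is the shape of the comparison, not the values.

```python
# Like-for-like comparison: deaths per mile within the Tesla fleet,
# autopilot vs. manual. All four inputs are hypothetical placeholders.

ap_deaths, ap_miles = 12, 1.2e9            # hypothetical autopilot totals
manual_deaths, manual_miles = 222, 2.0e10  # hypothetical manual-Tesla totals

PER = 1e8  # report deaths per 100 million miles, the usual NHTSA unit

ap_rate = ap_deaths / ap_miles * PER
manual_rate = manual_deaths / manual_miles * PER

print(f"autopilot: {ap_rate:.2f} deaths per 100M miles")
print(f"manual:    {manual_rate:.2f} deaths per 100M miles")
# Swapping manual_rate for the all-US-drivers rate (old cars, teenagers,
# bad weather) is exactly the apples-to-oranges move objected to above.
```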
So the reason it was a big deal is that it was a fatality. Tesla drivers are generally a pretty safe bunch. Statistically, if autopilot hadn't been engaged, that death would not have occurred. Autopilot makes Tesla drivers less safe, not more safe.
Also, the government is doing the self-driving industry a huge favor. These fatalities could screw over the whole industry if they get out of hand. Musk is giving self-driving a bad name.
This isn't how statistics work. Teslas are only about 0.1% of all cars in the US, and only a fraction of those have autopilot engaged at any given time. If autopilot is as safe as a human driver, we'd expect to see about 10,000 cases of human stupidity for every case of autopilot stupidity.
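The base-rate arithmetic behind that 10,000:1 figure, sketched out. The 0.1% fleet share is from the comment; the 10% engagement fraction is an assumption plugged in to show where a number of that magnitude comes from.

```python
# Base-rate arithmetic behind the ~10,000:1 figure. Fleet share is from
# the comment; the engagement fraction is an ASSUMPTION.

tesla_fleet_share = 0.001   # Teslas as a share of all US cars (per comment)
autopilot_engaged = 0.10    # assumed share of Tesla miles with autopilot on

# Autopilot's share of all US miles driven:
autopilot_mile_share = tesla_fleet_share * autopilot_engaged  # 1e-4

# If autopilot were exactly as accident-prone as humans, incidents would
# split in proportion to miles driven:
ratio = (1 - autopilot_mile_share) / autopilot_mile_share
print(f"expected human incidents per autopilot incident: ~{ratio:,.0f}")
# ~10,000 -- so equal headline coverage of each autopilot crash is a
# four-orders-of-magnitude skew in salience.
```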
I am beginning to get frustrated with news surrounding both Tesla autopilot crashes, and self driving car crashes. I understand that we are seeing lots of news about it because it is new, and people are scared that self driving cars are going to kill people. But can you imagine if an article trended every time someone got into an accident while using cruise control?
You can't make that kind of conclusion from a single, extremely-highly-publicized occurrence, even a fatal one. Given how often human drivers kill people, you need statistics to show that autopilot is worse.
A much higher percentage of autopilot cars had fatal collisions at that location than non-autopilot cars. That’s cause for concern. And for all we know the previous driver was distracted. Being not-worse than a distracted human isn’t very confidence-inspiring.
Agreed, though it's also worth noting that every time an autopilot fatality hits the news, it's combined with the driver also not paying attention. Often it's claimed that the driver wasn't paying attention because of their faith in autopilot. And that is the real issue. Even Tesla themselves say it shouldn't be relied upon, despite what their marketing suggests.
What I'm trying to say is that it's not appropriate to classify accidents as either caused by autopilot or human error.
The fatal accident was, tragically, a failure of both the autopilot and the driver. But the driver was relying on the autopilot in a way that has never been recommended.
It does seem as though Tesla has been overstating autopilot's safety, based on arguments made in the article and in the comments here. But it seems grossly unfair to judge the autopilot against a use case that it was never recommended for. It's a bit like braking at the last second on a wet road and then blaming your ABS when you crash.
I would love to see some more nuanced analysis of autopilot safety when compared to an appropriate control group.