
Well, it depends on your viewpoint. Just today somebody celebrated a fusion Q of 0.005, while for years now Tesla's Autopilot has been pulling off more accident-free miles than humans.

Sure, it's apples to oranges, but for self-driving, we're solving the last (and obviously hardest) percent, while for fusion, we're still stuck solving the first.




They haven't demonstrated that in a like-for-like comparison. Tesla's autopilot, in circumstances where it can be engaged, and with human supervision, has more miles between accidents than average human drivers in all circumstances (such as older cars, worse roads, and worse weather). But in the circumstances where Autopilot can be used humans are also much better than that average.

I’d say if we compare Tesla self driving numbers with human drivers driving similar cars, and count every disengagement as an accident, it would be a fair comparison. Tesla wouldn’t come out looking very good with those numbers though.

This is probably a large underestimate, given that Tesla tends to be very aggressive about not sharing all their data and not calling things Autopilot-related when they probably should.

That being said, even if we 10x this I would guess that number is still pretty low. The comparison should not be relative to zero; it should be relative to the equivalent number of human driving hours. How many Teslas are out there on the road, how many hours/miles have they driven, and how does that stack up against average human driving?

Obviously those stats can be gamed, but I'm generally inclined to say that as long as there isn't some crazy increase, then it's a step in the right direction.


The gross human fatality rate in the US is ~7 fatal accidents per billion miles, including fatalities caused by incompetent or irresponsible drivers, and it is substantially lower in Europe. But humans drive a lot of miles.

Based on Tesla's safety report, 'more than 1 billion' miles have been driven using autopilot. Given the small data sample and the fatalities already attributed to autopilot, I think we're some way from proving it's safer than letting drivers drive alone, never mind close to being a driver substitute.
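
For a rough sense of why the sample is small: at the ~7 fatal accidents per billion miles figure mentioned above, a billion Autopilot miles only buys you a handful of expected fatalities, so chance alone can swamp modest differences. A quick sketch, treating fatalities as a Poisson count (my simplification; none of these modeling choices come from Tesla or NHTSA):

    # How surprising would k fatalities in ~1 billion Autopilot miles be,
    # if Autopilot were exactly as safe as the average human driver?
    # Rates are taken from the comments above; the Poisson model is assumed.
    from math import exp, factorial

    human_rate_per_billion = 7      # fatal accidents per billion miles (quoted above)
    autopilot_miles_billion = 1.0   # 'more than 1 billion' Autopilot miles

    expected = human_rate_per_billion * autopilot_miles_billion

    def poisson_cdf(k, lam):
        # P(X <= k) for a Poisson(lam) count
        return sum(lam**i * exp(-lam) / factorial(i) for i in range(k + 1))

    for k in range(8):
        print(f"{k} fatalities or fewer: p = {poisson_cdf(k, expected):.3f}")

Even seeing only 3 fatalities where 7 were expected comes out around p = 0.08, which is why a billion miles isn't nearly enough to settle the fatality question either way.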


Tesla or Elon recently announced that the accident rate for autopilot is half of human driving. I didn't read past the headline but I'd hope such a statement was based on a comparison that controlled for the type of driving environment and didn't exclude accidents that occurred moments after autopilot gave up and threw the wheel to the driver.

I remember seeing a statistical analysis here on HN arguing that the numbers for Tesla's Autopilot are neither great compared to human drivers of Teslas nor fairly computed. (They found a case where a human driver had a crash that would have been counted against 0 miles in the analysis, indicating that something is inflating the "crashes per mile" metric.)

I think the statistics to compare self-driving miles vs human driven miles are quite tough to judge.

Tesla was criticized quite a bit at one point for comparing deaths per Autopilot mile to deaths per all motor vehicle miles. This was a bad comparison because motor vehicles included motorcycles, as well as older, poorly-maintained cars, etc.

Then Tesla released a comparison between Autopilot miles in Teslas and human-driven miles in Teslas where Autopilot was eligible to be engaged. This felt like a much more fair comparison, but Teslas are lenient about where Autopilot can be engaged - just because the car will allow it doesn't mean many people would choose to do so in that location, so there might be some bias towards "easier" locations where Autopilot is actually engaged. There's also the potential issue of Autopilot disengaging, and then being in an accident shortly afterwards.

This is morbid, but I also wonder about the number of suicides by car that are included in the overall auto fatality statistics. If someone has decided to end their life, a car might be the most convenient way (and it might appear accidental after the fact). That would drive up the deaths-per-mile stat for human drivers, but makes it tougher for me to decide which is safer - Autopilot driving or me driving?


Well, you can do some rough math by using Tesla's own numbers.

With autopilot: 1 crash in 4.31 million miles driven

Without autopilot: 1 crash in 1.59 million miles driven
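
Turning those two figures into a ratio (raw, with none of the normalization other commenters point out is missing):

    # Back-of-the-envelope ratio from Tesla's own published figures above.
    # These are raw per-mile rates, not adjusted for road type, weather,
    # or driver demographics.
    miles_per_crash_autopilot = 4.31e6   # 1 crash per 4.31 million miles (Autopilot)
    miles_per_crash_human     = 1.59e6   # 1 crash per 1.59 million miles (no Autopilot)

    print(f"Autopilot: {1e6 / miles_per_crash_autopilot:.2f} crashes per million miles")
    print(f"Human:     {1e6 / miles_per_crash_human:.2f} crashes per million miles")
    print(f"Raw ratio: {miles_per_crash_autopilot / miles_per_crash_human:.2f}x")

So Tesla's own numbers imply roughly a 2.7x gap, before any normalization.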


Tesla releases the miles per crash rates quarterly, for autopilot and non autopilot cases. Autopilot crashes include anything within 5 seconds of disengagement. The human rate tends to be more than 2x worse than the autopilot rate. This is not normalized for factors like road context.

The human rate for Tesla-driven miles tends to be ~4x better than the other brands' average. To answer this question precisely, you would want a comparable brand's human-driver performance, and probably a split between drivers who use Autopilot and drivers who never do. We don't have that, but in my personal opinion there's enough evidence to suggest it's probably not a grand conspiracy. I'm of the opinion that Autopilot being ballpark on par with other drivers is more than enough to reduce accidents substantially, at scale.

https://www.tesla.com/VehicleSafetyReport


What about "safety level substantially greater than that of a human driver" being the keyword?

Isn't Tesla's autopilot doing better than humans do in the same number of miles? We just all hear about it every time a Tesla is in a crash. If we heard about every human driver car accident, it would be overwhelming.

edit:

from this article: https://electrek.co/2018/03/30/tesla-autopilot-fatal-crash-d...

In the US, there is one automotive fatality every 86 million miles across all vehicles from all manufacturers. For Tesla, there is one fatality, including known pedestrian fatalities, every 320 million miles in vehicles equipped with Autopilot hardware. If you are driving a Tesla equipped with Autopilot hardware, you are 3.7 times less likely to be involved in a fatal accident.
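
Checking the arithmetic in that quote (and keeping in mind, as pointed out elsewhere in the thread, that these are miles in Autopilot-equipped cars, not miles with Autopilot actually engaged):

    # Where the quoted 3.7x comes from: a simple ratio of the two rates.
    all_vehicles_miles_per_fatality = 86e6     # one fatality per 86 million miles (all US vehicles)
    equipped_tesla_miles_per_fatality = 320e6  # one per 320 million miles (Autopilot-equipped Teslas)

    print(f"{equipped_tesla_miles_per_fatality / all_vehicles_miles_per_fatality:.1f}x "
          f"fewer fatalities per mile")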


On a per-mile-driven basis, Tesla's self-driving accident rate is somewhere in the region of 5-10× that of human drivers, IIRC. And keep in mind that the self-driving here is already in a state of selection bias where it's more likely to be used in safer conditions (i.e., fully-grade-separated highway driving in clear conditions, rather than dense urban environments in inclement weather).

The second of your links says "According to the folks over at Jalopnik though, the bar for “safer than a human” is probably a lot higher than one might guess. In fact, after crunching numbers from the NHTSA, it seems as though drivers avoid crashes 99.999819 percent of the time. If Tesla’s data is to be believed, it’s already surpassed that."
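
For scale, if that 99.999819 percent figure is read as the per-mile probability of not crashing (my assumption about what the Jalopnik number means, not something stated in the quote), it works out to roughly one crash every half-million miles:

    # Converting the quoted percentage to a per-mile crash rate,
    # assuming the percentage is per mile driven (an assumption).
    p_no_crash_per_mile = 0.99999819
    crash_rate_per_mile = 1 - p_no_crash_per_mile
    print(f"Implied: about 1 crash per {1 / crash_rate_per_mile:,.0f} miles")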

Are you even making a statistically significant comparison at this point?

And there are these also: https://www.theverge.com/2023/5/25/23737972/tesla-whistleblo...

Quite apart from the fact that this isn't actual self-driving that's being measured (as sibling points out).


Close is not the same, and neither is the same as 1/134 million. Of course I have no idea if the differences are statistically significant, and it's likely impossible to tell given the lack of data on Tesla's Autopilot at the moment (it's still new).

I am open to studies as rigorous as you demand indicating Autopilot/FSD is more dangerous than human drivers. But until those are produced, total miles and total accidents seem to be pretty objective, and easily tracked, numbers. There's really no magic there. And on that count, Tesla makes very safe machines.

Sample size is too small to tell. However, if you ignore that, Tesla's Autopilot is at least 2-3 times worse than the average human in a passenger vehicle. And it only gets worse if you restrict further to middle aged people in expensive vehicles.

Source: http://www.greencarreports.com/news/1106613_how-safe-is-tesl...

The stats are near the bottom of page 2 and the top of page 3.

TLDR: Tesla compared their numbers to a statistic that includes much more dangerous forms of transport (bikes, 18-wheelers, whatever). The Insurance Institute for Highway Safety did a study on just driver fatalities in passenger vehicles (cars and light trucks) and came up with 1 fatality per 438 million miles driven, versus the Tesla figure of 1 per 130 million miles.
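
Putting those two figures side by side (a raw per-mile ratio only, with no adjustment for vehicle age, road mix, or demographics):

    # IIHS passenger-vehicle driver fatality rate vs Tesla's quoted figure.
    iihs_miles_per_fatality  = 438e6   # cars and light trucks (IIHS study cited above)
    tesla_miles_per_fatality = 130e6   # Tesla's figure as quoted above

    print(f"Tesla's rate is {iihs_miles_per_fatality / tesla_miles_per_fatality:.1f}x "
          f"the IIHS passenger-vehicle rate per mile")

That's roughly where the "at least 2-3 times worse" figure above comes from.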

Whole article is worth reading as it goes into more detail on statistical issues with Tesla's safety statistic claims.

I'll add that human driving ability is not uniformly distributed. Most accidents are caused by particular groups of drivers: people who drink and drive, teenagers. It's entirely possible for a self-driving car to be worse at driving than most humans while still being better than the average. In that scenario, getting really bad drivers into self-driving cars would improve the average accident numbers, but getting non-bad drivers (i.e., the majority of drivers) into self-driving cars would make the average accident numbers worse.
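
A toy mixture makes that point concrete. All the numbers here are invented purely to show the mechanism, not estimates of real rates:

    # Hypothetical: a small high-risk group pulls the average far above
    # the typical driver, so 'better than average' can still be 'worse
    # than most'.
    typical_rate    = 1.0   # crashes per million miles (made up)
    high_risk_rate  = 10.0  # crashes per million miles (made up)
    high_risk_share = 0.10  # share of miles driven by the high-risk group (made up)

    average_rate = (1 - high_risk_share) * typical_rate + high_risk_share * high_risk_rate
    self_driving_rate = 1.5  # made up: worse than the typical driver

    print(f"Average human: {average_rate:.2f}")       # 1.90
    print(f"Self-driving:  {self_driving_rate:.2f}")  # 1.50 -- beats the average
    print(f"Typical human: {typical_rate:.2f}")       # 1.00 -- still beats self-driving

So who actually switches to self-driving matters as much as the headline rate.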


Not sure that this is an apples to apples comparison. The US average includes many driving scenarios that Tesla’s autopilot cannot navigate.

~52% of accidents happen within 5 miles of home. At least where I live, Tesla’s autopilot couldn’t navigate the 5 miles around my house on city streets.

A better comparison may be to compare accident rates for the same or similar driving scenarios as Tesla's Autopilot. While we all know that humans are bad drivers, perhaps it would be better to compare accident rates for freeway driving.


The question is: Of Tesla miles driven, Autopilot vs human, similar driving conditions, etc....how do the accident rates compare?

Certainly there are non-AP accidents. There could well be fewer deaths, but over how many miles driven?

I'm not defending Tesla, or AI. This is a simple and basic statistics question.


It's great seeing that more and more data is being collected about this all the time. I'm a huge proponent of this tech.

What I wonder when I see these statistics, though, is whether all miles are really equal? For example, are Tesla drivers more comfortable using Autopilot in "easy" driving situations? Is there really a one-to-one correspondence in the distribution of the kinds of miles driven with Autopilot on vs. normal cars?

Furthermore, the metric commonly cited is "fatalities every N miles." Are there fewer fatalities because Autopilot is great, or because Teslas are safer in general? Has there been a comparison between fatalities with and without Autopilot strictly on Teslas? Even then, it seems to me we are subject to the potentially biased notion of "miles" I mentioned previously. The Wikipedia article you mentioned cites a 50% reduction in accidents with Autopilot, but the citation is to an Elon Musk interview. I haven't yet seen anything official to back this up, but if anyone has any information on this, I'd love to see it!
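
A small made-up example of how that "not all miles are equal" bias can play out (all rates and mile splits below are invented for illustration):

    # If Autopilot miles skew toward easy highway driving, a raw per-mile
    # comparison can flatter it even when it is no better within any road
    # type. Everything here is hypothetical.
    rates = {                      # crashes per million miles (made up)
        "highway": {"autopilot": 0.5, "human": 0.5},
        "city":    {"autopilot": 3.0, "human": 3.0},
    }
    miles = {                      # millions of miles (made-up mix)
        "highway": {"autopilot": 90, "human": 50},
        "city":    {"autopilot": 10, "human": 50},
    }

    for mode in ("autopilot", "human"):
        crashes = sum(rates[r][mode] * miles[r][mode] for r in rates)
        total = sum(miles[r][mode] for r in miles)
        print(f"{mode}: {crashes / total:.2f} crashes per million miles overall")

    # Output: autopilot 0.75 vs human 1.75 -- a 2x 'advantage' that comes
    # entirely from where Autopilot gets used, not from how well it drives.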


I try to remember that we, the public, don't have any idea how many miles are driven on Autopilot or Full Self-Driving. We hear about crashes and argue about whether FSD/Autopilot was engaged, or gave a warning and disengaged at the last moment while the driver failed to avoid the crash. But we don't know the denominator in the crashes-per-mile ratio, so we can't directly compare it with human-only driving or with other manufacturers' automation products.
