But surely that describes only the worst 1% of drivers. It shouldn't merely be better than the worst 1%; it should be demonstrably safer than humans on average, with a good margin.
That's a big if. And as discussed recently on HN, it can't just be safer than average human drivers, it has to be safer than good human drivers. That is a pretty tall order.
Problem is the average for human drivers is dragged down mostly by bad drivers. And even if we're talking about averages, 50% of people are better than the average. I would not feel too confident in a self-driving car that is only about as good as 1 out of 2 drivers. I would want the car to beat a high upper percentile, especially knowing an automated system can have reaction times that put any human's reaction time to shame.
So yes, equal safety is far from enough, especially considering that 50% of drivers would be better than a self-driving car that was only as good as the average driver. You're asking 50% of people, plus some percentage who overestimate their abilities, to trust a car that would perform worse than them.
As long as it's statistically better than a human driver, it's all good. Human drivers fail all the time, resulting in mass casualties in every single country.
And what about the claim that the rest of the time the self-driving car is safer than a human? How do we know that self-driving cars are safer than cars driven by a human?
I don't follow this line of reasoning. To me, "safer than humans" or "in the 90th percentile among drivers" is a fair go-to-market line.
Optimally safe should still be the goal. That is, no human could have provided alternative input to the computer that would have created a safer outcome.
People often say self driving cars have to be better than humans but they often neglect to ask "what % of the population does it have to be better than?".
Today's self driving cars are already better than some drivers. You don't have to look farther than r/IdiotsInCars. I would say that if self driving cars are better than 50% of the population, that would be good enough. Removing most of the accident prone individuals would be greatly beneficial already.
Most people (including me) think they're better drivers than average. I think I'm probably in the top 10%, at least. Whether or not that's actually true doesn't matter for my argument.
All that considered, I need autonomous vehicles to be MUCH better than average human drivers to consider riding in them. They need to be better than I think I am, so maybe top 5% or so of talent on the road.
1% better than the average human is still an awful driver IMO.
How much higher than 0% though? Humans still have a high absolute number of accidents, but that's because we drive a lot. The per-mile error rate for human driving is very, very, very, very low.
I didn't say "best driver" I said better than a human - even an average human.
I'll copy/paste something I wrote before:
The computer would have to be 99.99999% reliable to simply match humans.
The accident rate is around 74 per 100 million miles (and the fatality rate is 1.13).
It's unclear exactly how to turn that into a percentage, but no matter how you do it it's quite high.
Say an accident takes 5 minutes, and people drive 30 miles/hour. Then that works out to 99.999% for humans. If you use the numbers for fatalities then it's 99.99999%.
I.e. 99.99999% of the time, as a whole across all [US] humans, people drive in a way that does not cause a fatality.
Do you know any computers that can simply run with 5 9's of reliability? Never mind actually do something complicated?
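The back-of-the-envelope numbers above can be checked with a quick script. It uses the same assumptions as the comment (an accident occupies 5 minutes of driving time, average speed 30 mph) plus the quoted US rates of 74 accidents and 1.13 fatalities per 100 million miles:

```python
# Sanity-check the reliability figures quoted above.
# Assumptions (from the comment, not official figures): an accident
# occupies 5 minutes, average speed is 30 mph, and the US rates are
# ~74 accidents and ~1.13 fatalities per 100 million vehicle-miles.

MILES = 100_000_000
ACCIDENTS = 74
FATALITIES = 1.13
SPEED_MPH = 30
ACCIDENT_MINUTES = 5

minutes_driven = MILES / SPEED_MPH * 60  # 200 million minutes of driving

accident_reliability = 1 - ACCIDENTS * ACCIDENT_MINUTES / minutes_driven
fatality_reliability = 1 - FATALITIES * ACCIDENT_MINUTES / minutes_driven

print(f"accident-free: {accident_reliability:.7%}")  # ~99.99982%, about five nines
print(f"fatality-free: {fatality_reliability:.8%}")  # ~99.9999972%, about seven nines
```

So the "5 9's" figure corresponds to merely avoiding accidents; matching humans on fatalities demands roughly seven nines under these assumptions.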
If it's better at avoiding harm, across the board, in all situations where humans currently operate vehicles, then yes, that would be an improvement. But Tesla is nowhere near being able to do that.
Tesla seems to be trying to claim that it's sufficient for them to be a few percent better at, for example, keeping the car in its lane, or maintaining speed, or spotting obstacles. That's not an improvement. Driving is not a contest to see who can identify the most obstacles or maintain speed or lane alignment the best. People want to get where they're going safely.
> The threshold should be fewer accidents and/or fewer fatalities per million miles driven than human drivers.
Those of us who drive defensively strongly disagree with this metric. The "average driver" is extremely skewed towards a relatively small subset of drivers who cause most of the wrecks. Not coincidentally, the same people who can't afford a fancy autonomous car anyway.
I absolutely disagree. It can't be better than humans. It needs to be flawless. Any problems where people get killed will delay public acceptance of the technology by decades, even if "statistically it's safer than humans". People don't give a damn about statistics, they give a damn about tabloids shouting "self driving cars kill another innocent person!!!". We literally can't afford that.