Another thing that gets missed is this: With self-driving cars every single fatal accident will be investigated, and the result of that investigation will be used to improve the software on all cars in the network.
In this way the rate at which cars kill people will decline, similar to the systematic decline in airplane deaths over the years.
So maybe it's worth initially accepting a self-driving fatality rate that is slightly above the human-driving rate?
My machine learning professor used to say that people tend to sensationalize self-driving car fatalities, without realizing that if every car on the road was self-driving, deaths would go close to zero.
I think the hard part of the problem is the infrastructure. Everyone wants to fix the cars, but they'll leave the hard part to the government: the development of autonomous-ready infrastructure.
In 2018, road traffic deaths in America stood at about 37,000.
If a company invented a self-driving car that kills 1,000 people a year, it would never be allowed on the street. Even 100 a year seems high. But it would actually save tens of thousands of lives.
Why are we so much stricter on self-driving car technology than on humans? Why can't we simply choose the option that saves more lives?
If self-driving cars are marginally better at driving than humans, but people take more rides in them out of convenience, it's easy for the technology to result in a net increase of human deaths.
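A quick back-of-envelope sketch of that point. All of these numbers are made up for illustration; the only assumption taken from reality is a per-mile fatality rate in the right ballpark for the US:

```python
# A "marginally safer" technology can still increase total deaths
# if it induces enough extra driving. All figures are hypothetical.

human_rate = 1.1e-8           # fatalities per vehicle-mile (roughly US-like)
av_rate = 0.9e-8              # assumed: ~20% safer per mile
miles_human = 3.2e12          # assumed: annual US vehicle-miles today
miles_av = miles_human * 1.4  # assumed: 40% more miles from induced demand

deaths_human = human_rate * miles_human  # ~35,200
deaths_av = av_rate * miles_av           # ~40,320 despite being safer per mile
```

The per-mile improvement (20%) is smaller than the increase in miles driven (40%), so total deaths rise.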
Every year something like 30,000 people are killed in car crashes in the US. Say the introduction of self-driving cars cuts that number in half, and say that of those 15,000 deaths, 2/3 are due to software or system errors. That's still progress, isn't it?
It's fairly easy to measure: fatalities per million km driven. My understanding is that the industry aims to achieve a 10x lower fatality rate before releasing self-driving cars.
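For a sense of scale, here's that metric computed from rough numbers. The 37,000 figure is from earlier in the thread; the annual vehicle-km figure is an assumption (~3.2 trillion miles converted to km):

```python
# Back-of-envelope fatality rate per million km, US-like figures.
fatalities = 37_000   # deaths per year (2018 US, per the thread)
km_driven = 5.2e12    # assumed: annual vehicle-km (~3.2e12 miles)

rate_per_million_km = fatalities / (km_driven / 1e6)  # ~0.007
target_rate = rate_per_million_km / 10                # "10x lower" goal
```

So the human baseline is on the order of 0.007 deaths per million km, and a 10x target would be around 0.0007.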
I believe that if car crash deaths were reduced to 4,500/year in the U.S. with only self-driving cars on the road — a 90% drop from last year's actual 45,000 — an overwhelming majority of Americans would still reject self-driving cars because of the loss of agency involved.
I think we could start putting self-driving cars on the road today that would lower both the number of fatalities and accidents. We don't need more processing power, better sensors, or lower costs. What we need is slightly better software and the willingness to put it into production.
After all, the 'worst case' in a car is basically solved 99.9% of the time by staying in the correct lane, obeying stop lights/signs and speed limits, and simply hitting the brakes if you're going to hit something. Sure, you could improve on that, but get that to work reliably and you're already doing better than human drivers, who get distracted, drunk, tired, impatient, angry, and just plain overwhelmed.
As long as the U.S. personal-injury legal system remains in place (with strong support from media coverage of fatal crashes), it won't be enough for self-driving systems to have lower death rates/mile in aggregate. They will need to be better or equal to human performance in every subcategory of driving. That's really hard.
To wit: It's not enough to say "Our self-driving cars avoid 14,000 drunk-driving, texting and asleep-at-the-wheel fatalities," if it's also true that: "Our self-driving cars hit and kill 30 errant pedestrians a year that a human driver would have noticed."
I know that 14,000 > 30. But the specter of roadway martyrs being murdered by killing machines needs only a few examples to sustain itself.
I think the public will accept a small number of human fatalities, if they arise from things like people crossing the street at night away from an intersection. Most drivers have had close calls like that themselves. What I think will really set self-driving cars back is if they kill someone in a situation where a human driver clearly would not have. If a self-driving car runs over a six year old girl standing in a street because she's wearing an oddly striped dress that the car classifies as a sewer grate - that will be a disaster for the industry.
In North America, people kill 33,000 people a year in vehicle accidents. Your edit is wide of the mark: bad driving by humans is a major cause of death. Self-driving cars don't have to be perfect, they just have to be better than bad human drivers.
Given that automobile accidents are currently one of the most likely ways that you will meet your demise, would you agree that decreasing the likelihood of accidents is a worthwhile pursuit?
As I understand it, increased safety is one of the primary motivators for self-driving cars. I think it's fairly obvious that this goal has not been realized yet. But it's one motivator for a lot of people, one which you don't seem to acknowledge.
Self-driving cars have a theoretical and unproven potential to save lives, but are proven to fail, require human assistance, etc. There's no certainty self-driving cars will ever be safer, and we shouldn't let companies kill people to get feedback for improvement!
Self-driving cars will not be able to catch on without legislation limiting damages to manufacturers.
Proponents of self-driving cars predict ~90% reduction in fatalities. What this means is that over 3000 people per year will be killed by self-driving cars. This is way better than what we have now, but in fatal car crashes, often the driver at fault is killed, and juries tend to assess much lower damages against dead people than against large corporations.
Combine that with the fact that there typically aren't any damages at all awarded for single-occupant single-car collisions where the driver is at fault, and it seems entirely possible that the total damages awarded for traffic fatalities could stay at the current level, or even go up, leaving the manufacturers of self-driving cars to foot the bill.
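A toy expected-damages calculation shows how that could happen. Every figure here is hypothetical; the point is only that fewer deaths times a much higher collectibility and award size can exceed today's totals:

```python
# Toy comparison of total damages awarded, status quo vs. self-driving.
# All figures are hypothetical, chosen only to illustrate the mechanism.

# Status quo: many deaths, but the at-fault driver is often dead or
# judgment-proof, so relatively few crashes yield large awards.
deaths_now = 30_000
share_collectible_now = 0.2    # assumed fraction producing real awards
avg_award_now = 1_000_000      # assumed

# Self-driving future: 90% fewer deaths, but nearly every one has a
# deep-pocketed corporate defendant, and juries award more.
deaths_av = 3_000
share_collectible_av = 0.95    # assumed
avg_award_av = 5_000_000       # assumed

total_now = deaths_now * share_collectible_now * avg_award_now  # ~$6.0B
total_av = deaths_av * share_collectible_av * avg_award_av      # ~$14.25B
```

Under these assumptions, manufacturers would face more than double today's total awards while causing a tenth of the deaths.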
Now, I generally think that laws to cap damages are not good policy, as it does make it harder to discourage negligence or even malfeasance.
Not to dismiss the tragedy of this incident, but it should be expected that self-driving cars kill pedestrians -- just at a rate lower than what's expected from human drivers.
Perhaps there's a better metric to look at, but I'd like to see number of deaths caused per miles driven.
If Uber's self-driving cars are killing more pedestrians than a human driver would, we have a huge problem, but I'd be willing to bet they're at least an order of magnitude safer in this respect.