In 2018, road traffic deaths in America stood at roughly 37,000.
If a company invented a self-driving car that killed 1,000 people a year, it would never be allowed on the street. Even 100 a year seems high. But it would actually save tens of thousands of lives.
Why are we so much stricter on self-driving car technology than on humans? Why can't we simply choose the option that saves more lives?
You can reject my premise, but it is sound. If self-driving cars only ever killed people who jumped in front of them deliberately, that would be a resounding success: it would mean they never accidentally kill anyone. Conversely, if self-driving cars swerve more efficiently to avoid hitting suicidal pedestrians but are worse at stopping at a crosswalk for children, that's an abysmal failure: they accidentally kill people more often than humans do. Your take, that any decrease in deaths is sufficient, is excessively reductionist because it doesn't actually measure how much safer the car is than a human driver. For it to be a reliable measure, we have to accept the premise that the same proportion of pedestrian deaths will be due to deliberate action or negligence by the pedestrian: an unfounded premise, far more absurd than mine, and one that ignores the ruthlessness with which machines execute their instructions. Your attempt to emotionally sensationalize your argument does nothing to make it more convincing.
Not to dismiss the tragedy of this incident, but it should be expected that self-driving cars kill pedestrians -- just at a rate lower than what's expected from human drivers.
Perhaps there's a better metric to look at, but I'd like to see the number of deaths caused per mile driven.
If Uber's self-driving cars are killing more pedestrians than a human driver would, we have a huge problem, but I'd be willing to bet they're at least an order of magnitude safer in this respect.
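For a rough sense of the human baseline, here's a back-of-the-envelope version of that metric. Both inputs are approximate public US figures for 2018 (~37,000 deaths, ~3.2 trillion vehicle-miles travelled), so treat the result as illustrative only:

    # Back-of-the-envelope US baseline: deaths per 100 million miles.
    # Both inputs are approximate 2018 figures, not exact data.
    deaths_per_year = 37_000       # US road deaths (approx.)
    miles_per_year = 3.2e12        # US vehicle-miles travelled (approx.)

    rate = deaths_per_year / (miles_per_year / 1e8)
    print(f"{rate:.2f} deaths per 100M miles")  # -> ~1.16

    # An autonomous fleet needs its own (deaths, miles) pair measured
    # the same way before any comparison is meaningful.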
Not at the cost of human lives. Yes, self-driving cars may get rear-ended more frequently and so on, but low-speed collisions are nowhere near as important as pedestrian fatalities.
I am perfectly happy to accept autonomous cars killing even slightly more people than human drivers do in exchange for the utility they offer. You can use your time in the car to do other things. That's actual life-hours saved.
And the promise is that they're only going to get more reliable with time.
Conversely, holding back self-driving cars just because they upset your sensibilities about who specifically should die, i.e. letting random group A die instead of random group B even though A is larger, means you're effectively advocating that more people should die. That's grossly negligent.
Human drivers kill more than 30,000 people a year in the US alone. It's not about saving "even one life"; self-driving cars would have to be pretty terrible to reach those numbers.
So instead we should let countless human drivers keep killing pedestrians, even if such cars turn out to be safer for everyone around them?
The ethics of this escape me. This is not a nuclear warhead or a biological weapon. We can judge the utility of self-driving cars either before or after the fact, but not with no data at all.
This is a classic ethical dilemma. Self-driving cars will kill some number of people via malfunctions, accidents, etc. Some of them will be children. Should we ban the development of the technology? What if it only killed 1 child? What about 10? What about 100? How many dead children is the benefit of self-driving cars worth?
Human-driven cars are, as we speak, running people over in the streets, and those cars have human drivers who don't particularly want to run over other humans. It was inevitable that this would happen, and however many people self-driving cars run over, it will be worth it, since it will still be fewer than the death toll cars are currently exacting.
What if self-driving cars kill fewer people, but the type of people they kill is different from the type of people who die in accidents with human drivers?
For example: what if, instead of 100 people per day dying in human-driven car accidents (where 95 are drivers/passengers and the other 5 are pedestrians/bicyclists), self-driving cars kill only 30 people per day, but 28 of those 30 are pedestrians/bicyclists?
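Running the hypothetical numbers makes the shift explicit; everything here comes straight from the example above:

    # Hypothetical daily death counts from the example above.
    human = {"occupants": 95, "peds_cyclists": 5}   # total: 100/day
    sdc   = {"occupants": 2,  "peds_cyclists": 28}  # total: 30/day

    for group in human:
        print(f"{group}: {human[group]} -> {sdc[group]} "
              f"({sdc[group] / human[group]:.1f}x)")
    # occupants: 95 -> 2 (0.0x)       -- down ~98%
    # peds_cyclists: 5 -> 28 (5.6x)   -- up 460%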
The problem is that a different set of people will die with self-driving cars. Imagine the trolley problem, but instead of 5 vs. 1 it's 30,000 vs. 29,000. If you don't pull the lever, 30,000 people get hit by the trolley and die; if you do, a completely different set of 29,000 people die. People who did not do anything wrong. People who had no chance to fix the situation. Sure, it's 1,000 fewer people, but none of those 29,000 did anything wrong. They were all killed by self-driving systems made by car manufacturers.
Regardless of whether or not you personally would pull the lever, you have to admit it's not the same as saving 1,000 lives.
If self-driving cars are marginally better at driving than humans, but people take more rides in them out of convenience, it's easy for the technology to result in a net increase in human deaths.
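A quick sketch of that effect, using made-up rates and mileage purely for illustration:

    # Hypothetical: per-mile safety improves 20%, but cheap, convenient
    # rides push total miles driven up 50%.
    human_rate = 1.2                 # deaths per 100M miles (made up)
    sdc_rate = human_rate * 0.8      # 20% safer per mile

    miles_before = 3.0e12
    miles_after = miles_before * 1.5

    print(human_rate * miles_before / 1e8)  # 36,000 deaths/year
    print(sdc_rate * miles_after / 1e8)     # 43,200 deaths/year: a net increase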
No, it would only mean fewer deaths per million miles driven if drivers who are worse than the self-driving cars switch to them. If the best drivers switch to self-driving cars, there will be more deaths per million miles. Duh
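To make that selection effect concrete (all rates hypothetical):

    # Hypothetical per-100M-mile death rates, two equal-mileage groups.
    best_rate = 0.5    # the safest human drivers
    rest_rate = 1.5    # everyone else
    sdc_rate = 1.0     # safer than average, worse than the best

    before = (best_rate + rest_rate) / 2   # fleet average: 1.0
    after = (sdc_rate + rest_rate) / 2     # only the best switched: 1.25
    print(before, after)  # per-mile rate rose even though the cars beat the average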
I think the public will accept a small number of human fatalities, if they arise from things like people crossing the street at night away from an intersection. Most drivers have had close calls like that themselves. What I think will really set self-driving cars back is if they kill someone in a situation where a human driver clearly would not have. If a self-driving car runs over a six-year-old girl standing in the street because she's wearing an oddly striped dress that the car classifies as a sewer grate - that will be a disaster for the industry.