
That increases the danger of self-driving cars relative to human drivers, but there are other situations where self-driving is safer.

When discussing this, I don't hear the positives mentioned.




Human drivers already kill people at scale. Enormous scale. Far greater than the toll from self-driving car incidents.

Ultimately it will come down to: who is safer?


In 2018, road traffic deaths in America stood at about 37,000.

If a company invented a self-driving car that killed 1,000 people a year, it would never be allowed on the street. Even 100 a year seems high. But it could actually save tens of thousands of lives.

Why are we so much stricter on self-driving car technology than on humans? Why can't we simply choose the option that saves more lives?
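The trade-off being argued here is simple arithmetic. A minimal sketch, using the roughly 37,000 figure cited above for 2018 US road deaths; the self-driving fatality counts are purely hypothetical:

```python
# Illustrative only: 37,000 is the approximate 2018 US road-death
# figure cited above; the self-driving totals are hypothetical.
human_deaths_per_year = 37_000

for sd_deaths_per_year in (100, 1_000, 10_000):
    net_lives_saved = human_deaths_per_year - sd_deaths_per_year
    print(f"self-driving deaths: {sd_deaths_per_year:>6,} "
          f"-> net lives saved: {net_lives_saved:,}")
```

Even the "unacceptable" 1,000-deaths-a-year scenario would, on these numbers, mean 36,000 fewer deaths than the status quo.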


Self-driving cars have a theoretical and unproven potential to save lives, but they are proven to fail and to require human assistance. There's no certainty self-driving cars will ever be safer, and we shouldn't let companies kill people to get feedback for improvement!

If self-driving cars are marginally better at driving than humans, but people take more rides in them out of convenience, it's easy for the technology to result in a net increase of human deaths.
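The effect described above is about exposure: a slightly lower per-mile fatality rate can still yield more total deaths if convenience drives up total miles traveled. A sketch with made-up numbers (both the rates and the mileage figures are illustrative assumptions, not measurements):

```python
# All numbers are made up for illustration.
HUMAN_RATE = 1.10  # deaths per 100 million vehicle-miles
SD_RATE = 1.00     # ~9% safer per mile than humans

human_miles = 3.2e12           # baseline annual miles driven
sd_miles = human_miles * 1.3   # 30% more miles, out of convenience

human_deaths = HUMAN_RATE * human_miles / 1e8
sd_deaths = SD_RATE * sd_miles / 1e8

# Despite the better per-mile rate, total deaths go up.
print(f"human: {human_deaths:,.0f}, self-driving: {sd_deaths:,.0f}")
```

On these assumptions the per-mile rate drops about 9%, but total deaths rise from roughly 35,200 to 41,600.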

So if self driving cars killed 10 people by stopping in the road, but saved 1000 by never hitting pedestrians, that would be a bad trade?

Another thing that gets missed is this: With self-driving cars every single fatal accident will be investigated, and the result of that investigation will be used to improve the software on all cars in the network.

In this way the rate at which cars kill people will decline, similar to the systematic decline in airplane deaths over the years.

So maybe it's worth initially accepting a self-driving fatality rate that is slightly above the human-driving rate?


I don't understand the hang-up about this. Self-driving cars were always going to kill people. It's not possible to be 100% safe, but even a minute improvement over human drivers would save thousands of lives a year.

And what if the self-driving cars end up causing fewer deaths than human drivers?

Self-driving cars could make a big dent in the ongoing loss of a million lives a year to road accidents. As far as I'm concerned, that's up there in the category of things you listed.

Not to dismiss the tragedy of this incident, but it should be expected that self-driving cars kill pedestrians -- just at a rate lower than what's expected from human drivers.

Perhaps there's a better metric to look at, but I'd like to see the number of deaths caused per mile driven.

If Uber's self-driving cars are killing more pedestrians than a human driver would, we have a huge problem, but I'd be willing to bet they're at least an order of magnitude safer in this respect.
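The deaths-per-mile comparison suggested above can be sketched as follows. The human-driver inputs (roughly 36,560 deaths over about 3.2 trillion vehicle-miles in the US in 2018) are approximate public estimates; the self-driving figures are placeholders, since a real comparison would need actual fleet mileage and incident counts:

```python
def deaths_per_100m_miles(deaths: int, miles: float) -> float:
    """Fatality rate normalized to deaths per 100 million vehicle-miles."""
    return deaths / (miles / 100_000_000)

# US human drivers, 2018 (approximate public figures).
human_rate = deaths_per_100m_miles(36_560, 3.2e12)

# Hypothetical self-driving fleet: 1 death over 10 million fleet-miles.
sd_rate = deaths_per_100m_miles(1, 10_000_000)

print(f"human:        {human_rate:.2f} per 100M miles")
print(f"self-driving: {sd_rate:.2f} per 100M miles")
```

Note how the tiny denominator dominates: with only 10 million fleet-miles, a single fatality yields a rate of 10 per 100M miles, far above the human baseline of about 1.14, which is why per-mile comparisons stay statistically shaky until self-driving fleets accumulate billions of miles.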


As they should, given that self driving tech is orders of magnitude less safe and leads to fatalities.

This argument is silly, since it assumes that self-driving is somehow both the fastest and most efficient way to save people's lives. We have working technologies we could deploy today that would save lives (breathalyzer interlocks for cars, visibility measurement plus warnings, etc.). Framing self-driving efforts as life-saving ignores the opportunity cost of pursuing them instead.

It depends on whose safety is increased and decreased.

Car infrastructure is notorious for focusing on the safety of drivers rather than pedestrians.

If self-driving cars increase safety for drivers and are worse for everybody else I am not sure it is a good trade-off.

In this regard we can adapt the saying about electric cars to self-driving cars: they are here to save the car industry, not you.


Tell that to the pedestrian dragged under a car that didn't know how to behave in an emergency. Or the other pedestrian killed by that other self-driving car.

Adding self-driving won't remove unsafe drivers from the road; it'll remove professional drivers, who are largely safe and law-abiding. They'll be replaced with moronic robots, and the most unsafe people will still be on the road.

You can take your self driving future and shove it up a different timeline.


Right now, as of this moment, self driving cars are much safer than human drivers.

Every accident that happens and every person who dies is a death that could have been easily prevented if companies had been allowed to go to market last year.


This seems pretty reasonable and also very possible to achieve. It would be insane to allow a technology on the streets that makes as many mistakes as humans are making. I certainly wouldn't use self-driving cars if they killed 30,000 people per year, like humans are doing right now.

How would you assign responsibility for crashes? Our current system is far from perfect, but at least it's something people understand and know how to navigate. And there are drivers that are better and more cautious than others. So it's not just an illusion of control.

Every year young drivers die because they were inexperienced and didn't realize they were going too fast or too slow for a certain situation.

Once full self-driving is statistically safer than humans, how will you not let people use it? It is like saying you would rather have 10 children die because of bad driving skills than have 1 child die because they were not paying attention at all times.


Human-driven cars are currently, as we speak, running people over in the streets, and they have human drivers who don't particularly want to run over other humans. It was inevitable that this happened, and no matter how many people self-driving cars run over, it will be worth it, since it will still be fewer than the current death toll from cars.

Wouldn't that make insurance cheaper for autonomous cars and more expensive for human-driven cars since humans would be more of a liability behind the wheel?
