It would be massive news, and we would still hear about it.

(There has been one death so far, with Uber behaving super irresponsibly: https://www.jefftk.com/p/uber-self-driving-crash)




The news would be different. This news item is focused on the driver and safety. They are slowly changing their messaging from being about self-driving to being about safe driving.

They can't come out right away and say "guys, we failed at self-driving, it doesn't work"; that would tank the careers of everyone important in the company. They have to slowly move the needle away from self-driving, and then they'll drop the bomb: "we could only build better safety systems, not self-driving systems," and nobody will care because they're happy with safe driving.

If they truly had self-driving tech, they'd be doubling down on that with their messaging. Not only to let everyone know there's a new sheriff in town (bye Uber!), but also to let their competitors know to start their death clocks.


If autonomous driving had been off, then Uber would be the first to say so and avoid blame for this death.

Thing is, it wasn't just one incident, it was just one incident that resulted in a death.

When Ubers started self-driving, it took just a few hours before there were videos on Twitter and YouTube of them driving right through red lights without a care in the world.


To put this accident in perspective, Uber's self-driving cars have totaled about 2 to 3 million miles, while the fatality rate on US roads is approximately 1.18 deaths per 100 million miles [1].

[1] https://www.nhtsa.gov/press-releases/usdot-releases-2016-fat...


I wonder how the public would have responded if the person killed in that Uber accident a while ago had been a kid on a bicycle, and what the response will be when/if a Tesla on Autopilot plows into a school bus (stationary objects partially in a lane are a situation they seem to have problems with).

That's a really good point. I also wonder if the cyclist accident changed the mood and/or culture at Uber when it comes to autonomous vehicles. I have to be honest that it would absolutely gut me inside to know that my software caused a death.

Then why did the multiple deaths caused by Uber and autopilot happen?

It seems like a bunch of what-ifs that normally come up with self-driving cars are about to get answered and precedents are about to be set.

I assume this case will also be one of the best-recorded fatal car accidents in history, given the number of sensors and amount of equipment on board a self-driving car, along with eyewitness testimony from the operator on board.

Can't tell if Uber has just been incredibly unlucky as of late, or if enough of their employee base is incompetent to prevent them from having a quiet year with no large failures.


Given exponentially distributed distances between fatalities, this would have about a 3% chance of happening if Uber's cars were as safe as human drivers. So it's unlikely.

The numbers that I have seen indicate that humans have a fatal accident roughly once per hundred million miles, while we have one fatal accident for a self-driving car, with somewhere around ten million miles driven across all self-driving cars.

I've heard Uber is at around 3 million self-driving miles.

So Uber would be 30x worse than humans.
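
For anyone who wants to check the arithmetic above, here's a minimal sketch assuming the rough figures quoted in this thread (about 1 human fatality per 100 million miles, about 3 million Uber self-driving miles); these inputs are the thread's estimates, not official statistics:

    import math

    # Back-of-the-envelope check using the rough figures quoted in this
    # thread; none of these numbers are authoritative data.
    human_rate = 1.0 / 100e6   # ~1 fatality per 100 million miles for human drivers
    uber_miles = 3e6           # ~3 million Uber self-driving miles

    # If fatalities arrive as a Poisson process (exponential gaps between them),
    # the chance of at least one fatality in uber_miles at the human rate is:
    p_at_least_one = 1 - math.exp(-human_rate * uber_miles)
    print(f"P(>=1 fatality in 3M miles at human rate): {p_at_least_one:.1%}")  # ~3%

    # Naive observed-rate comparison: 1 fatality over ~3 million miles
    uber_rate = 1.0 / uber_miles
    print(f"Observed Uber rate vs. human rate: ~{uber_rate / human_rate:.0f}x")  # ~33x, i.e. roughly "30x worse"
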


Currently, Uber self-driving vehicles are clocking in at a fatality rate of about 50x the rate for human drivers.

You can bring your hypotheticals to bear all you like, but this is a serious problem.


What if Uber’s first self-driving car killed a cyclist in its very first mile of operation? Would you find it equally hard to draw conclusions?

It appears that's pretty close to what happened here.

http://fortune.com/2018/03/19/uber-self-driving-car-crash/


You would lose that bet. As another comment pointed out[1]:

> The NHTSA reports a fatality rate of 1.25 deaths per 100 million miles[2], twenty five times the [4 million miles] Uber has driven.

So they should have driven around 75 million more miles before getting their first fatality, in order to remain even with humans. Not to mention they've been driving on the clearest/sunniest roads in only a few cities.

Of course a sample size of one is not enough data, but I'd say we should err on the side of "we have a huge problem".

There should have been so many safety precautions in place that nobody would have died from this yet.

[1] https://news.ycombinator.com/item?id=16620736
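
For what it's worth, the "around 75 million more miles" figure follows directly from the quoted NHTSA rate. A quick sketch, assuming 1.25 deaths per 100 million miles and the ~4 million Uber miles stated above (the thread's numbers, not official data):

    # Assumes the figures quoted in this comment: 1.25 deaths per 100 million
    # miles (NHTSA) and ~4 million Uber self-driving miles.
    nhtsa_rate = 1.25 / 100e6          # deaths per mile for human drivers
    miles_per_death = 1 / nhtsa_rate   # expected miles between fatalities at that rate
    uber_miles = 4e6

    print(f"Expected miles per fatality at the human rate: {miles_per_death / 1e6:.0f} million")             # 80 million
    print(f"Extra miles needed to break even with humans: {(miles_per_death - uber_miles) / 1e6:.0f} million")  # ~76 million
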


Already happened: https://en.wikipedia.org/wiki/Death_of_Elaine_Herzberg

The end result is that Uber ended their self-driving car program and paid the family money, while the grunt worker who was put in the car without proper training got charged with negligent homicide, and that trial is still ongoing.

The person responsible for that program should be charged, not the safety driver, but that is how things go. You can't just put random people in a car and call them safety drivers, and charging such a person with homicide doesn't really get to the root of the problem.



Probably just the death of the Uber self-driving program, although that could perhaps spell the company's death sometime in the future.

An Uber self-driving car killed a person.

I did wonder that; the article didn't actually specify what happened to the Uber driver, beyond the inevitable heart attack when your stupidity embeds your Honda Civic into the side of a $3m supercar.