
I know these kinds of situations are frightening, but as long as Autopilot is statistically significantly safer over comparable driving, it's hard to argue against using it.



A human and Autopilot working together is safer than just a human driving. Autopilot by itself is currently less safe than just a human driving (which is why it's still level 2). There's no mixed messaging.

If autopilot is slightly better than a human driver, why not use it?

It's an incredibly low standard, sure, but more lives are saved, and it only stands to get better from there.

The only problem is the public's perceived unwillingness to accept such risks in using autopilot.


Autopilot has prevented far more accidents than it has caused.

So you say. Autopilot has already been responsible (in part) for at least one death, and hasn't seemed to slow adoption or interest at all. Look at driving itself: there are all sorts of horrible accidents that we see on the side of the road, but we still get behind the wheel every day. People are very willing to accept risk for convenience.

Poorly maintained autopilots would still be safer than poorly maintained drivers.

Autopilot is at most evidence that human-machine systems are better than just human drivers. Throughout its history, "autopilot" has had many cases where the human saved its ass and quite a few where it managed to kill the human anyway.

And yet, according to the NHTSA, cars are still much safer with it than without it. You seem to be smack-talking rather than offering a reasoned analysis of whether Autopilot is better than no-Autopilot.

So it's a tossup: either Autopilot will save you or kill you.

Well, if Autopilot is statistically safer than a human pilot (I don't know if it is), then that's a reason to use it.

I don't think that's the correct answer. Flight autopilots have lowered accident rates tenfold, but every time there is a crash while on autopilot Boeing/Airbus will ground every single plane of that type and won't allow them to fly until the problem is found and fixed. If I know that my car's autopilot is statistically less likely to kill me than I am to kill myself, I would still much much rather drive myself.

I'd accept the claim that autopilot is dangerous if it had brought a car into a state that an alert, situationally aware, and normally skilled driver could not recover from. That's not what happened. These people died because they thought they had better things to do than monitor the traffic.

For every tragedy, there are tragedies avoided. I can attest to a few. In the last 10,000 miles, Autopilot has: safely swerved to avoid a car that would have sideswiped me, preemptively braked after detecting the 2nd car ahead (not visible to me) had slammed on its brakes, and avoided colliding with a completely stopped vehicle in the center lane of the freeway.

And FWIW I've never felt misled about Autopilot's capabilities. I started off skeptical and it's since earned my partial trust in limited scenarios. Ironically its hiccups actually make me more attentive since, ya know, I don't want it to kill me.


It's perfectly reasonable for me to make a determination and offer my first hand experience about Autopilot's capabilities relative to that of my own and that of other drivers I see on the road.

Accidents might be a long-tail phenomenon, but driving habits are observable. Autopilot has a consistent, safe, alert demeanor on the road. It reacts to vehicles I can't see as quickly, and to changing situations that would take me an extra half second or more to react to.

Compare that to the average human driver on the road - a third or more of whom you can expect to see holding a phone - and I think it's a reasonable conclusion to say it's already at least marginally better. And if we are talking about accidents, you can't deny that people are generally terrible at choosing the best possible reaction to an emergency situation that happens in the space of a few seconds.


The driver has ultimate responsibility for operating their car safely. Autopilot does not change that; it says so in the manual.

It doesn't matter; the point is that relying on autopilot opens up crash risks that do not exist with manual flight controls.

At this point I'm starting to be willing to consider it fairly safe when the driver is paying attention. Every press article about an autopilot crash seems to involve somebody who is completely inattentive. I can't recall any published incidents involving a driver using it as an assist, while continuing to pay attention.

Do autopilots need to be safe, or just safer than people?

A comparison also needs to account for the fact that Autopilot is really only used on highway-like roads, which are much safer than most other types of driving.

How can you say autopilot is safer when the human driver has to take over whenever they see the autopilot isn't up to the task? Isn't that prima facie evidence that the human driver is safer?
