
You've always been at risk from other road users, but that risk is well understood when you get in a car and drive somewhere.

What's new, on top of that risk, and not well understood, is the risk from drivers misusing a half-baked feature named "Autopilot" as if it were an actual autopilot, because Musk insists that it works when it clearly does not always work.




I don't think anyone was asking "What is the purpose in having a bug in your autopilot?"

I think people were asking "Is your autopilot actually dependable/safe?"

And that latter question is what Musk was responding to.


Indeed, and not only that: the example of Autopilot supposedly saving lives that Elon Musk famously tweeted a while back was actually manual driving plus active driver assistance in the form of automatic emergency braking (which is supposed to become mandatory on new cars at some point). It happened in conditions far outside what Autopilot was capable of actually driving in.

The vehicle's clearly worded warnings can be at odds with the reality of how Tesla markets its vehicles. If you don't believe me, read your car's manual some day. The warnings go so far as to pretty much tell you never to drive at all.

If a warning makes a feature less than advertised, or is impractical to follow in general, the company has still fallen short of its duty. The problem with Autopilot is that basic human nature will make you less attentive once it takes that level of control. You can't warn that away.


It's in beta for a reason, and before using it you have to accept a disclaimer clearly stating that you should not use it on roads without a concrete barrier between you and oncoming traffic, and that you should not use it on roads with sharp turns. This guy was clearly looking for the limit of Autopilot, and found it, nearly at the expense of others.

I think the problem is deeper than 'is Autopilot better than no Autopilot?'. Under what conditions is Autopilot better than no Autopilot? Is Autopilot being marketed to customers in a way that encourages them to use it in the wrong conditions? How are changes to the Autopilot code being vetted before being deployed OTA to customers?

I think that, used properly, Autopilot can be safer than most drivers without it, but the crux of the issue is that many people will trust the system more than they should, which could make them more dangerous. We're in a dangerous in-between state with autonomous driving, where it's just good enough to fool some people into thinking it's foolproof when it's not.


Autopilot is a driver assistance feature, not a safety feature.

It's called autopilot, not self-driving.

There are other people on the road, it's not just the drivers that are impacted by this.

Calling it "Autopilot" is what's irresponsible.


Autopilot sounds about as safe as driving while intoxicated, which makes Autopilot users massive egoists.

I think one of my main concerns with "Autopilot" is that for a lot of drivers it will absolutely make the roads safer, both for them and for those around them on the road. Conversely, for some safer and more alert drivers, it has the potential to make driving less safe.

>removing radar sensors to transition to a camera-based Autopilot system

A few weeks back, I had a terrible experience while using Autopilot. I was driving on a highway (in CA) with Autopilot engaged. For the most part there was a concrete median on the highway. Suddenly a section came up with no concrete median, and a new left-turn-only lane was added. For whatever reason, Autopilot thought it was a great idea to suddenly steer to the left while there was oncoming traffic. I immediately took control, but the car did wobble a bit. My heart kept racing with an adrenaline rush for the next half hour. I haven't engaged Autopilot since, and I can't trust it anymore: it couldn't deal with a dead-simple scenario of a clearly marked lane being added.


After reading about this for a while just now, I'm much less comfortable with the idea of trusting an automotive autopilot system. Just imagine if your taxi driver told you, "I will drive perfectly safely, except if there is a truck parked in the road; then I will run into it at speed." It seems like such a glaring omission. If I were an engineer, I don't think I would be comfortable releasing an autopilot system with this kind of safety issue.

The whole concept is flawed.

We cannot expect humans to be able to pay attention and be able to take over at a moment's notice at all times, simply because that's not how our brains work.

Autopilot is in the danger zone, where it's good enough to make your brain relax, but it's bad enough to require your brain to not do that. So it's fundamentally unsafe.

Cruise control, in contrast, isn't good enough to let your brain relax; you have to keep paying attention or you'll crash very quickly.

And this is all made much, much worse by Elon's and Tesla's irresponsible marketing, and the believers who dismiss this as a "worthless discussion".


Autopilot in aircraft is no different; you must remain aware and attentive at all times, because it exists merely to reduce physical workload. If you make a mistake, you are still at fault.

EDIT: I can't reply to child comments because HN is throttling my posting ability.

It's not Tesla's job to turn drivers into model citizens. They can't fix irresponsible people taking irresponsible risks.


Surely there's some sort of "autopilot does not work on all roads" disclaimer, right? That's why you're supposed to keep your hands on the wheel and pay attention.

> the autopilot is a driving aid and should not be used unobserved.

That's not an autopilot, is it? It's driver assist, which is what every other car manufacturer calls it (and, incidentally, they don't have these problems).


Proof that so-called autopilot systems are a danger to the public would be evidence that they perform worse than the average driver. I haven't seen such evidence; what are you referencing?

There is a simple solution for this. If you think you are an above average driver, don't use autopilot.

The problem, as I see it, is that the longer a driver uses the Autopilot feature, the less attention they will pay to it, as long as it doesn't do something nuts like this. My understanding is that this occurred soon after a new 'beta' update to the Autopilot system. I'm not sure how that is surfaced to the user, whether you need to opt into the new version, etc.

My fear is that a new version of the Autopilot system could introduce new bugs that disrupt driving on a route the user is confident the previous version could handle. They commute on route X every day and over time have gained confidence in Autopilot on that route. The new update, however, is going to run them into some obstruction because of a bug. That obstruction might be the monorail pylons we see in the Seattle video, or it might be a crosswalk. A driver who trusted the automation might well be distracted and unable to correct the situation before tragedy.

IMO Autopilot should be banned until it can be proved to be 100% safe. I don't think we can get there until roads are outfitted with some kind of beaconing system that in-car sensors can read, and cars on the road are networked together cooperatively... and even then it should only be enabled on roads that have those markers/beacons.

People in this thread deciding that the system is safe because it's no worse than a drunk driver or a student driver are missing the point. We absorb those hazards into the system because they account for only a handful of agents in it. Out of 100,000 drivers in a rush-hour flow, how many are students and/or drunk? Probably very few. However, as Teslas keep selling with this feature, our new class of hazard agents keeps going up and up and up (a rough sketch of that scaling is below).

Hell, we wouldn't even have to make it illegal at the political level. Perhaps the insurance industry will do it for us.
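To make that scaling point concrete, here is a minimal back-of-the-envelope sketch in Python. Every number in it (flow size, risky-driver shares, per-agent incident probability) is a made-up placeholder chosen only for illustration, not real data; the point is simply that a roughly constant share of drunk/student drivers gives a constant exposure, while a share that grows with sales gives a growing one.

    # Back-of-the-envelope sketch of the "growing class of hazard agents" argument.
    # All numbers below are hypothetical placeholders, NOT real statistics.

    def hazard_exposure(total_drivers: int, risky_share: float, per_driver_risk: float) -> float:
        """Expected risky incidents in one traffic flow:
        (number of risky agents) x (incident probability per risky agent)."""
        return total_drivers * risky_share * per_driver_risk

    TOTAL = 100_000          # drivers in a rush-hour flow (placeholder)
    PER_DRIVER_RISK = 0.001  # incident probability per risky agent (placeholder)

    # Drunk/student drivers: a roughly constant, small share of the flow.
    baseline = hazard_exposure(TOTAL, risky_share=0.005, per_driver_risk=PER_DRIVER_RISK)

    # Autopilot-equipped cars: the share grows year over year as more are sold.
    for year, share in enumerate([0.01, 0.03, 0.06, 0.10], start=1):
        exposure = hazard_exposure(TOTAL, share, PER_DRIVER_RISK)
        print(f"year {year}: autopilot share {share:.0%} -> "
              f"expected incidents {exposure:.1f} (baseline {baseline:.1f})")

Swap in whatever shares or risk numbers you find plausible; the only thing the sketch is meant to show is that exposure grows linearly with adoption while the drunk/student baseline stays flat.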

