
I had a scary experience in a Model S on auto-pilot going over a bridge, where the car swerved left and almost hit the concrete barrier. I'm not entirely sure what happened, but since then I've been hesitant to use auto-pilot 100% of the time.



On my first drive with Autopilot, it tried to drive into the back of a stopped car in a turning lane, and it randomly had phantom-braking issues, like when going under overpasses.

Not to mention there are things most people do defensively while driving that Autopilot doesn't, like anticipating a vehicle coming into my lane by looking at both its wheel position and its lateral movement. Autopilot ignores those cues until the car is basically in your lane (a toy sketch of that kind of cue fusion follows this comment).

Personally, I feel I have to be more on guard and attentive with it on, because I know there are fatal bugs.
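To make that cut-in anticipation concrete, here is a toy sketch of the cue fusion described above: flag a neighboring car as a likely intruder from its heading and lateral drift, before it physically crosses the line. Every name and threshold is invented for illustration; this is not any real ADAS code.

    # Toy sketch of the defensive cue fusion described above: predict a
    # cut-in from a neighbor's heading and lateral drift, *before* it is
    # actually in the lane. All names/thresholds are hypothetical.
    from dataclasses import dataclass

    @dataclass
    class NeighborTrack:
        lateral_offset_m: float   # distance to my lane boundary (+ = away)
        lateral_speed_mps: float  # + = drifting toward my lane
        heading_rad: float        # angle vs. lane direction (+ = toward me)

    def likely_cut_in(t: NeighborTrack, horizon_s: float = 2.0) -> bool:
        """True if the car points toward my lane AND its drift would
        cross the boundary within the lookahead horizon."""
        if t.lateral_speed_mps <= 0:
            return False              # drifting away or holding its lane
        time_to_boundary = t.lateral_offset_m / t.lateral_speed_mps
        return t.heading_rad > 0.05 and time_to_boundary < horizon_s

    # 1.2 m away, drifting in at 0.8 m/s, nose angled toward us -> True
    print(likely_cut_in(NeighborTrack(1.2, 0.8, 0.1)))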


I drove a friend's Model 3, and within five minutes of driving on autopilot it got confused at an intersection and tried to make a pretty sudden 'lane change' to the wrong side of a divided road.

Obviously that's a single anecdote, and I don't know if it would have gone through with it, since I corrected it immediately, but that was my experience.


Bunk. Autopilot makes very strange decisions during normal operation -- decisions that normal human drivers would NOT make, such as merging into an entrance lane on the freeway. This by itself is enough to confuse other drivers on the road, which makes it more surprising and dangerous than a human driver. Worse, it will actively fight the driver at the wheel to carry out these decisions. Source: I test drove a Model 3; it did exactly this, nearly caused an accident during the test drive, and I could not get it to stop.

>removing radar sensors to transition to a camera-based Autopilot system

A few weeks back, I had a terrible experience while using autopilot. I was driving on a highway (in CA) with autopilot engaged. For most of the stretch there was a concrete median. Suddenly a section with no concrete median came up, and a new left-turn-only lane was added. For whatever reason, autopilot thought it was a great idea to suddenly steer left into oncoming traffic. I immediately took control, but the car did wobble a bit. My heart kept racing for the next half hour. I haven't engaged autopilot since, and I can't trust it anymore -- it couldn't deal with the dead-simple scenario of a clearly marked lane being added.


Model S and X owner here. Autopilot will nag you after so many seconds (depending on vehicle speed and Autopilot path-planning confidence) if it doesn't detect steering-wheel torque, and if you ignore the nags, it brings the vehicle safely to a stop with the hazards on (a rough sketch of this escalation is below).

Car and Driver tested AEB in several vehicles (Tesla, Toyota, Subaru) and they are all about equally bad at it. Figuring out whether the matter in front of you, while traveling at speed, poses a life-critical hazard is hard! It's better than no AEB, but don't rely on it entirely or you're going to end up maimed or dead. As the driver, you are still the final responsible party.
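The nag-and-stop behavior described above is essentially a small escalation state machine. Here is a minimal sketch under invented thresholds; per the comment, the real timings vary with speed and path-planning confidence.

    # Minimal sketch of the torque-nag escalation described above. The
    # thresholds are assumptions; the real ones reportedly vary with
    # vehicle speed and path-planning confidence.
    NAG_AFTER_S = 15.0      # hypothetical: seconds without wheel torque
    MAX_IGNORED_NAGS = 3    # hypothetical: nags before a safe stop

    def monitor_step(seconds_since_torque: float, nags_ignored: int) -> str:
        if seconds_since_torque < NAG_AFTER_S:
            return "normal"
        if nags_ignored < MAX_IGNORED_NAGS:
            return "nag_driver"         # visual + audible "apply torque"
        return "stop_with_hazards"      # slow to a stop, hazards on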


One of the things you learn in flying is not to force it: if conditions are not safe, just forget about it. It seems the same kind of judgement is needed for auto-pilot. If your auto-pilot acts up at all, just turn it off and don't use it until the problem is resolved. All it takes is one incident to be dead, so if you get a chance to observe any abnormalities, consider it a blessing.

Something that would also be great is a sort of "crash dump/bug report" button for these cars. If at any time your car does something unsafe, you can hit that button and the car will save the last 60 seconds of data, so the manufacturer can analyze it to debug and figure out what went wrong (this is sketched below).

I was so excited about auto-pilot and dreamed of getting in my car and sleeping while it made cross-country trips. So much for that; it seems way far out.
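The "crash dump/bug report" button proposed above amounts to a rolling telemetry buffer that gets frozen on demand. A minimal sketch, with the frame rate, fields, and output format all invented:

    # Minimal sketch of the proposed bug-report button: keep a rolling
    # window of telemetry and snapshot it when pressed. Frame rate,
    # fields, and output format here are all hypothetical.
    import collections, json, time

    WINDOW_S, HZ = 60, 10                   # "save the last 60 seconds"
    frames = collections.deque(maxlen=WINDOW_S * HZ)

    def record_frame(speed_mps, steering_rad, autopilot_on):
        frames.append({"t": time.time(), "speed_mps": speed_mps,
                       "steering_rad": steering_rad, "ap": autopilot_on})

    def bug_report_pressed(path="incident.json"):
        """Freeze the last 60 s of frames for the manufacturer to analyze."""
        with open(path, "w") as f:
            json.dump(list(frames), f)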


You've always been at risk from other road users, but that risk is well understood when you get in a car and drive somewhere.

What's new, on top of that risk, and not well understood, is the risk from drivers misusing a half-baked feature named "autopilot" as if it were an actual autopilot -- because Musk insists that it works when it clearly does not always work.


I know too much about computers and software to trust any driving auto pilot. Heck, even normal adaptive cruise control makes me very nervous.

Are you talking about the highway, or roads with stoplights? Normally I rarely have issues when using Autopilot on the highway. On all other roads, though, it definitely isn't worth it.

I can't believe this person kept using it. If I had noticed a bug in auto-pilot and complained about it, I would be way too scared to ever use auto-pilot again. Personally, I never use auto-pilot because driving is piss easy, as it's designed to be.

Perfect self-driving is a nearly impossible feat to accomplish on an unbounded track. I can only imagine automated driving working in a system that leaves no room for error. Examples include tunnels under the ground, chain links on the ground (as in trolleys, trains, etc.), or anything else that vastly reduces the entropy involved in driving.

With self-driving cars on current roads, it will probably take years to get from a 1% error rate to .1%, and decades to get from .1% to .01%, which still isn't good enough. It may take a century or longer to develop the artificial intelligence required to make self-driving cars "perfect enough". There's just too much room for unique problems to spawn; bounding vehicle freedom seems to be the only way forward.
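To put rough numbers on those percentages, here is a back-of-envelope calculation under the (strong, invented) assumption that an "X% error" rate means X errors per 100 miles driven:

    # Back-of-envelope for the error rates above, assuming (arbitrarily)
    # that "X% error" means X errors per 100 miles driven. The US VMT
    # figure is a rough pre-2020 number (~3.2 trillion miles/year).
    US_ANNUAL_MILES = 3.2e12

    for rate in (0.01, 0.001, 0.0001):      # 1%, .1%, .01%
        print(f"{rate:.2%}: one error per {1/rate:,.0f} mi, "
              f"~{US_ANNUAL_MILES * rate:,.0f} errors/yr fleet-wide")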


After reading about this just now, I'm much less comfortable with the idea of trusting an automotive autopilot system. Just imagine if your taxi driver told you, "I will drive perfectly safely, except if there is a truck parked in the road. Then I will run into it at speed." It seems like such a glaring omission. If I were an engineer, I don't think I would be comfortable releasing an autopilot system with this kind of safety issue.

You continue using this feature despite these errors occurring daily? The way you describe it makes autopilot seem like an inexperienced teenage driver; I think whatever marginal safety benefit you get from it might be erased by the life-shortening stress such erratic car behavior must induce.

Absolutely. Stretches where the road markings are a mess can make Autopilot go crazy, unable to decide which 'lane' to stay in; I've also experienced the whole veering-toward-the-concrete-barrier thing. Each time, I was able to anticipate that Autopilot would have trouble with what was coming up and took over. You just have to apply a bit of force to the steering wheel, and Autopilot gives an auditory and visual alert and disengages. I usually re-engage after the danger has passed.

I have a Model 3 and use Autopilot on a daily basis. I think the key to using it properly is understanding its limitations. I always keep an eye on the road and traffic conditions and anticipate whether Autopilot can handle a situation or not, so I can be ready to take over. I would say it can handle 70-80% of driving without issue; it's the other 20-30% I keep an eye out for when I'm using it.

It's made me a better driver: I'm no longer a speed demon, and I don't care if people cut me off or get in front of me in traffic. It's like being chauffeured; I'm less emotionally involved in the driving process.


I saw a friend 'drive' with autopilot, and it was scary. He was basically dozing at 80 MPH. Made me rethink my aspiration to have such a vehicle.

My friend has one and uses the auto-pilot responsibly; it works very well. Until someone shows me concrete evidence that autopilot is responsible for accidents at a rate higher than normal human driving, this seems like a huge waste of time and fear mongering.

I know these kinds of situations are frightening, but as long as AutoPilot is significantly safer, statistics-wise, over similar driving, it's hard to argue otherwise.
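For what it's worth, checking "significantly safer, statistics-wise" would mean comparing crash rates per mile with uncertainty attached. A crude sketch with entirely made-up counts (a real analysis would also control for road type, weather, and driver mix):

    # Crude rate comparison with made-up numbers -- not real Tesla data.
    # Uses a normal approximation to the Poisson CI for crashes per mile.
    import math

    def rate_ci(crashes, miles, z=1.96):
        """Crash rate per million miles, with ~95% confidence interval."""
        rate = crashes / miles * 1e6
        half = z * math.sqrt(crashes) / miles * 1e6
        return rate, rate - half, rate + half

    for label, crashes, miles in [("autopilot", 30, 150e6),
                                  ("manual   ", 90, 200e6)]:
        r, lo, hi = rate_ci(crashes, miles)
        print(f"{label}: {r:.2f}/1M mi (95% CI {lo:.2f}-{hi:.2f})")

    # Non-overlapping intervals would be weak evidence of a real gap;
    # confounders (road type, weather, driver attention) still apply.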

The autopilot may reduce driver attentiveness, making the edge cases in its automation more deadly than they might've been without it.

> Anyone who has driven with autopilot, how quickly might it react to a perceived obstacle? Would it take a hard turn into a median faster than a person could reasonably react if it thought there was something in the road or that the road took a hard left?

The only thing my car has ever abruptly/unexpectedly done is brake, usually in response to an overpass it mistook for a stopped car. It's happened maybe 10-20 times in 10,000 miles -- not enough to cause a rear-end collision or even an angry honk. I've always had time to take over.

It's never taken a hard turn. The only times it catastrophically failed were when I knowingly put it in confusing situations: winding roads, poorly marked lanes, city driving, etc. In those cases I always have a death grip on the wheel, and the software seems to literally loosen its hold when it's feeling uncertain.

