
Hullo, another Model 3 driver here.

I mentioned this in another comment, but it's relevant here: OP wasn't using Autosteer or Autopilot, they were using cruise control, a far simpler system, and that system behaved in a way that seriously reduced confidence in it.

(For the record, I've experienced exactly the same behavior in my M3 as well.)

So, if we can't trust something as simple as cruise control, and we've seen similar confidence-eroding issues in the autopilot features Tesla has already deemed "good enough" for a live release, I think it is 100% reasonable to be skeptical that Tesla is anywhere close to "solving self-driving."

They have a history of overblown claims and faulty software in this realm. As much as I adore my car and the disruption Tesla has brought to the industry, Tesla does not deserve the benefit of the doubt here, especially not with safety-critical systems.




I was an early-ish adopter of the Model 3 (2018), and I couldn't be happier with it. I've had zero hardware quality issues, and one software issue where it did a cold reboot one day and forgot my personalization settings. But otherwise it's an absolute pleasure.

However, big caveat: I absolutely do not use Autopilot, nor do I ever plan to. I bought the car because I love to drive it -- the way it handles, the way it accelerates, etc. The only "automatic" feature I've used is literally the cruise control, i.e. maintaining a set speed on the highway. That's the farthest I trust it to do anything automatically.

Being tangentially involved in the field of machine vision / machine learning, I understand the limitations of the current technology, and am appalled at how brazenly Tesla is rolling out this half-baked functionality to users, and marketing it as if full self-driving is just around the corner, when in fact it has a long, long way to go.


I love my Model 3, but I cannot overstate how little confidence I have that Tesla is anywhere near to full self driving.

Just this weekend I drove a 600 mile round trip. It was cold weather with some snow on the shoulders, but the actual road surfaces had no snow. I was mainly using just the cruise control, and no Autosteer. Even with just cruise, the car gets "spooked" from time to time and will suddenly jerk to a slower speed for absolutely no reason, even during the day and with no precipitation. At one point it did this several times within the span of 10 minutes, which was such an unpleasant experience that it made me want to stop using cruise entirely. Besides being jarring for the people in the car, it makes me worry that anyone tailgating will rear-end me.

Besides that, the rain/sleet that was falling from time to time apparently blocked the camera, so at one point the car wouldn't even let me use cruise any more.

The car is tons of fun to drive and I don't regret the purchase at all. But I really don't like the overpromising on full self-driving.

EDIT: bmcahren pointed out to me that the new FSD beta software merges images from multiple angles, which my software does not do. I don't know much about that, but it sounds like the kind of change that could lead to a noticeably better result than what I experienced.


It’s not autopilot, it’s Autopilot. They’re making up their own branded feature called Autopilot which is nothing like what any reasonable person would consider to be autopilot.

If you go back through the years here, you’ll see increasing care they use with their language:

https://web.archive.org/web/*/Tesla.com/autopilot

Look at the snake oil they were peddling in 2016:

> All Tesla vehicles produced in our factory, including Model 3, have the hardware needed for full self-driving capability at a safety level substantially greater than that of a human driver.

Not only was this clearly false in 2016, it’s clearly false today, nearly 6 years later.


Yes, but what's worrying is their response. Instead of taking responsibility for the fail-unsafe UI design and their failure to avoid obstacles above a certain height, they are completely blaming the user.

I can't see myself trusting their cruise control or autopilot features. It's like they're saying, "Hey, our cars can drive themselves! But not really. They can maintain speed and lanes and avoid other cars on the highway! But you can't take your hands off the wheel. You can relax and let the car steer for you! But you'd better be ready to take control in a millisecond if it beeps at you. Oh, and if anything bad happens, it's your fault for using it wrong. Thanks for participating in the Tesla Autopilot Early Access Beta Program!"

No thanks.


Autopilot drives like a teenager who just got their license. It makes rookie mistakes and often behaves in unpredictable ways, which is extremely unsafe. It loves to hang out in blind spots and cannot anticipate even the most obvious intentions of other drivers.

I really like traffic aware cruise control as one of the best adaptive cruise controls out there, but autopilot is a mess and I don't use it.

I haven't tried the full self driving beta since I wasn't about to give Tesla ten grand so the car can change its own lanes, but I can't imagine it's any better.


From Wired's "People Keep Confusing Their Teslas for Self-Driving Cars" (https://www.wired.com/story/tesla-autopilot-crash-dui/):

A spokesperson pointed out that the owner's manual reads, “Autosteer is not designed to, and will not, steer Model S around objects partially or completely in the driving lane”.

I haven't seen the encouragement to not pay attention. In some ad on TV, maybe it was for a new Cadillac, the driver opens a soda with both hands off the wheel, and that seemed to be as far as they were willing to go.


This is sad and tragic.

I remember going with a friend on a Tesla test drive. He was eager to try the Autopilot system, and within a few minutes we got freaked out by what seemed to us like unpredictable behavior, so we turned it off.

I think the engineer should never have relied on it, especially if he had concerns about the system.

At the same time, "Autopilot" is misleading and Tesla does bear responsibility here. Even if it's purely a marketing term, it's designed to upsell you on half-baked technology.

They should call it "Driver assist" or something like "cruise control plus".

It's a heartbreaking story.


The biggest mistake here is conflating Autosteer with Full Self Driving (FSD). This isn't necessarily the author's fault as Tesla uses the word Autopilot to mean Autosteer, but any Tesla owner should know this.

Either way, Autopilot and Full Self Driving are not one and the same, and it's made pretty clear to the end user that Autosteer is not for obeying street signs, etc. It's a simple lane-assist, cruise-control assist, and lane switcher when the turn signal is pressed (if you pay for the lesser of the upgrades).

That being said, Tesla's FSD is also not very reliable; in my own personal testing it has aggressively "shot the gap" and flown over speed bumps. However, it at least is designed to stop at stop signs and traffic signals, and it does so.


This autopilot problem is entirely the user's fault. Tesla never promised a complete driverless experience. It is users who have the idea of Autopilot as a replacement for the driver. It's not. Just because that idea has been in the public consciousness and imagination for decades doesn't make it real. People imagine Autopilot being something different than it is.

The only "fault" on Tesla's part is not being clearer about what Autopilot is for.

Also, what I would do, at least until they make it better, is raise the threshold of uncertainty above which Autopilot stops working and a human driver is required: just stop the car and force the driver to take the wheel.
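The proposal above amounts to a confidence-gated handoff policy. Here's a minimal sketch of that idea; the function name, scores, and thresholds are all invented for illustration and have nothing to do with Tesla's actual software:

```python
# Hypothetical confidence-gated handoff policy. All names and
# thresholds are made up for illustration.

ENGAGE_THRESHOLD = 0.95   # confidence needed to keep assistance active
RESUME_THRESHOLD = 0.99   # stricter bar to (re-)engage (hysteresis)

def handoff_decision(confidence: float, currently_engaged: bool) -> str:
    """Decide whether driver assistance may stay engaged.

    Returns "engaged", "handoff" (alert the driver and begin a
    controlled stop), or "disengaged".
    """
    if currently_engaged:
        if confidence >= ENGAGE_THRESHOLD:
            return "engaged"
        # Below threshold: don't silently continue with degraded
        # perception -- alert the driver and slow the car.
        return "handoff"
    # Engaging from a disengaged state requires a higher bar so the
    # system doesn't flap on and off near the threshold.
    return "engaged" if confidence >= RESUME_THRESHOLD else "disengaged"
```

The hysteresis gap between the two thresholds is the point: a system that cuts out the instant confidence dips, then immediately re-engages, would be as jarring as the phantom braking described upthread.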


That's a staunch overreaction. You're correct that you can equate Tesla's Autosteer and adaptive cruise control to the autopilot in an airplane; what you fail to account for is that pilots operating on autopilot are still required by law to be attentive and pay full attention to the system. This is very similar to how Tesla portrays the current generation of Autopilot. Sure, their marketing does state that all cars are capable of self-driving (i.e. they have the required hardware, bar the HW3 processor), but they never state that current-generation cars are self-driving today. It's on consumers to understand that difference and pay full attention while driving.

My mom was a flight attendant for a major U.S. airline until a few years ago, and it wasn't uncommon at all to hear of an autopilot system malfunctioning ever so slightly, steering a bit off-course due to a miscalibrated sensor; but since the pilots are actively paying attention, the error never puts the lives of those onboard in danger. I would hold Tesla's software to the same standard: the operator absolutely needs to pay full attention to the system, but there is no need for Tesla to "disabl[e] autopilot on every single Model S and Model 3 sold until the issue is found and fixed".

I think people just assume Tesla's Autopilot is more capable than it really is.

My car has adaptive cruise control and lane keep assist, but I'm not relying on either for anything more complex than sipping a drink while on the highway.


Autopilot is terrifying. I've been a huge admirer of Tesla for years. I've always wanted a Tesla. I test drove a Model 3 a few weeks ago and had a very negative experience. Auto-pilot/FSD are enabled in different ways (one pull of a stalk vs two). It shows up on the center screen as a small blue dot. There were times I wasn't sure if it was engaged, or if I needed to take control. Since the screen is to the right I had to take my eyes off of the road.

Part of the problem is me not trusting the car, but it's also because the system isn't very intuitive.


I've driven a Model 3 over the course of a few days, maybe a handful of hours in total, and based on that experience I absolutely do not feel comfortable using Autopilot and would not buy a Tesla at the moment.

It's far more janky and susceptible to confusion than Tesla makes it out to be in its marketing, and the reality is that people simply do not pay as much attention as they are required to when using it because Tesla has convinced them it's magic that's safer than anything else on the road.


> The autopilot doesn't drive the car, the driver does. The autopilot is a help just like cruise control. Tesla are very clear (apart from the naming of the feature) that the driver must still drive the car.

I'm sorry, but that's a cop-out. What's the point of an autopilot if you have to avoid every single obstacle and react to every single thing as if the autopilot didn't exist? You can't say "autopilot is safer than normal humans" (heavily implied by the blog post) and then turn around and say "but it's really the driver who's responsible for everything!" any time there's an accident.

Unlike autopilot, cruise control has very limited functionality. No driver expects it to do anything other than keep their speed, so they are still actively engaged with the road to respond to anything that takes place.


Throwback to how Tesla promoted Autopilot as recently as 2019:

>Full Self-Driving Hardware on All Cars

>All Tesla vehicles produced in our factory, including Model 3, have the hardware needed for full self-driving capability at a safety level substantially greater than that of a human driver.

https://web.archive.org/web/20190118080103/Tesla.com/autopil...

Seemed very confident about it at the time...


I agree with you. So maybe the system could expect the driver to actually drive.

I've never driven a Tesla, but I've seen a similar thing in a Citroën I rented once. On the highway, with cruise control, the car could practically drive itself. It would follow the lane, slow down if cars in front got too close, accelerate back to the set speed when they went away, etc.

Once or twice I completely lifted my hands off the wheel, and pretty quickly it would start complaining. So with this system, the driver actually had to stay engaged.
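That "start complaining" behavior is typically a simple timeout on detected steering-wheel torque. A sketch of such a driver-engagement watchdog; the timings and names are invented here, as real systems tune them per speed and road type:

```python
def engagement_state(seconds_since_torque: float,
                     warn_after: float = 10.0,
                     disengage_after: float = 25.0) -> str:
    """Toy hands-on-wheel watchdog keyed to time since steering torque.

    Thresholds are hypothetical. Returns "ok", "warn" (escalate
    visual/audible alerts), or "disengage" (slow the car and hand
    control back to the driver).
    """
    if seconds_since_torque < warn_after:
        return "ok"
    if seconds_since_torque < disengage_after:
        return "warn"
    return "disengage"
```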


I own a Model 3 and love it. But you're 100% right: the "full self driving" claim Musk/Tesla has been making for years is bizarre. I live in a major city and have never had a trip using FSD where I didn't have to take control multiple times. At times, updates seem to make it get worse. Simple example, on a highway, it can't even keep the car in the middle of the lane through moderate curves. It almost always drifts to the outside, sometimes onto or over the lane marker. And I know it's doing this because I can watch it on the display, so the car clearly knows it's not centered. Surely this should be a trivial task (as self-driving tasks go).

You can blame the system when it misleads its users. To this day, the Model S configurator calls the system "Autopilot" and uses "Full Self-Driving Capability", "Navigate on Autopilot", and "Full Self-Driving Computer" as bullet points of things it can currently do.

https://i.imgur.com/3cQRMwZ.png https://www.tesla.com/models/design#overview

The fine print isn't a justification when you're saying as loud as possible that this is a magic machine that does everything. Describing it as such is misleading.

For comparison, here is how Cadillac describes Super Cruise, a comparably-rated driver assist suite:

"Super Cruise" drive assistance feature. That's it. That's all you get when picking out the car. It frames it as an assistance, not a replacement or autopilot. If you go into details, you get "A driver assistance feature that allows hands-free driving under compatible highway driving conditions"

https://www.cadillac.com/sedans/ct5/build-and-price/packages

If you search out their detailed marketing materials, the message is consistent:

"Hands off the wheel. Eyes on the road." "Adaptive cruise control". "Stay centered". "Lane change on demand". Note that none of these promise that the computer takes over everything. The closest they come is "the first true hands-free driving-assistance", and that word assistance is absolutely key in framing this as not a replacement for all driving.

https://www.cadillac.com/world-of-cadillac/innovation/super-...


The result of "autopilot" is entirely predictable. The NTSB, NHTSA, CSB, and pretty much every other safety board have repeatedly found that any system requiring a person to pay complete attention while not actually doing anything will fail at exactly that: people stop paying attention. It is wholly unreasonable to expect a person "in control" of a partially self-driving car to respond instantaneously; it isn't even reasonable to expect 90% attention to what is going on around the car.

Add to that Tesla's insistence on intentionally mislabeling things (autopilot has a clear implication, especially given their demos of it, and the "full self driving" mode is outright false labeling) and I think Tesla does share liability.

The current model of "self driving", with the expectation that a driver will be able to immediately respond to anything the driving system misses, is inherently unsafe.

