
The remaining 5% are almost all situations where I take over because I’m underestimating what the car is capable of.

For instance: I don’t like how close it drives to the curb. I don’t drive close to the curb (and bias towards the center of a road) because I only have two eyes, and can’t really see the curb as I’m moving. I have an intuitive sense of how close I am, but that’s it.

The car knows exactly how close I am. So it driving in the center of a lane, while it makes me nervous because it’s not how I drive, is perfectly acceptable.

What I wish Tesla would do: set up some closed courses for people to come and experience what FSD is actually capable of. It would help the current FSD users gain confidence in the system, and it would show it off to people who haven’t ever experienced it.

I seriously don’t believe that most of the people who are critical of this thing have spent much time with it. I’ll admit after the first day or so I was skeptical too, in the same way I was skeptical of lane keeping and proximity aware cruise control.

But at this point: I hate driving without these things. They are an almost indescribable upgrade to the driving experience for me.




Another option: Tesla FSD is just driver assist, and most people know that. Yet from time to time, people get into accidents either because they overestimate the abilities of FSD or because they don't react quickly enough when FSD puts them into an unsafe situation.

I am not sure I agree with your assessment. Last time I was in an FSD Tesla, it repeatedly tried to run red lights and didn't know how to merge lanes on a freeway...

I find that Model 3 FSD hugs the center line even on roads with a huge amount of lane space, to the point that some cars and trucks will honk at me because they think I’m about to cross over. It also gets confused constantly by badly marked roads and can’t drive for more than a mile or two on complex streets without making some critical error or behaving like a drunk driver. It’s interesting to drive if you watch it like a drivers ed teacher teaching a vision-impaired student’s first lesson. It is not yet an assistive technology (outside of highways) that anyone should be delegating any amount of their attention to, and I wouldn’t want to be walking on a road with people using it that way.

What you’re missing are other people’s experiences. I completely believe your experience. I also agree FSD is fucking amazing compared to anything we’ve seen before.

However, driving in Houston, my Tesla would seemingly get into an accident nearly every single trip if I didn’t do something to correct it. And at minimum it will do something stupid every single trip that causes a missed turn or exit, adding 5-10 minutes of driving. I would say every single encounter it has with pedestrians on streets (no sidewalks in many parts of Houston) has it wayyyy too close and going wayyyy too fast, making the pedestrian think I’m a huge careless asshole. There are tons of other anecdotes, like it almost driving one of the wheels into a ditch (there are tons of them in Houston). I would literally be hitting someone or something every single week if I blindly trusted FSD, and would waste an hour of pointless drive time every week too.


Same, I was quite happy with the adaptive cruise control on my Tesla Model 3. I decided to try enabling FSD beta, and it's basically a disaster. I've experienced it:

* Get confused about its lane position and swerve around in the lane

* Attempt to get into the left lane despite a right turn coming up in a few seconds

* Try to change lanes into a lane that doesn't exist, off the side of the road

Every month or two, I decide to give it another fair chance, and almost every time, it seems to want to do something stupid and/or dangerous.


The expectation is that FSD should not do stupid things that a normal alert human driver would not do like

1. drive straight into a concrete barrier https://youtu.be/Tr-oF7J0cBw?t=13

2. drive straight under a tractor trailer https://www.youtube.com/watch?v=9BgV-YnHZeE

And in this instance:

3. Come to a complete halt in a tunnel with busy high-speed traffic for no apparent reason.

And the excuse that the driver should have been ready to take over within a fraction of a second of the Tesla making a mistake while in FSD places absurd expectations on human reflexes and attention.


A lane assist that navigates complex junctions in cities, gets you from A to B, etc. It's a bit more than lane assist at this point. The main criticism of FSD is not that it isn't capable of doing these things. It obviously is at this point, as demoed in countless videos where people try it out. Instead it's that it's not doing these things safely enough to do so unattended. That's a valid criticism. It's mostly fine, but when it isn't, it's a problem. Either way, removing the requirement to have your hands on the wheel doesn't seem doable as long as that's true. And that's of course the promise that has yet to be delivered.

Worth noting that this is yet another anti-tesla article from Jalopnik. They've spread clickbait and misinformation on this topic before. I wouldn't take anything they say as established fact without independent confirmation. Authors there play very loose with the facts.


I bought FSD for my Model 3 and my mind is not blown. Amusingly, I prefer the known behavior/bugs of the pre-beta to the totally bananas mistakes that the beta makes by trying to be smart. And that’s a bad state for Tesla to be in, because other manufacturers are catching up.

Automatic lanekeeping on the highway and stoplight stop/go were groundbreaking when Tesla launched them, and these are the features that I get the most value from. The rest of FSD as currently demonstrated is parlor tricks.


tesla bad clickbait.

of course FSD does crazy shit right now; it's beta and it's learning! That's why you agree to not become complacent and have full control over steering at all times.

in my experience, it works well 95% of the time, but the 5% unpredictability factor is way too high, which is why I don't use it with passengers in the car unless asked to do so. more specifically, I might manually maneuver the car less than 2 miles out of a 20 mile trip with FSD beta enabled, which is an improvement over the 5 miles I would manually maneuver with FSD stable. but it's getting better, and it's definitely better than anything else you can use on public roads, so I'm sticking with it.


Around 285,000 people in the United States have FSD beta in their cars right now. The FSD take rate (FSD sold with a new Tesla) is around 19-20%. YouTube is full of videos showing FSD in action.

The software isn't perfect, nor is it finished. Drivers are required to keep their hands on the wheel and be ready to take over at any moment, and there is a monitoring system to enforce it. You can pick it apart and poke at the issues all you want, but Tesla is obviously far ahead of all of their competitors if you look at it from the perspective of a computer person rather than a car guy. The competition is still using LiDAR and pre-programmed maps which can't deal with real-world conditions, which is where Tesla was back 6-7 years ago.

I hope we can talk about this in the future to see how my prediction did. I'm @realtaraharris on Twitter (different handle because I began transitioning in October 2022).


Tesla's FSD beta should be thought of as if you are a driving instructor/parent for a beginning driver on a driving permit, but with a full duplicate set of controls, like in an airplane. Let the car drive, but at any moment be ready to take control or something bad might happen. We have millions of people training other people to drive every day, and they can't always take over right away when there is a problem. People running FSD beta are training Tesla's driving AI. They should take the task very seriously, but I think it's great they are allowed to do so. If it ever works well it will be great for safety and convenience.

Once FSD seems ready for level 4 or 5 automation, then I'm sure governments will have Tesla prove all sorts of things about how it functions and how good/safe it is before it is approved.


There are exactly 0 miles of FSD driven by the public without constant attention from a driver keeping their hand on the wheel. There are disengagements of FSD in virtually every trip, where the driver feels that they need to intervene or the car would do something wrong. The safety record of FSD is basically due to the human drivers, not the AI. Tesla themselves don't have any confidence that the car can drive without constant human attention.

This is in contrast to Waymo and Mercedes, where the car actually drives itself, using both vision and other sensors, and the driver can actually disengage.


If paying full attention to driving is so tiresome to you, please don't drive. My impression from this description is that FSD is getting good enough to lull drivers into not paying enough attention. There shouldn't be a product like that: either Tesla takes responsibility like Waymo, or it makes a clearly L2 system that is good at limited things. Blurring the line like this is going to get even more dangerous as FSD gets better.

FSD is really good -- but as with human drivers, it's not perfect (though much better than humans on average). If you pin the failure of a driver to oversee FSD on Tesla, Tesla will just be forced to cancel FSD. I'm OK with the bargain where they make FSD available subject to the understanding that I need to remain mindful of the system at all times, and that if I fail it's on me, not Tesla. The alternative, as we've seen with other manufacturers, is that they just won't have FSD. They'll call it that, but they'll make you keep your hand on the wheel and your eyes on the road -- which isn't FSD at all.

I genuinely fear that the US's culture of safety-ism, which informed much of the COVID response, will totally preclude development of awesome technologies that have very safe, but not perfect, records.


There's a lot of unwarranted drama around this. Reposting a blip from what I wrote below:

>>FSD controls speed, steering, braking, obstacle avoidance, traffic maneuvering, parking, and navigation in a manner that, yes, I would call "Full Self Driving." It is very far from perfect, and Tesla has certainly communicated it as being better than it really is. There's a high dose of caveat emptor, but this FSD does a lot more than any other car I've ever driven or owned. The fact that I will not let it attempt to navigate certain intersections, either out of embarrassment or risk of getting hit, does not diminish the fact that my car will give it the ol' college try.

I think it's reasonable, certainly not baffling, to call its performance "Full Self Driving." As noted below, I also think Tesla should be obligated to a generous refund/trial program, which to me absolves the ticky tackiness of "what the definition of FSD is"


The problem for me is it feels like the market (and Tesla to an extent) has jumped the gun.

Tesla hasn't demonstrated enough self-driving progress (and imo, no one has yet) to justify the car's design to me, nor its valuation as a company with a 54 billion dollar market cap.

It's one thing if FSD were just on the horizon, but there are an incredible number of very hard problems to solve before we get there. Yet Tesla is designing a car whose interior is only justified by FSD, and charging for FSD as a feature.


I own a Model 3 with FSD. One of the key realizations I had after a few months with it is that self-driving at the moment is like supervising a newbie teenage driver. You have to stay alert and watchful, but you don’t have direct control. When you are teaching a new driver, it’s worth it.

Honestly, it’s a lot less work to just do it yourself.

This is one of those engineering situations where the 0.0001% edge cases matter, and could lead to fatalities. I don’t know if any of the implementations are up to the mark. FSD isn’t.


The problem is that humans are bad at sitting there paying attention while they're not actively engaged, which is what's required to use FSD safely. Also, FSD has essentially no benefits if you use it safely. If you listen to someone talk about what they want FSD for, they'll probably tell you they want to be paying attention to something other than driving or that they want to sit in a more comfortable position where their feet aren't on the pedals and/or their hands aren't on the steering wheel.

The current Tesla FSD beta has no purpose other than as a tech demo, but people absolutely treat it as if it's ready to be your chauffeur while you take a nap in the back seat.


The bottom line is that FSD as sold is nowhere near ready for prime time, still. As in, you always have to supervise it.

Who wants to be their car’s supervisor?

For me, it's either fully drive and enjoy your driving ... or full FSD and relax ... if it's somewhere in between, per Tesla, it's pointless (and don't get me wrong, lane assist and cruise control are handy, but they've been around for eons).

