
Another option: Tesla FSD is just driver assist, and most people know that. Yet from time to time, people get into accidents either because they overestimate the abilities of FSD or because they don't react quickly enough when FSD puts them into an unsafe situation.



I'm pretty sure Teslas with FSD get in fewer accidents than the average driver. Mine tends to be frustratingly overcautious in most cases.

My college friend was fatally wounded in a Tesla. From time to time I go on Tesla's forum, and I'm always surprised to see the discussions there lean heavily toward "pilot error" on any Tesla self-driving related accident. For example, there are certain zones or specific types of roads where you should not enable FSD, and if the driver fails to remember that, it's pilot error. I'm always confused about the *F* in FSD.

There are all kinds of spots around the country where FSD will make mistakes with high likelihood. Tesla has already released stats plainly stating that while Autopilot or FSD is engaged, accidents are far less likely to happen than when it isn't.

It’s part of the contract for beta users that you have to pay full attention to the road ahead and be ready to react at all times if the system makes a mistake. In the case of the clip in OP, they deliberately didn’t react and should be kicked off the program by Tesla and fined by the police.


The expectation is that FSD should not do stupid things that a normal, alert human driver would not do, like:

1. drive straight into a concrete barrier https://youtu.be/Tr-oF7J0cBw?t=13

2. drive straight under a tractor trailer https://www.youtube.com/watch?v=9BgV-YnHZeE

And in this instance:

3. come to a complete halt in a tunnel with busy, high-speed traffic for no apparent reason.

And the excuse that the driver should have been ready to take over within fractions of a second of the Tesla making a mistake while in FSD places absurd expectations on human reflexes and attention.


FSD requires you to be 100% in control of the vehicle for the entire drive, to pay full attention to the driving, and yet to do nothing.

This is demonstrably not a task that anyone can do, let alone Joa[n] Average car driver. Highly trained pilots are not expected to do it, and that's with autopilots used in vehicles that give significant amounts of time to handle the autopilot going wrong. Seriously, when an autopilot fails in an aircraft at cruising altitude, pilots can have tens of seconds, or even minutes, to handle whatever has gone wrong. Tesla's FSD gives people a couple of seconds prior to impact.

That said, in countries other than the US, people can reliably use trains and buses, which also means they don't have to intervene in driving the vehicle.


If we're going to be pedantic, FSD stands for Full Self Driving, which isn't available in Teslas yet. Full Self Driving Beta, which requires drivers to agree that they will fully supervise the car, is what is available right now.

Help me understand something though, and this is an honest question, because this seems to come up a lot... Are you saying that because it's called FSD, the driver in these situations should be absolved of responsibility for the crash (assuming FSD was involved in this)?


I think of it this way: I am still piloting the vehicle, but FSD is executing the maneuvers. This means I am able to focus on the bigger picture (looking further ahead, paying attention to what is going on in all lanes, etc.).

I am a MUCH safer driver with autopilot/FSD than without.

So, to your devil's advocate point: I am sure I would blame myself if a family member died while driving my car, no matter what the car was. But FSD in my car has prevented accidents for me and my family.

Edited to add: I don't want to pretend it is without risks. My biggest fear with FSD/autopilot is that it requires a constant understanding of who is in control, and that is the thing I stress to my friends/family. Your #1 job, and the thing that can kill you, is knowing whether autopilot is engaged; thinking it is when it isn't can mean running a red light or driving off the road at an interstate exit.

Tesla gave me a loaner for a couple of days while my car was in the shop. The loaner did not have FSD, and there were a couple of times I expected it to stop and it didn't. It was mentally tough because it felt familiar, like my car, but it wasn't my car.


Tesla’s system does more driver assistance than other level 2 systems, but it is a level 2 system itself.

FSD is an egregious misnomer. It doesn’t take over driving responsibilities under any circumstance. “Full automation” under SAE J3016 is level 5.

The dangerous part is the false impressions of capability that they give to some of their drivers.


In SF, Waymo/Cruise/etc have safety drivers behind the wheel who are being paid to ensure the car doesn't crash. Tesla drivers, on the other hand, are often misled about the capabilities of "FSD" and may not be paying as close attention.

Yet they allow human beings on public roads, killing unwilling participants. There seems to be a two-tier system at play: unaided humans killing thousands upon thousands, to the extent that it's normal, versus humans using a tool incorrectly and killing... a handful?

My experience with FSD is that it's a terrible autonomous system, and anyone who uses it as such is a fool, and a fool with a car is dangerous no matter what. However, the combined safety of my driving awareness and skill and the car's is greater than mine alone. I've had it suddenly brake when a car I hadn't noticed was drifting into my lane; had it not, I would have been in an accident. Likewise, it has made mistakes and I took control.

I personally don't care if it's ever able to take me from point A to point B without my attention or assistance. I value its ability to navigate with my assistance, especially on long trips, reducing my overall fatigue and taking me through confusing sections of urban interstate without errors, where I always make a wrong turn. The fact that it's 360° aware and I'm not, and that it's indefatigable and I'm not, is valuable.

In the last year it's become remarkably more capable. I don't know if they can continue this rate of improvement, but if they can, it's about as good as I would expect from today's technology in a consumer car. That's a decent bar for me. I think it's also something valuable on the roads, as a driver assistance tool. The folks who turn it on and get in the back seat would do something just as boneheaded without FSD. Rather, I notice an enormous number of Teslas on the road not being driven by total idiots, and presumably quite a few using FSD without issue. And, as I assert above, I believe the joint probability of the aware driver with FSD having an accident is lower than either alone.
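
To put toy numbers on that (nothing below is real failure-rate data; the rates are invented just to show the shape of the argument):

    # Made-up failure rates, purely for illustration.
    p_driver_miss = 0.01  # attentive driver alone misses a given hazard
    p_fsd_miss = 0.05     # FSD alone misses the same hazard

    # If the two failure modes are independent, a crash requires
    # both to miss, so the joint miss probability is the product:
    p_both_miss = p_driver_miss * p_fsd_miss
    print(p_both_miss)  # 0.0005 -- lower than either alone

    # Caveat: independence is the load-bearing assumption. If the
    # system lulls the driver into inattention, p_driver_miss rises
    # and the joint probability climbs back toward p_fsd_miss.

That multiplication only holds while the driver actually stays engaged, which is the whole argument above.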

I don’t care what hyperbole a bipolar nut job spouts, but I do appreciate him setting an unreasonable goal and failing halfway there while the rest of the world seems content with stagnating. Tesla created the EV movement in the mainstream, SpaceX created the space revival we are experiencing.

Fwiw, I think the choking on billionaire boots comment is not a particularly high value contribution.


What you’re missing are other people’s experiences. I completely believe your experience. I also agree FSD is fucking amazing compared to anything we’ve seen before.

However, driving in Houston, my Tesla would seemingly get into an accident nearly every single time I drove if I didn't do something to correct it. And at minimum it will do something stupid every single trip that causes a missed turn or exit, adding 5-10 minutes of driving. Nearly every encounter it has with pedestrians on streets (no sidewalks in many parts of Houston) has it way too close and going way too fast, making the pedestrian think I'm a huge careless asshole. There are tons of other anecdotes, like it almost driving one of the wheels into a ditch (there are tons of them in Houston). I would literally be hitting someone or something every single week if I blindly trusted FSD, and would add an hour of pointless drive time every week too.


This is no better than cruise control. I can even argue that it's worse. With cruise control, you know that you have to keep your hands on the wheel and your eyes on the road, period. But with FSD, you begin to trust it, because most of the time it works without trouble. It's the unexpected failures that are dangerous, even deadly, as you are traveling at 60+ mph and are not ready to act.

Tesla needs to release safety data so we can better understand how useful it is. The fact that it has not speaks volumes as to its safety and usefulness.


It's not necessarily worse, since there is a person driving the car who can prevent the car from behaving badly. What's the safety difference between this and a regular cruise control, which will happily plow you into a wall or car if you don't intervene?

And, empirically, there's no evidence that these cars are less safe when driving this way. Tesla claims people driving with FSD enabled have 4x fewer accidents than those driving without it, and nobody has presented any data that disputes that.


The problem is that humans are bad at sitting there paying attention while they're not actively engaged, which is what's required to use FSD safely. Also, FSD has essentially no benefits if you use it safely. If you listen to someone talk about what they want FSD for, they'll probably tell you they want to be paying attention to something other than driving or that they want to sit in a more comfortable position where their feet aren't on the pedals and/or their hands aren't on the steering wheel.

The current Tesla FSD beta has no purpose other than as a tech demo, but people absolutely treat it as if it's ready to be your chauffeur while you take a nap in the back seat.


Well, there's your problem right there. Assuming that Tesla's "FSD" mode is actually self-driving, even with an alert driver, is a big and sometimes fatal mistake.

Just another complaint that Tesla's marketing has little relation to reality. Who knew?


FSD is really good -- but as with human drivers, it's not perfect (though much better than humans on average). If you pin the failure of a driver to oversee FSD on Tesla, Tesla will just be forced to cancel FSD. I'm OK with the bargain where they make FSD available subject to the understanding that I need to remain mindful of the system at all times, and that if I fail it's on me, not Tesla. The alternative, as we've seen with other manufacturers, is that they just won't have FSD. They'll call it that, but they'll make you keep your hand on the wheel and your eyes on the road -- which isn't FSD at all.

I genuinely fear that the US's culture of safety-ism, which informed much of the COVID response, will totally preclude development of awesome technologies that have very safe, but not perfect, records.


That cars are capable of crashing was known before FSD was even a thing. Statistically, you should be much more afraid of cars driven by human drivers, because they can do everything FSD can, and there are truly not many on the road using FSD.

I don't believe one accident is too many. I made my statement based on videos I've seen of people having to disengage their beta FSD in circumstances where a human driver would have no trouble.

Now, maybe the data says otherwise. If that is the case, then great! Let's roll out more of the FSD beta. But for that data to be valid, you have to account for the fact that Tesla filters bad drivers out of the pool of FSD users. And as I understand it, there is no public data about the risk profiles of the people Tesla lets use FSD.
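
To illustrate why that filtering matters, here's a minimal Python sketch with invented numbers (the driver-skill distribution and the 0.005 cutoff are assumptions, not Tesla's actual criteria):

    import random

    random.seed(0)

    # Invented per-trip crash probabilities, illustration only:
    # driver skill varies widely across the population.
    population = [random.uniform(0.001, 0.02) for _ in range(10_000)]

    # Selection effect: suppose only the safest drivers are admitted
    # to the beta (the 0.005 cutoff is made up; Tesla's actual
    # safety-score gating is opaque).
    beta_pool = [p for p in population if p < 0.005]

    def mean_crash_rate(pool):
        # Expected crashes per trip, averaged over the pool.
        return sum(pool) / len(pool)

    # The beta pool looks several times safer even though "FSD"
    # did nothing here: the comparison is confounded by who got in.
    print(f"all drivers: {mean_crash_rate(population):.4f}")  # ~0.0105
    print(f"beta pool:   {mean_crash_rate(beta_pool):.4f}")   # ~0.0030

A lower accident rate in the beta pool, on its own, doesn't establish that FSD made anyone safer; it may just reflect who was allowed in.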


How would the FSD beta program kill me?

I realize that not everybody is in agreement, but I personally use the FSD beta while remaining fully in control of the vehicle. I steer with it, I watch the road conditions, I check my blind spots when it is changing lanes, I hit the accelerator if it is slowing unexpectedly, I hit the brakes if it is not...

You know, basically behaving exactly as the terms you have to agree to in order to use the FSD beta say you are going to behave.

When I look at the wreck in the tunnel (in San Francisco?) a few months ago, my first thought is: how did the driver allow that car to come to a full stop in that situation? Seriously, you are on a highway and your car pulls over halfway out of the lane and gradually slows to a complete stop. Even if you were on your phone, you'd feel the unexpected deceleration, and a quick glance would show that there was no obstruction as the car slowed further to a complete stop.

FSD is terrible in many situations, that is absolutely true. But, knowing the limitations, it can also provide some great advantages. In heavy interstate traffic, for example, I'll usually enable it and then tap the signal to have it do a lane change: I'll check my blind spots and mirrors, look for closing traffic behind me, but it's very nice to have the car double checking that nobody is in my blind spot. There are many situations where, knowing the limitations, the Tesla system can help.
