
> it can also end your life or another driver's or pedestrian's life; slamming the brakes on is how a tesla model s caused an 8-car pileup on the bay bridge a couple of months ago

Come on... I thought I was on a scientifically-minded website. That's not how things work. It's like saying seatbelts are bad because they sometimes trap people in burning cars. Yeah, that's true, but it happens orders of magnitude less often than crashing and splattering your brains across the highway because you weren't wearing one. That's the point x)




> People will die because of this.

And not just drivers of Teslas.


> TESLA CRASH ON SF’S BAY BRIDGE

> resulting in an eight-vehicle crash

I'm sorry, but this is an 8-vehicle pileup caused by people following too closely (or paying too little attention). At least where I'm from, if you run into the back of a car it's your fault: it means you followed too closely and failed to react to a situation in front of you.

Should the car have stopped there? No.

But is it that car's fault that others ran into the back of it? No.


> In a pileup like this it's basically never the fault of the front car, unless maybe if they are purposely causing the accident for insurance fraud or something.

Brake checking (which is what the Tesla did) definitely makes the front car the guilty party. It's usually done for insurance fraud; here it was presumably just the AI gone mad. But same result, and same guilt.


> Once again, I am speaking generally and not about this specific accident. We have no idea how any other person or car would perform in this exact situation and I don't see much value in speculating.

But we have an NHTSA complaint documenting 11 other accidents of this form: a stopped car in front of a Tesla that was provably on Autopilot, and the Tesla crashing into the stopped car (most commonly emergency vehicles like fire trucks or police cars).

And who knows how many other situations there were where a crash was about to happen but the driver took emergency action and saved the day.

-----

This accident, the one in the Twitter post, and the 11 other documented cases in the NHTSA investigation prove that there's a systemic problem with the technology.

There are probably other accidents, too, where the emergency braking should have worked but that aren't documented yet.

------

This is no longer a Russell's teapot scenario. We've got 12 examples (11 from the NHTSA complaint, plus one from this Twitter thread and video). We've got the examples, live on camera. We can say without a doubt that Teslas have difficulty in this case.


> There were two impacts somehow. It was a 4-car pileup and my car was destroyed.

Likely explanation for two impacts during a rear-end pileup (rough momentum numbers sketched after the steps):

1. Tesla rear-ends your car.

2. Another car rear-ends the Tesla, propelling it forward and hitting your car a second time.


> great marketing for Tesla

Yeah, until someone dies from this sort of reckless driving.


> At some point a vehicle is going to do something atrocious

Multiple people have already died[0]

[0] https://fortune.com/2022/10/19/tesla-cars-involved-in-10-of-...


> It doesn't even stop after hitting the jet. It just keeps going right through it.

I thought that was the worst part of it too, assuming there's nobody inside the car at the wheel. It keeps going, stops, then moves again and looks to go around whatever obstacle it thinks exists. What if this situation were replicated on an empty road where the Tesla runs over a human being? In the worst case, does the Tesla keep trying to drive over the person, turning a minor injury into a fatality?


> After that he had a range of little annoying problems, many of which were likely software bugs, but then he (actually his GF) experienced something in autopilot mode known on the Tesla forums as "shadow braking" -- the car braking hard on the highway because it seems to see the shadow of an overpass and think it's a wall. This is extremely dangerous. It was on the I-405 (SoCal) going 70-80mph, and such a thing could easily cause a rear end collision.

This is a known issue, and it isn't a software bug. It's Autopilot not yet having learned that the overpass isn't a tractor trailer or other large object crossing the path of the vehicle, which the camera alone can't discriminate. [1] The behavior was introduced after this fatality [2].

With that said, if that causes concern for the driver, the vehicle isn't for them.

[1] https://www.tesla.com/blog/upgrading-autopilot-seeing-world-...

[2] https://electrek.co/2016/07/01/understanding-fatal-tesla-acc...


> Basically if you hit a car from behind, it’s your fault, every time.

Incorrect. If you brake-check someone (which is what the Tesla did), the front car is at fault.


> Granted the accident would legally be the meat driven car's fault who bumps into the suddenly stopped Tesla.

Exactly. If you bump into the car in front because they stopped for a perceived hazard, it's your fault for tailgating, not their fault for perceiving a hazard.


> As noted in the article, the information being released includes blaming the victim and other PR spin. This doesn't serve the public or further highway safety; it furthers Tesla's commercial goals.

Saying clearly that abdicating your responsibility to drive the car can result (and has resulted) in deaths does serve public safety. It's critically important that people driving this section of road (and others like it) know about this as soon as possible.

Call it victim blaming if you want, and certainly there was some spin, but it is unfair to say that this has no safety element.


> The car did eventually detect the obstacle and brake prior to impact

The driver is claiming he braked, not the car: "Huang says he slammed on the brakes once he noticed the truck, however, it was too late to stop the nearly two-ton sedan traveling at reportedly 68 miles per hour."

https://www.thedrive.com/news/33789/autopilot-blamed-for-tes...


> clearly the cause of the crash here isn't the tesla, it is other cars not respecting minimum safety distances

Try parking on the highway and claiming the people who crash into you are at fault, and see how that plays in a court of law.


> , it's pretty clear the driver was at fault

It's pretty clear both that the driver is primarily at fault and that Tesla's poorly designed system (as it existed at the time) compounded the driver's error.

> is slowing down on a high speed road any less dangerous

Yes, slowing down gradually and pulling over is safer than careening down the road with zero input from the human driver who, as you admit, "needs to be in the loop".


> The driver had received several visual and one audible hands-on warning earlier in the drive and the driver’s hands were not detected on the wheel for six seconds prior to the collision.

This would be hilarious if it weren't so tragic. The driver failed to pay attention for six seconds; so did Tesla's Autopilot!


> So, while not ideal, some of the accident blame is on 7 cars following too closely.

According to the police report, the Tesla (its driver or its FSD feature) made "an unsafe lane change" immediately prior to slamming on the brakes.


> It seems the local government or highway agency also neglected their duty to maintain the highway safety barrier, a shockingly regular occurrence where I live as well. I’ve wondered how often someone is injured because they failed to repair a barrier for several months.

While the crash attenuators should exist and the various responsible authorities should maintain them appropriately, I find it frustrating that this is brought up in this conversation as if it's a significant factor. It might have saved this man's life, but this crash was sure to be incredibly violent with or without the barrier.

The existence of a crash attenuator could not and should not affect anyone's decision making that led to the car impacting the barrier. Not the driver, not Tesla, not autopilot.

I hope the NTSB comments on this and it leads to Caltrans doing a better job of replacing these quickly (if they haven't already committed to this in the aftermath of this incident), but I also hope that it has zero bearing on the rest of the report.


> the man died because of that issue

Tesla would argue that the proximate cause was that the driver was not paying attention.

> And now Tesla is saying that there is an issue, and that it's his fault for using the system.

Tesla is not saying that it's his fault for using the system. They are saying it is not Tesla's fault if the driver wasn't paying attention, despite being presented with repeated warnings closely preceding the impact.

