The headline alone should show what is odd about this case.
* Tesla clearly states that Autopilot is a driving aid and should not be used without supervision.
* The engineer complained about Autopilot not working properly at the very place of the crash. (Quite understandable when you see the state of the road and the mostly absent lane markings in videos of the crash site.)
* Evidence seems to show that prior to the crash he did not pay attention while driving on autopilot at this very part of the road.
So I really don't get why anyone would trust their life to a driving assist without paying attention and closely supervising the system.
Where I think the lawsuit has merit and a good chance of succeeding is the state of the road and its maintenance by Caltrans. In my eyes they are liable both for the deadly outcome of the crash and for the fact that there was a crash at all:
* Left exits are a safety issue in themselves and should not exist (they are extremely rare here in Europe).
* The lane markings were almost non-existent. Potentially dangerous sections of road should have very clear lane markings.
* As a consequence, it seems to be common for cars to crash into this divider. The fact that a "reusable" crash attenuator was installed there should make the dangers of this section clear.
* The crash attenuator was not functional, which led to the deadly nature of the crash: a car had crashed into it a week before, and no other protections (traffic cones, barrels) were in place.
So this section has shown itself to be very dangerous: human drivers regularly crash into the divider, there were no traffic cones (or they had been destroyed by people nearly crashing into the divider), and the spot was not secured after the last crash. The exit and the left lane should have been closed to traffic as long as the crash barrier was not in place.
The lesson here is that Tesla does not take complaints seriously, and someone died as a result. It sounds just as likely to me that another person using Autopilot would have had the same issue at that location.
This isn't great news for Tesla. This 30-second video also recently surfaced, in which someone reproduced this week's fatal Autopilot accident:
This could also be bad for Tesla if this is what occurred. Interestingly, in this video the chevron lines exist, something people have been claiming would have averted the accident. It's also interesting that no warning alarm sounded at any stage.
Tesla probably shouldn't be saying anything about this at all, even just to avoid giving it more news cycles. But if they were going to say something, here's what they should have said the first time.
----
We take great care in building our cars to save lives. Forty thousand Americans die on the roads each year. That's a statistic. But even a single death of a Tesla driver or passenger is a tragedy. This has affected everyone on our team deeply, and our hearts go out to the family and friends of Walter Huang.
We've recovered data that indicates Autopilot was engaged at the time of the accident. The vehicle drove straight into the barrier. In the five seconds leading up to the crash, neither Autopilot nor the driver took any evasive action.
Our engineers are investigating why the car failed to detect or avoid the obstacle. Any lessons we can take from this tragedy will be deployed across our entire fleet of vehicles. Saving other lives is the best we can hope to take away from an event like this.
In that same spirit, we would like to remind all Tesla drivers that Autopilot is not a fully-autonomous driving system. It's a tool to help attentive drivers avoid accidents that might have otherwise occurred. Just as with autopilots in aviation, while the tool does reduce workload, it's critical to always stay attentive. The car cannot drive itself. It can help, but you have to do your job.
We do realize, however, that a system like Autopilot can lure people into a false sense of security. That's one reason we are hard at work on the problem of fully autonomous driving. It will take a few years, but we look forward to some day making accidents like this a part of history.
Short version: due to poor lane markings, Autopilot made the same mistake as many humans in the same situation and collided with the divider. Due to the frequency of this kind of accident, the crash attenuator had been collapsed and not reset, meaning the Tesla hit the concrete divider at full speed, as has happened in the past with humans in control.
But please continue to blame Autopilot for not being smarter than the human operating the vehicle.
Because we don't know if it did. All we know is that a Tesla was involved in a fatal crash with a person changing their tire on the Long Island Expressway. [1]
There's been no confirmation whether Autopilot was enabled.
I couldn't find any other details about the accident, but it seems from the description like it was a side collision at an intersection.
In order for that to happen, it's likely that one of the vehicles had to violate a stop sign or red light.
Note that Autopilot isn't aware of these things (it won't stop unless a car in front stops). If the Tesla was the vehicle at fault, then most likely the driver was unaware of this limitation (or complacent, thinking there wouldn't be any traffic).
Well, I found Tesla's comment odd as well.
But it is also odd that the driver kept driving the same road WITH Autopilot activated, even though he KNEW it was broken there.
It's a strange accident, especially because of Tesla's strange behavior and the whole background story (the broken safety barrier).
I'm looking forward to the outcome of the whole story.
In the end, I guess more than one party could have been a bad actor: Tesla, the driver himself, and even whoever is responsible for the road having no kind of warning (the safety barrier was not replaced, nor was there any other sign indicating it would be replaced soon).
Is this the same one from the other day? As I understood it, there wasn't any talk of Autopilot being involved beyond the obvious involvement of a Tesla. At present, is there anything beyond the brand that distinguishes this crash from someone crashing a Ford or some such?
That particular spot had been crashed into before and the crash attenuator hadn't been properly replaced. This led to a normal crash turning into a fatality.
The driver of the Tesla had complained in the past about the Tesla swerving toward the wall when passing that spot, yet he still had his guard down on the day he died.
That the Tesla swerved comes down to the poor striping on the road. The stripe along the right side of the left fork was brand new and very sharp, while the stripe along the left side of the right fork was extremely dim to non-existent due to excessive wear sustained during construction of the left fork and the bridge it leads to (the connection from the fast lane of 101 south to the fast lane of 85 south).
The crash barrier has since been properly replaced and the striping redone with diagonal lines hashing off the gore, so it's abundantly clear, even to a camera, not to drive there.
That said, the Tesla's self-driving made a very dumb error that no human driver would make.
> [...] where a Tesla crashed into a concrete barrier[1] [...]
Of course it should never have happened. Still, I'm impressed by how gracefully it handled the accident: emergency lights, keeping the lane, and slowly braking the vehicle to a halt (again, such a frontal collision was still unacceptable).
I honestly worried about that when writing the comment. The car should have avoided it (based on Tesla’s usual claims).
But Tesla and everyone else says the owner is still ultimately responsible because Autopilot isn't a 100% reliable system.
If my car did something funky like randomly accelerate or turn hard at a given intersection and the manufacturer refused to fix it I’d stop driving through there.
Why push your luck?
I truly don’t like blaming the victim. I’ve been very critical of Tesla lately for their claims and safety issues.
But I don’t understand this man’s decision at all. It doesn’t seem reasonable.
I wish I knew why he kept using the system in that area.
The ONLY idea I can think of is that whenever the car got an update, he would try again to see if it was fixed, and one of those tries was sadly his last.
I'm not convinced we should entirely blame Tesla, given that humans have been crashing into the same barrier:
> In the three years before the Tesla crash, the device was struck at least five times, including one crash that resulted in fatalities. A car struck it again on May 20, 2018, about two months after the Tesla crash, the NTSB said.
The article also says the engineer had complained about his Tesla veering toward this particular barrier. I don't understand why he still relied on Autopilot while driving past it.
Came here to say this. It's unfair to start talking about Autopilot mishaps in this case, and what this family is saying, etc., when there is NO evidence yet as to whether Autopilot was engaged.
Furthermore, if the barrier had already been hit by another car (which is why it was collapsed prior to the Tesla crash), this may simply be an area where the road is not designed very well and causes many human errors. This happens all the time. In fact, road design is a significant cause of vehicle crashes.
It was a left-hand exit (why are those even a thing?).
The separator did not have its collapsible crash protectors in place because a car had previously crashed into them; I wonder why.
No one ever dies in car accidents.
Hence, Tesla is super bad for marketing its autopilot according to HN.