
Just like Tesla blames dead drivers for using "autopilot": "They should have kept their hands on the wheel and been paying attention." No, you can't have a copy of the data.



Like how they blamed the "gore point" death on the driver for not having his hands on the wheel?

The data showed that the "hands on the wheel" alert preceded the crash by 8 minutes.

Tesla will absolutely use any data you send them to attack you, especially if you have the bad taste to die in their car with autopilot active.


Tesla always blames the "driver" (and publicly, no less) when someone is killed by their "autopilot" (which is actually doing the driving, i.e. fulfilling the purpose indicated by its name, despite not being very good at it...) -- either directly, or by carefully crafted statements that are technically correct yet incredibly dishonest in the context of both the events and the surrounding text.

Why people still use their shit is beyond me. Why Tesla is still allowed to sell it under this marketing (the name they gave it, the mismatch between the stated restrictions and actual practice), I also can't understand: the authorities should regulate it more strictly.


It also lets them blame the driver for their failure. Tesla is more than happy to share the car's data logs to “prove” it wasn't the fault of Tesla... remember though, according to Tesla marketing the driver is only there because the lawyers say so....

A more nuanced dissection of whose fault it was or could be[0].

"Huang was apparently fooled many times by Autopilot. In fact, he reportedly experienced the exact same system failure that led to his fatal crash at the same location on at least seven occasions." ...........

"Huang knew that Autopilot could not be relied on in the circumstances where he was commuting along the 101 freeway in Mountainview, California. Yet he persisted in both using the system and ignoring the alerts that the system apparently gave him to put his hands on the wheel and take full control." ..........

"Elon Musk and Tesla should be held to account for the way they have rolled out and promoted Autopilot. But users like Walter Huang are probably not the poster children for this accountability."

[0]https://www.forbes.com/sites/samabuelsamid/2019/05/01/the-pr...


Tesla does not agree with the obvious, then. Drivers don't have access to all their data; Tesla releases data when it's convenient for them, like to show the driver was at fault.

So, it’s worth pointing out that Tesla already has a history of using telemetry against you, publicly, if you have the misfortune to die while using autopilot.

Sure, I can acknowledge that the software should have reacted here, but so should've the driver. The driver is responsible for the vehicle and Tesla are clear about this. I don't think it's valid to blame inattentiveness on Autopilot either.

When autopilot stuff hit the news years ago, Tesla would immediately shift blame and publicize the telemetry output of the person in question's car.

I don't think they should have said anything at all in this case, beyond their condolences.

1. Autopilot didn't cause the accident, but they're sort of making it sound like it did by bringing it up

2. If people are arguing that autopilot contributed by making the driver feel like they didn't need to pay as much attention, the only way to defend against that is to blame the driver - why would you do that?

3. It's sort of invading the privacy of someone who recently died by detailing the crash, making it about Tesla. I don't believe other car companies would issue a PR release if one of their cars got in a crash, even with accident avoidance features on?

It somehow manages to feel self serving and make Tesla look bad at the same time.


Has Tesla ever taken the blame for any incident caused by its autopilot?

Driver's fault 100%.

But Tesla is not blameless with their marketing: "Autopilot", "Full Self-Driving", blah blah, giving people a false sense of security. I can't think of a worse problem to try to solve with AI. GPT hallucinates and gives a wrong fact: no biggie, but annoying. Tesla FSD hallucinates and runs over a child: a biggie.


Now compare this with Tesla's PR statement which makes it sound as though they have proof that the guy who was fatally steered into a barrier by their Autopilot had taken his hands off the wheel. They resort to this kind of sleazy, dishonest misrepresentation every time their software screws up.

That article does not make it clear if there was actually no human in the driver's seat. It says two men died, and it says there was a person in the front passenger seat and a person in the back seat. That implies, by omission, that there was no person in the driver's seat. Was there a third person who did not die, a person sitting in the driver's seat?

Obviously if there was no human in the driver's seat, these guys were "looking for trouble". And failing to negotiate a curve at an excessively high speed isn't something the autopilot should be expected to deal with (unless it has precise location data and precise map data which informs the reasonable limits of performance... but I don't think it has these).

It's absurd if people want to blame Tesla for this.


Meanwhile, Tesla is busy putting out press releases saying "We believe the driver was inattentive, and the car was warning him to put his hands on the wheel."

I am utterly, completely lacking in surprise that they didn't provide the relevant context, "... fifteen minutes prior."

This just looks... really bad for Tesla. It's more important to them to protect their poor software than drivers.


And Tesla will happily release misinformation too.

After one of the recent fatalities, Tesla was more than happy to push out a press release based on telemetry, saying "Autopilot wasn't at fault, the driver was inattentive - the vehicle even told him to put his hands on the steering wheel!".

They somehow neglected to mention that the steering wheel alert was triggered, ONCE, and FOURTEEN MINUTES before the crash.

Misinformation is not a good thing. But let's not pretend that Tesla is some downtrodden underdog just trying to make our lives better.

Also, if you have an accident in your Tesla, you'll have a lot of fun trying to get any telemetry information from them, even if Tesla isn't a named party and you're just dealing with the other involved person. You'll need multiple subpoenas and expect them to resist releasing any data as "proprietary".

But should telemetry from your accident be spun (correctly or otherwise) into a Get Out of Jail Free card for Tesla, expect it to be released to the media without your consent or authorization (I'm sure it's buried in Section 48, Subsection 24, Paragraph 14c iii that you consent, but still).


Their statement blames:

- The driver

- The crushed barrier

- Statistics

But they do not take responsibility themselves. It's apparent and, in my opinion, shameful.

If Tesla's goal here is just to share the facts, why didn't they state what AutoPilot did? In addition to statements like "...the driver’s hands were not detected on the wheel for six seconds prior to the collision.", wouldn't it make sense to include a statement like "AutoPilot seems to have made a grave error here, and steered the car into the barrier"?

They never admit that their car didn't perform properly. This reads like a carefully crafted, legally approved statement - not the more human statements that we often see after tragedies. This absolutely lowers my opinion of Tesla by a decent amount.

"We feel very bad about what happened... and want to take responsibility and do what's right" - A quote from the Walmart CEO after a Walmart truck accident injured Tracy Morgan and killed another man. I wish Tesla could stand up and say the same in this case (if the evidence points that way, which it appears to).

https://www.cbsnews.com/video/walmart-ceo-on-tracy-morgan-ca...


Oh, you assume Tesla won't fight to the death to avoid releasing data recorder information. They will.

Unless it "absolves" Tesla.

Remember, this is the company that when someone died, put out actual PRESS RELEASES to say "Not the car's fault. In fact it warned him before the collision that he was inattentive."

They neglected to mention it triggered ONE steering wheel warning... FOURTEEN MINUTES before the collision.

Even your example is problematic. "FSD/AP isn't at fault/didn't cause the collision/near miss, because it wasn't engaged..."

... because the driver had to take emergency action to avert FSD/AP's behavior.

They got taken to task for that, when investigatory boards started asking "just how long before that collision was FSD/AP disengaged, and was it disengaged by hard manual braking, etc.?"


This is a ridiculous response. The known fatality from Tesla's Autopilot was caused by an existing sensor problem that has no remedy besides more sensors. The thing will essentially drive into overhanging white billboards and who knows what else. The fact that this isn't widely disclosed is criminally reprehensible.

> how was the driver not aware of that

Basically, blame anyone but Tesla, right?

