Tesla Could Be Fined Daily over Autopilot’s Nag-Free ‘Elon Mode’ (www.thedrive.com)
39 points by sschueller | 2023-08-30 | 30 comments




NHTSA is asking Tesla to dutifully report all cases where the illegal function was enabled, where FSD failed, or where misbehaving car systems could have contributed to a crash. But can they trust Tesla to provide accurate data? Same with accident investigation - Tesla likely has much more data about the events leading up to an accident than investigators do, but would they provide all of that data if it suggested Tesla's responsibility for the accident? It looks like we have no choice but to trust Tesla, but should they be trusted in such matters?

Not providing all data requested would turn a civil matter into a criminal matter. "False statement to a Federal Investigator", etc.

I'm not sure how much of a deterrent that actually is. Musk hasn't seemed to be bothered too much by violating the law when he felt it served him before.

He did end up owning Twitter the one time he tried it.

Musk aside, there's a strong incentive to withhold information that would be self-incriminating for Tesla (or any other company in such a situation). Same with accidents - wouldn't the information mysteriously disappear if it suggested the car manufacturer's fault? It would be easy to claim the data was unrecoverable in such a case, or that some parts are missing.

Hasn't Tesla falsified several things, such as goodwill repairs vs. warranty repairs, and I'm sure countless others?

> or if car systems misbehaving could contribute to a crash

As worded, isn't this every disengagement, since not disengaging could have, maybe, resulted in a crash?


Ford's BlueCruise lets you take your hands off the steering wheel. It only works on highway sections that have been pre-evaluated, but that now covers almost the entire US Interstate system. Mercedes Drive Pilot also allows hands-off operation, but only at lower speeds. It's more for traffic jams.

Tesla doesn't have that level of technology.


Generally, the systems that allow this have much more advanced driver-attention tracking, usually using infrared projectors and cameras aimed directly at the driver's face.

Tesla is attempting to add visual driver monitoring using the interior security camera near the rear-view mirror, but it doesn't have the same level of fidelity, and they still require both camera attention and steering-wheel torque for normal customers using their assist systems.
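
To make that dual-signal requirement concrete, here's a minimal sketch of how an assist feature could gate itself on both a cabin-camera gaze estimate and steering-wheel torque. The class names, thresholds, and escalation policy are invented for illustration - this is not Tesla's (or anyone's) actual implementation.

    # Hypothetical sketch: combine two independent attention signals before
    # allowing (or continuing) hands-free assist. All values are made up.
    from dataclasses import dataclass
    import time

    @dataclass
    class AttentionSample:
        gaze_on_road: bool      # from the cabin camera's gaze / head-pose estimate
        wheel_torque_nm: float  # torque applied by the driver, in newton-metres
        timestamp: float        # seconds, monotonic clock

    class AttentionMonitor:
        GAZE_TIMEOUT_S = 3.0       # how long the driver may look away
        TORQUE_THRESHOLD_NM = 0.2  # minimum torque counted as "hands on wheel"
        TORQUE_TIMEOUT_S = 30.0    # how long without torque before escalating

        def __init__(self) -> None:
            now = time.monotonic()
            self._last_gaze_ok = now
            self._last_torque_ok = now

        def update(self, s: AttentionSample) -> str:
            if s.gaze_on_road:
                self._last_gaze_ok = s.timestamp
            if s.wheel_torque_nm >= self.TORQUE_THRESHOLD_NM:
                self._last_torque_ok = s.timestamp

            gaze_stale = s.timestamp - self._last_gaze_ok > self.GAZE_TIMEOUT_S
            torque_stale = s.timestamp - self._last_torque_ok > self.TORQUE_TIMEOUT_S

            if gaze_stale and torque_stale:
                return "disengage"  # both signals lost: hand control back to the driver
            if gaze_stale or torque_stale:
                return "nag"        # one signal lost: visual/audible warning
            return "ok"

The point of requiring both signals is that either one alone is easy to defeat or to lose briefly; the escalation order (nag, then disengage) is the part regulators seem to care about.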


> rejected a camera-based monitoring system for Tesla vehicles for being "ineffective,"

This is a strange take while otherwise claiming cameras are all you need for fully safe driving. Is driver monitoring really that difficult, or is this just Musk's indictment of Tesla's engineering?


I would assume they'd consider it an ineffective use of effort unless required. Any monitoring system would cost money to implement and would very likely reduce sales if it weren't entirely customer-controlled.

It's mostly bullshit. Other technologies like radar, LIDAR (UV or IR), and FLIR can provide data points unavailable in the visual spectrum. Seeing through fog, dust, and darkness is beyond human capability and would add significantly to safety. Increasing processing speed and gain in the visual spectrum would also add to safety.

Also, I'm curious what current FSD cameras do around sunset. Are they blinded? Can they vary aperture?


I think the discussion above is about using cameras to monitor the driver as a means of replacing the torque check on the wheel, not about technologies for sensing outside the car.

> what current FSD cameras do around sunset

They have impressive dynamic range, but can be blinded. I wonder how much of this is related to camera window cleanliness though.

https://www.reddit.com/r/teslamotors/comments/i5usdv/fsd_low...

https://www.reddit.com/r/teslamotors/comments/5zxnoj/autopil...
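
For what it's worth, one crude way a system could detect that it's blinded - purely as an illustration, not a claim about Tesla's actual logic - is to check how much of the frame is saturated and hand control back to the driver when too much of it is.

    # Illustrative only: flag a camera as "blinded" when a large fraction of
    # an 8-bit grayscale frame is blown out (e.g. driving into a low sun).
    # The threshold values are assumptions, not anything Tesla publishes.
    import numpy as np

    def fraction_saturated(frame: np.ndarray, threshold: int = 250) -> float:
        # Fraction of pixels at or above `threshold`.
        return float(np.mean(frame >= threshold))

    def camera_blinded(frame: np.ndarray, max_saturated: float = 0.25) -> bool:
        # If a quarter of the image is saturated, treat the camera as unreliable.
        return fraction_saturated(frame) > max_saturated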


Yes, and are there warnings about this?

Perhaps areas with cameras should have special coatings and/or dedicated wiper devices.


> Yes, and are there warnings about this?

Yes, of course. Autopilot will disable itself.

Here's how race cars do it: a reel of plastic film, so there's always a clean view. A broken camera, showing the reels: https://imgur.com/a/Hzxg7pt


Yep. And boats and CNC machines do it with spinning circular windows that use centrifugal force instead of a physical cleaning element.

> Also, I'm curious what current FSD cameras do around sunset. Are they blinded?

Yes. If you drive on the 105 Freeway in LA in the Hawthorne area (heading east in the morning or west in the evening, into the sun), you'll notice a lot of Teslas driving erratically, like teenagers learning to brake for the first time. It's not so noticeable in rush-hour traffic, but it's very noticeable when traffic isn't heavy. My coworkers who drive Teslas no longer use AP or FSD because they're terrified of it.


Just slap a big disclaimer on entering "Elon Mode" that no further warnings about attention will be displayed. Surely people drove vehicles with cruise control before there were "Don't nap on the way to work in the back seat like an asshole idiot" nags.

> Just slap a big disclaimer on entering "Elon Mode"

It's not accessible or supported. Where would this disclaimer be shown?


The mode they're asking Tesla to remove isn't included in production cars. It's a development flag found by an outside hacker. This is giving me bad flashbacks to the Grand Theft Auto Hot Coffee incident, where Rockstar had the game's rating changed and fines issued over something not accessible in the game.


A development flag...that was found in a production car. That it requires developer access to enable is irrelevant, since it was clearly enabled in a production car traversing public streets in the video Musk shared this past weekend.

Given that the whole point of the video was to demo pre-production software, it obviously wasn't a production car. It might have been production hardware, but if it's running pre-production software, it's not a production car.

I'm generally happy with Autopilot in my Tesla but it annoys me that I have to keep tugging at the wheel. Is it not safe to sit with your hands in your lap, ready to grab the wheel, while attentively monitoring traffic?

It feels like a hypothetical requirement on old-school cruise control to keep your foot lightly on the brake, in case you need to stop. But obviously no cars worked like that. You just set cruise control, kept your foot comfortably on the floor next to the pedals, and kept an eye out.


Definitely don't do this, but you could, hypothetically, jam a half-full water bottle into the steering wheel. I suppose the movement of the water sloshing would be enough to indicate that a hand is on the wheel.

For context,

'Tesla hacker discovers secret ‘Elon Mode’ for hands-free Full Self-Driving'

https://news.ycombinator.com/item?id=36412170


Context: 'Tesla hacker discovers secret ‘Elon Mode’ for hands-free Full Self-Driving'

https://news.ycombinator.com/item?id=36412170


[flagged]
