Hacker News

No, but they actually should. And every company either already has an equivalent or is close to one, so it makes sense to train drivers on what will soon be ubiquitous car software. Remember, the goal here is to make driving safer, and there's no evidence Tesla's system isn't doing that. People crash and die without Autopilot every single day.



They do.

Autopilot is standard in every single car Tesla sells now, for exactly this reason.

Here is the blog post on their website where they talk about making it standard: https://www.tesla.com/blog/update-our-vehicle-lineup


So what you mean is that Tesla should be using trained safety drivers like real AV companies do? Also, to be clear, you are either lying or misinformed about the behavior of Autopilot as far as detecting engaged driving goes.

They do for legal reasons. Are you saying Autopilot is no better than the adaptive cruise control and lane assist found on many cars? That's not the impression they want you to have at https://www.tesla.com/autopilot ...

Doesn't Tesla already have their autopilot tech released?

Tesla's Autopilot system is no different from a self-driving car with a safety driver and should be subject to the same regulation. In a self-driving car (Waymo, Cruise, etc.), the driver turns on autonomous mode when the driver thinks it's safe to do so. All drivers are thoroughly trained for the safety of themselves and other people on the road. Companies must obtain permits to have cars on the road and report annually on the safety of the system. I'm not comfortable driving on the road knowing Tesla's drivers are not trained and the company is not subject to the same laws as self-driving companies, even though its drivers can turn on Autopilot ANYTIME they want, and the car can do WHATEVER.

Tesla’s Autopilot is very comparable to autopilot in aircraft (read a bit before downvoting). It is something you engage to take load off the vehicle operator, but it does not eliminate the need for the operator.

Autopilot (both in aircraft and in the Tesla) is really good at what it does. But if you start thinking about it as anything other than something which helps reduce the vehicle operator's workload, it is dangerous.

Pilots understand this and get pretty extensive training. Some Tesla owners clearly do not. Some of the blame lies with Tesla due to marketing. Perhaps these hybrid machine/human driving systems should require training.


Yes, but what's worrying is their response. Instead of taking responsibility for the fail-unsafe UI design and their failure to avoid obstacles above a certain height, they are completely blaming the user.

I can't see myself trusting their cruise control or autopilot features. It's like they're saying, "Hey, our cars can drive themselves! But not really. They can maintain speed and lanes and avoid other cars on the highway! But you can't take your hands off the wheel. You can relax and let the car steer for you! But you'd better be ready to take control in a millisecond if it beeps at you. Oh, and if anything bad happens, it's your fault for using it wrong. Thanks for participating in the Tesla Autopilot Early Access Beta Program!"

No thanks.


But we already let learner drivers try things out on the road, and their supervisors can’t even take control. Human drivers crash and make mistakes too… where is the uproar for allowing them to drive?

Whereas Tesla only allows fully licensed drivers, and until recently only pretty good ones at that.

I think you’re just regurgitating the standard anti-Tesla talking points… and you avoided the questions about the fact that Mercedes hasn’t tested this on actual streets and has just rolled it out. While it’s admitting liability, isn’t that just as bad as what Tesla is doing? Worse, in fact, as there’s no bar for getting this feature, and it doesn’t get shut off for misuse (which FSD beta does).

Given that all the approaches with a hope of working use AI, how do you get the data to train the AI? This isn’t a Tesla problem, this is an AI training problem. Tesla already makes heavy use of synthetic data, and uses real-world human drivers to collect training material. How would you do it?
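One concrete reading of "using real-world human drivers to collect training material" is mining driver-override (disengagement) events out of telemetry as the hard examples worth labeling. This is a hedged sketch, not Tesla's actual pipeline; every field name and threshold below is invented for illustration:

```python
# Hypothetical sketch: find moments where a human driver overrode an
# engaged driver-assist system. Field names and the torque threshold
# are assumptions made up for this example.
from dataclasses import dataclass

@dataclass
class Frame:
    t: float              # timestamp in seconds
    autopilot_on: bool    # was the assist system engaged?
    driver_torque: float  # steering torque applied by the human (Nm)

def find_overrides(frames, torque_threshold=2.0):
    """Return timestamps where the driver took control back while the
    system was engaged -- candidate 'hard examples' for training."""
    events = []
    for prev, cur in zip(frames, frames[1:]):
        took_over = prev.autopilot_on and (
            not cur.autopilot_on or cur.driver_torque > torque_threshold
        )
        if took_over:
            events.append(cur.t)
    return events

log = [
    Frame(0.0, True, 0.1),
    Frame(0.1, True, 0.2),
    Frame(0.2, True, 3.5),   # driver grabs the wheel
    Frame(0.3, False, 1.0),  # system disengages
]
print(find_overrides(log))  # -> [0.2, 0.3]
```

The point of the sketch is only that override events are cheap to detect in logs yet directly mark situations where the model's behavior diverged from a human's judgment.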


I don't see why Tesla should do this when plenty of other safety critical software (cruise control, autopilot from actual planes, etc.) doesn't do this.

I also think public review of this kind of code is likely to generate a whole lot more noise and PR bullshit than any actual actionable feedback.


Incorrect. Autopilot offers by far the most important training data: driver overrides. That is the dangerous part of the learning and Tesla makes its customers do it.

I already put my kids in a software driven car every day. It has automatic cruise control, automatic lane keeping, and automatic emergency braking. I don't trust Tesla's Autopilot yet but as soon as there is a self-driving car which I believe is safer than a car driven by me, I will switch to that.

Not true. Tesla has working advanced driver-assistance systems (ADAS) in the form of their "Autopilot" feature. This is Level 2 (out of 5) on the autonomy scale. They are still very, very far from Level 5. Much more so than Google.

Furthermore, they were getting their software from MobilEye, an Israel-based startup, but cut ties with them last year and are developing their own Autopilot 2 program, which so far has not reached parity with MobilEye's Autopilot 1.

Still a long way to go and will be interesting to see whether they catch up to Google with all this crowdsourced data they collect.


Tesla is very clear about the limitations of Autopilot; they require the driver to keep a hand on the wheel for a reason.

Would it be better if the car drove perfectly in all situations? Obviously yes. Are some people willing to pay for imperfect software that when paired with a human makes driving safer? Also yes.


Tell Tesla marketing that. They have autopilot already!

Sure, but the important question is the capability of the Autopilot, not the Tesla driver.

Well, it's going to depend on the feature. Slightly improved UI on the radio's touchscreen is a bit different from Tesla's autopilot.

In the case of autopilot, I'd say similar to a drug study, as it's something that can cause serious harm or death to the human subjects involved. Tesla should have an IRB for ethical monitoring, there should be a small trial group initially (and very gradual rollouts to larger groups), participants should receive informed consent at a Tesla store with emphasis on the risks and what's necessary to use the feature safely, and someone should have the authority to pull the plug on the study if it proves too dangerous.

Given the state autopilot is reportedly in, I'd say right now no one should be using it without an in-person training at a Tesla store to emphasize the proper usage of it.


"Effective autopilot assistance features can plausibly get us to 10X improvements in safety and convenience."

So, similar to the autopilot feature that Tesla already has in their cars? Plus, it isn't going to improve safety until there is 100% adoption of this autopilot feature. Sure, your car might be driving more safely, but the other people who don't have these features will still be a danger to you.


So what you are saying is that Tesla should require certification with practice sessions with an instructor and a written test for all users of Autopilot? Or do you mean that Tesla should assign a controller to each car to keep it 1km away from other cars?

This reply ignores the very significant fact that Tesla's autopilot, and any other with similar capabilities, both enables and invites the driver to not pay attention to the extent that Tesla itself insists is necessary for the safe operation of the vehicle. If actually conforming to what Tesla says is necessary is too much of an annoyance, it is not ready to be on the road.
