
> At some point there really needs to be a legal discussion about whether Tesla's argument that the driver must always be alert and responsible for the car is a realistic standard in which to hold people when semi-autonomous systems are engaged.

It seems completely predictable that it wouldn't be, especially when they call it "autopilot."




> With Tesla you are constantly paying attention, or should be to the same level that you are with any other car.

You should be paying attention, but you are not. Human attention is a difficult thing. From what I recall, while driving it takes about 7-14 seconds to regain situational awareness. This means that one second isn't enough.

> Like "a Prius driver drove off a bridge and is now landing on our hood." Of course the system will disable itself in that situation;

I'd expect that the system would slam on the brakes, not disengage to avoid liability. That's the point here: it's not about tech, it's about legality. That's the worst part; the entire system appears to be designed to stop Tesla from being taken to court.


> The level-2 driving that Tesla is pushing seems like a worst case scenario to me

What are you measuring? The current autopilot already appears to be materially safer, in certain circumstances, than human drivers [1]. It seems probable Level 2 systems will be better still.

A refrain I hear, and used to believe, is that machine accidents will cause public uproar in a way human-mediated accidents don't. Yet Tesla's autopilot accidents have produced no such reaction. Perhaps assumptions around public perceptions of technology need revisiting.

> Neither the driver nor the car manufacturer will have clear responsibility when there is an accident

This is not how courts work. The specific circumstances will be considered. Given the novelty of the situation, courts and prosecutors will likely pay extra attention to every detail.

[1] https://www.bloomberg.com/news/articles/2017-01-19/tesla-s-a...


>Could they just disable or scale back the autopilot?

The problem with this is that Tesla clearly believes that autopilot plus an attentive driver is safer than an attentive driver alone. I think that is most likely true.

It is possible that autopilot plus an inattentive driver is less safe than an attentive driver alone. I think this is plausible, but I don't think there is any real evidence to prove this one way or another.

The question then becomes: does Tesla have an obligation to save the people in the second group from themselves and, in turn, put the people in the first group in greater danger?


> I'm not opposed to Tesla's sale of such functionality, sell whatever you want, but I am opposed to the marketing material selling this in a way that contradicts the legal language required to protect Tesla...

First let me state that I agree with this 110%!

I'm not sure if this is what you are getting at, but I'm seeing a difference between the engineers' exact definition of what the system is, what it does, and how it can be properly marketed to convey that in the most accurate way. I'm also seeing the marketing team saying whatever it can, within its legal limits (I imagine), in order to attract potential customers to this state-of-the-art system and technology within an already state-of-the-art automobile.

If we both take these two statements verbatim at the same time, then which one wins out:

> Autopilot is not a fully-autonomous driving system. It's a tool to help attentive drivers avoid accidents that might have otherwise occurred. Just as with autopilots in aviation, while the tool does reduce workload, it's critical to always stay attentive. The car cannot drive itself. It can help, but you have to do your job.

and

> The system is designed to be able to conduct short and long distance trips with no action required by the person in the driver’s seat.

If that's the crux of the issue that goes to court, then who wins? The engineering, legal, or marketing department, or do they all lose because the continuous system warnings that Autopilot requires attentive driving were ignored, and a person who already knew about and had complained of the limits of that system decided to forgo all qualms about it and fully trust it this time around?

I feel like when I was first reading and discussing this topic I was way more in tune with the human aspect of the situation and story. I still feel a little peeved at myself for starting to evolve the way I'm thinking about this ordeal in a less human and more practical way.

If we allow innovation to be stifled for reasons such as these, will we ever see major growth in new technology sectors? That might be a little overblown, but does the fact that Tesla's additions to safety and standards have yielded a markedly lower accident and auto death rate mean nothing in context?

If Tesla is doing a generally good job, bringing up the averages on all sorts of safety standards while sprinting headlong towards even more marked improvements, are we suddenly supposed to forget everything we know about automobiles and auto accidents and deaths while examining individual cases?

Each human life is important. This man's death was needless, and I'm sure nobody at Tesla, or anywhere for that matter, is anything besides torn up about having some hand in it. While profit is definitely a motive, I think Tesla knows that to reach the profit they seek they have to create a superior product, and that includes superior features and superior safety standards. If Tesla is meeting and beating most of those goals and we have a situation such as this, why do I feel (and I could be way wrong here) that Tesla is being examined as if it were an auto manufacturer with a history of lemons, deadly rollover accidents, persistent problems, irate customers, or anything of the like?

For whatever reason it kind of reminds me of criminal vs. civil court cases. In a criminal case it's on the State or prosecution to prove their case; in a civil case it can feel like the burden is on the defense to prove their innocence. For some reason I feel like Tesla is in a criminal case but having to act like it's a civil case, where if they don't prove themselves they will lose out big.

To me it feels like the proof is there. The data is there. The facts are known. The fact that every Tesla driver using Autopilot at that precise location doesn't suffer the same fate points toward something else going on, but the driver's actions also don't seem to match up with what is known about him and the story being presented on the other side. It's really a hairy situation, and I feel like it warrants all sorts of tiptoeing around, but I also have the feeling that allowing that "feeling" aspect to dictate the arguments for either side of this case is just working backwards.

And for what it's worth I don't own a Tesla, I've never thought about purchasing one. I like the idea, my brother's friend has one and it's neat to zoom around in but I'm just trying to look at this objectively from all sides without pissing too many people off. Sorry if I did that to you, it wasn't my intent.


>perhaps something else should be done that will incentivize the driver to disengage Autopilot.

There's a very simple solution: Don't fucking include it in the car. Require the users to show up at the car dealership and sign a legal waiver to allow the activation of the new, insecure, and dangerous feature.

There are no cars on the road with airbags that sometimes work and sometimes cause 3rd degree burns. There are no seatbelts that only work if the road isn't shiny.

If you want to advertise the autopilot as a safety feature available to the general public, make sure it is one. If it's not, and you advertise it as such, you're legally liable for all the deaths your lie will cause.

Personally, I'm technically knowledgeable enough, and I know enough about Tesla's history, to be wary of their products and never trust them, but what if you're the average consumer? You would rightly expect any car automation system that comes pre-installed in your car and actually drives your god damn car to be safe enough to be there in the first place! Tesla is just taking advantage of the lack of legislation and standards on autonomous driving. They're selling a defective and dangerous product. They should be held liable for it.


> Tesla needs to take responsibility for negligently releasing a feature they knew was going to be abused.

I can release a car without any autonomous features; should I take responsibility for the people that I know will eventually use it in an inappropriate manner, such as while intoxicated, sleep deprived, on medication, or just without paying attention?

> Its shocking to me how they still have this feature enabled when it has proven to kill.

Driving cars has proven to kill, even when you follow all the directions correctly. Some things can't be avoided.

Nothing about this is any different from cruise control. In both cases you are supposed to still be paying attention and take over in situations the assist feature can't handle. Cruise control just handles less.

This will be a problem until we make cars completely autonomous, with enough safety assurance. Until then, if we don't want to allow assistance features this advanced, the action we need is not legal action against Tesla but laws that are very explicit about what is and is not allowed.


> Have you actually tried it or did you just read it off some anti-tesla rant on the Internet?

I have.

> They are failing to implement these features that Tesla's already have in great quality on the mass-market

> Go actually test a Tesla autopilot, on a long drive

You keep buying into Tesla's marketing.

Here's a riddle for you: every time a Tesla fails and kills a person, Tesla quickly blames the driver. They say that the driver has to always keep watch and grab the wheel at the first sign of trouble.

Here's an official statement by Tesla in such an accident: "We also ask our customers to exercise safe behavior when using our vehicles, including following the car’s instructions for remaining alert and present when using Autopilot and to be prepared to take control at all times" [1]

Why is that, do you think?

Other companies are simply more careful with human lives. They don't pretend it's a full autopilot. They don't pretend that the system can handle 100% of situations. They don't pretend that you can safely fall asleep behind the wheel and nothing will happen. They call it what it is and introduce the features gradually, because they are fully aware of the current capabilities of autonomous systems.

Tesla is fully aware of that as well, but their marketing is willfully engaged in deception tactics.

> What part of it is "unrealistic" when it already exists?

The part where it doesn't exist. In all cases it requires the driver to be fully engaged and alert at all times.

[1] https://abcnews.go.com/US/teslas-autopilot-blamed-driver-acc...


> At some point there really needs to be a legal discussion about whether Tesla's argument that the driver must always be alert and responsible for the car is a realistic standard

This should have been done before these things were ever allowed onto the roads.

Zuck is right. We naively trust the world at large to work for our benefit automatically. We truly are "dumb fucks".


> maybe Tesla gets a license that allows using Autopilot, but only on highways with a speed limit equal to or lower than 55 MPH, and only when it isn't raining or snowing.

This is in line with my thinking about the manufacturer taking responsibility. If they can guarantee that it is safe only in certain conditions, they can enable self-driving only in those conditions. If they can guarantee that it is safe everywhere, they can enable it everywhere.
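
To make that concrete, here's a rough sketch of what condition-gated enablement could look like, using the parent's 55 MPH / no-rain / no-snow example. This is a purely hypothetical illustration, not Tesla's actual logic; the function and parameter names are made up.

    # Hypothetical sketch: only allow engagement under conditions the
    # manufacturer is willing to certify (and take responsibility for).
    def autopilot_allowed(speed_limit_mph: float, raining: bool, snowing: bool) -> bool:
        # Conditions mirror the example above: limited-speed highways, clear
        # weather. A real system would need far more inputs than this.
        return speed_limit_mph <= 55 and not raining and not snowing

    # e.g. autopilot_allowed(55, raining=False, snowing=False) -> True
    #      autopilot_allowed(65, raining=False, snowing=False) -> False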

> It's really unappealing to sell a product where you're liable for what end-users do with it.

Yes, this is a serious problem that would need to be addressed before implementing a system like this. It would make third-party repairs and modifications difficult. (They would probably have to take responsibility or disable the self-driving features. Of course, maybe that is the right model: the repair shop takes responsibility, but if it can show the problem wasn't its fault, it can turn around and demand damages from the original manufacturer. Unfortunately, that generates a lot of annoying legal work.)

> On the tame side, what if they don't update it?

This would be for the manufacturer to decide. Maybe the car refuses to self-drive if not updated. Of course, the reasons the feature would be removed would need to be made clear to the consumer (or they could get a refund).

> I just wanted to bring up that that's probably the death knell for getting to own a self-driving car. Maybe it's still worthwhile.

I don't think it is the death knell, but it definitely further blurs the line of who really owns the car, and it will likely slow down the adoption of individually owned self-driving cars. That does have downsides, but to me they look better than letting this dangerous combination of an insufficiently "smart" vehicle and a distracted driver zoom around the roads.


>How do you propose any self driving system reach perfection without ever having a beta testing period?

A few thoughts here: (1) I didn't propose otherwise; I just pointed out that it's deficient (at least legally) to say "I can be negligent because others are negligent", or, to put it another way, "other people's negligence should put you on warning with regard to my negligence". (2) I don't think it's my burden to solve this problem. Regardless, even granting your point, it doesn't have to proceed the way it has with Tesla so far; it certainly hasn't with other similar companies making similar products.

>And why not give drivers responsibility for maintaining safety?

Because the manufacturer of a self-driving product should be primarily responsible for how it functions. How exactly is that controversial to you? To what extent the driver is jointly liable might be a question, but, given how Tesla's self-driving feature works, Tesla is clearly responsible for how it functions, including when it determines that the driver must take over. I've seen nothing indicating that this handover occurs in a way that always allows a driver to mount a safe response to whatever the self-driving couldn't handle. So, to be honest, your question seems a bit absurd to me.

>Shouldn’t they try to be safe, and carry responsibility when they are fully in control?

Yeah, but no one here is talking about when the driver is driving; we are talking about when the self-driving feature is driving. You are farcically insisting that the driver is driving even when self-driving is enabled. You can't have it both ways.


> he shouldn’t have been using autopilot on that known problematic strip of road.

Tesla doesn't set preconditions on where its autopilot may be used. How many Teslas discover a new 'problematic' strip of road each day? Is the first Tesla to drive down every new road supposed to become a crash-test dummy when it turns out the autopilot just can't handle the next bend?

It's absurd to think so. The product is not fit for purpose. I'm confident the law would agree.


> ... treated as fact by people and media

Sure, it is a fact that Elon Musk predicts they will have robotaxis in a year. It is not, however, a fact that Elon Musk advises drivers to drive carelessly.

> How can it do this but __also__ "require active driver supervision"? This statement implies ....

Yes it does. In a parking lot. At parking lot speed. While you are not in the car.

> and it __definitly__ implies the vehicle is autonomous.

If by autonomous you mean it can automatically come to you in a parking lot, then yes. If by autonomous you mean you can sleep behind the wheel while it is driving in the streets, then it __definitely__ does not imply that.

> But these are conflicting statements...

No they are not. One is about operation in a parking lot. One is about future functionality/value of the product. And one is about the current functionality of the product.

> people think that Tesla is fully autonomous and thus get into accidents like this

Citation needed. I'd say it is more likely that they are not using the car as intended, not that they mistakenly think their car is fully autonomous.

> ... covering their ass

Not sure what your point is here. So they are damned if they do, damned if they don't?

> but if people are still doing this then Tesla isn't doing a good job of making this clear

People don't always use things as designed. This is not exclusive to Tesla cars.

> They simply hide it inside the legal documents

Not true. It is on the first page when you search for Tesla FSD. And I am pretty sure the warning is shown to the users at different steps before AP can be activated.


> but we need to compare those against the cases where autopilot prevents accidents that are unlikely to be avoided by humans

Right, but do any examples of this exist, ever?

Your link is clearly not an example. Any aware driver would've done the same.

I'm sceptical there can ever be a scenario where Tesla's autopilot can outdo an aware, conscious driver.


> What is this evidence?

I think the onus should be on Tesla to prove that their testing and validation methodology is sufficient. Until and unless they have done so, Autopilot should be completely disabled.

I really don't get why the regulatory environment is so far behind here. None of these driver-assistance technologies (from any manufacturer, not just Tesla) should be legal to put in a car by default.


>They have a good solution to the "expecting the driver to watch and take over" problem

I've always thought the model itself represents a fundamental design flaw. These systems should either be fully autonomous and responsible, or engage only as active safety measures in accident avoidance.

Tesla has been particularly ambiguous and irresponsible in its messaging and positioning: naming the product "Autopilot" and really playing up its capabilities, then disclaiming responsibility in the event that the system fails.


>Let me put it very clearly for you: it is not acceptable, in any shape or manner, to expect your users to have to "save themselves" from your product killing them every 5 seconds. If your product requires that, it is a bad product.

Yes it is. Otherwise you're asking all car manufacturers to stop selling cars. "I have to constantly keep my car between the lines or I'll cause an accident" is an acceptable standard. That's how every car on the road works today. The liability is on the user to operate it properly.

You accuse the poster of fanboyism, but you don't realize how often Tesla tells the driver that they need to pay attention and keep their hands on the steering wheel while they're using autopilot. So maybe your lack of experience with the car leads you to those baseless arguments.

While I agree Tesla is deceptive with its wording and the way it sells its autopilot ("have the hardware needed for full self-driving capability"), it's not legally wrong. They have the hardware for it (according to them), but at the moment the software they have is a glorified lane assist. They specifically mention on that same page:

    > Every driver is responsible for remaining alert and active when using Autopilot, and must be prepared to take action at any time.

>The problem is that Tesla's taken systems intended as driving assists and branded them using terms like "Autopilot" that imply that a human doesn't need to be in the loop.

Even in aviation, 'autopilots' don't mean the pilot can leave the cockpit unmanned. Human pilots are needed at all times in case the autopilot disengages due to unexpected conditions.


> In both of these scenarios, until truly driverless cars are validated and approved by regulators, drivers are responsible for and must remain in control of their car at all times.

This makes it sound like it's just a problem with evil legislators not making it legal for Tesla drivers to drive without paying attention.

Is this the case?


> I agree with you, but Tesla are very clear that the person in the driver's seat is 100% responsible for the vehicle when Autopilot is active.

So when Tesla says[1]:

> The person in the driver’s seat is only there for legal reasons. He is not doing anything. The car is driving itself.

The "legal reason" for having a person in the driver seat is so that they can act as a liability shield for Tesla?

[1] https://www.tesla.com/videos/autopilot-self-driving-hardware...
