This "driver aid" model itself is starting to sound like a problem to me. You either have safe, autonomous driving or you don't.
A model where the driver is assumed to disengage their attention, but is then expected to re-engage in a fraction of a second to respond to an anomalous event, is fundamentally flawed, I think. It's like asking a human to drive and not drive at the same time. Most driving laws assume a driver should be alert and at the wheel; this is what...? Assuming you're not alert and at the wheel?
As you're pointing out, this leads to a convenient out legally for the manufacturer, who can just say "you weren't using it correctly."
I fail to see the point of autopilot at all if you're supposed to be able to correct it at any instant in real-world driving conditions.
I really don't like Autopilot. It's good enough to make drivers trust it and stop paying attention, but it's not good enough to not kill people when they do that. And when there's an accident, Tesla comes out and says "the system warned the driver to put their hands on the wheel" or something similar. Unless a car can 100% self-drive, driver aids should require the driver to have hands on the wheel and be paying active attention at all times.
I think the main problem here is that the driver put too much trust in Autopilot. What she needs to do instead is treat it as just another driver-assistance feature and stay in control of her vehicle at all times.
The problem isn't "you don't have to pay attention to X", it's "you don't have to pay attention to anything during normal operation". It's human nature, if not outright physiology: it is incredibly difficult to keep someone engaged with, and paying attention to, a task they are not actually doing.
The problem with halfway solutions like Autopilot isn't that they control the car some of the time, it's that they can, essentially without warning, stop controlling it.
Self driving cars will happen. But the current model is essentially "do enough of the driving to maximize the likelihood that the human 'driver' is not going to pay attention, but don't do enough to make their lack of attention safe".
My statement isn't "cars allow you not to pay attention", it is "systems like autopilot actively encourage a lack of attention", that's backed up by the need for alert systems when they have obvious signs that the driver isn't paying attention, like "no longer holding the wheel". If the system didn't in general usage result in lack of attention then such a system would not be necessary.
That's some PR firm-level explaining. First, autopilot is not just lane assist, it's lane assist + adaptive cruise control + additional features (it can change lanes if you tell it to). Other OEMs have purposely shied away from enabling an autopilot-like feature set together, even though it's technically possible, because they are absolutely terrified of something exactly like this situation happening.
Second, you're completely ignoring that autopilot will very much entirely drive the car by itself for minutes at a time (reports are anywhere from 5 to 15 depending on software version), which will obviously encourage drivers to not pay attention to the road. I don't care about disclaimers, I don't care about nag screens or chimes, and I definitely don't care about some warranty text that flashes on the screen. If you make a car that can drive itself you encourage drivers to let it do just that and you should be prepared for that eventuality. Period.
At no point should the system have to "[tell] the driver to take over immediately", because the driver should always be in control. That is the point of the assist technology: to act as a backup for driver error. The driver is not there to act as a backup for autopilot errors.
Circumventing the attention monitoring systems isn’t the issue. Those only exist for the purpose of liability shifting, and serve no other purpose.
We _know_ from innumerable studies, and just direct experience in aviation that saying “this will be done automatically” but also requiring absolute attention is not something humans can do. The only people trying to claim to the contrary are the self driving car people, and only so that when their self driving system fails they can say “the person in the driver seat is the person actually driving and it’s their fault, not our faulty software”.
This is without considering their other failure mode: disabling without warning, often immediately prior to crashing. Autopilots in aircraft can do “something’s wrong I’m disengaging and warning the pilots” because any time the autopilot is engaged and the pilots are not actively engaged with flying the aircraft they’re at altitude and have significant amounts of time to react. The few events that do require immediate responses have extremely constrained responses: essentially push down or pull up (ground proximity, stall, or TCAS alerts) - none of which is the case for road vehicles.
The time on a road vehicle between something going wrong and impact is generally less than 3 seconds (I recall Teslas giving 2 seconds of notice). In that time the now-driver has to become aware of the change in state, develop situational awareness, and then start to react. Then the corrective action itself has to complete, which also takes time.
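Put roughly in numbers, the budget doesn't close. A back-of-envelope sketch (all the timing figures below are illustrative assumptions, not measured data):

```python
# Rough handoff-time budget for a highway takeover.
# Every figure here is an assumption for illustration only.
notice_s = 2.0     # warning-to-impact time reportedly given
perceive_s = 1.0   # assumed: register the alert and look up
orient_s = 1.5     # assumed: rebuild situational awareness
react_s = 0.75     # assumed: decide and begin the maneuver

needed_s = perceive_s + orient_s + react_s
print(f"time needed: {needed_s:.2f}s, time given: {notice_s:.2f}s")
print(f"shortfall: {needed_s - notice_s:.2f}s")
# With these assumptions the driver is ~1.25s short before the
# corrective maneuver has even begun, let alone completed.
```

Tweak the assumed components however you like; with any realistic values for perception and re-orientation, 2 seconds of notice is gone before the driver has regained awareness.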
IMO, if a self-driving system hands off control of the vehicle less than 10s before an accident it is responsible for, that accident is the fault of the manufacturer. Obviously a self-driving system can't be held responsible for an accident caused by another vehicle.
If the manufacturer's self-driving system is unable to recover from whatever is going on, it's reasonable for it to give up and offload to the human in the driver's seat, but it's not reasonable to then say that the human was responsible for an accident they were unable to recover from.
Again, all this nonsense about trying to avoid being in charge of the car is simply a symptom of the actual problem, which is that self-driving cars today are unsafe. It's obviously fairly awful of those people to knowingly circumvent the safety systems of something they know is already unsafe, but it's the manufacturer's fault for selling an unsafe product in the first place.
>>If the system detects that hands are not on, it provides visual and auditory alerts. This happened several times on Mr. Huang's drive that day.
If the system detects that hands are not on the wheel, perhaps something else should be done to incentivize the driver to disengage Autopilot. Perhaps send warning alerts to law enforcement, insurance companies, or emergency contacts? I'm not sure what the fallback procedure is when Autopilot detects a dangerous situation for an extended period of time; presumably disengaging completely would be worse, unless there's a safe way for Autopilot to pull over to the shoulder at the next available opportunity.
If you create a system that by design will cause the driver to lose focus, you are responsible for the driver losing focus.
Saying "we told the users that they must pay attention while they are not driving", or "we told them that the system called 'auto-pilot' is not actually able to autopilot the car", or "we advertised the car's self-driving ability but had documentation saying it doesn't actually have that ability" doesn't change that: you are responsible for drivers behaving exactly as you would predict.
There have been numerous studies showing that people stop paying attention when they are given menial or non-focus-dependent tasks. Saying that the only safe way to use Autopilot is by not being human is fundamentally the same as saying "the brakes in this car only work if you press them continuously for 5s or apply 300lbs of force".
Driver assistance that takes care of everything (i.e. steering, braking, navigating, parking), that at the time the accident occurred didn't even require hands on the steering wheel?
Seems like an attempt to shift the blame off the autopilot system and onto the driver.
This is why many are worried about Level 3 autonomy vehicles -- that is, driver assist where the car can drive ~95% of the time, but the driver may need to intervene at any moment. So far all the studies I've seen indicate that the driver loses attentiveness whether they mean to or not, and the disengagement times are in multiple seconds.
Personally, I think a system needs to work well enough to drive itself the whole trip, or be dumb enough to act as cruise control with some crash prevention built in, but otherwise force the driver to be on task.
Uh, really? You don't think a human driver could have taken any corrective action if they were paying attention to the road? The risk I see with these semi-autonomous systems is precisely that they invite the sort of inattention seen here. The brakes, at least, should have been applied.
You make good points. Although, disengaging when you release the wheel sounds like it could do as much harm as good. If anything, I'd rather it discourage this behavior using a nag alarm, possibly followed by automatically slowing/stopping the vehicle somewhat safely. As you say, people are stupid, and I can see some idiot trying to grab a beer out of a cooler in the back seat within the grace period (or deal with a dropped cell phone, pebble in a shoe, spider on the window, etc) and ending up careening off the road or into oncoming traffic.
I agree that this feature isn't useless. I'd love to have it, myself. As you say, it does need work to idiot-proof it, and possibly a re-branding.
But still, I would much, much rather see full autonomy, as soon as possible. I wouldn't want to see us stay bogged down in the halfway stage. And I'm a little concerned that the first fatal accident caused by inappropriate use of driver-assistance systems like this will unjustly poison minds against fully autonomous systems, and take us a huge step back.
> It's a tool to help attentive drivers avoid accidents that might have otherwise occurred.
This needs far more discussion. I just don't buy it. I don't believe that you can have a car engaged in auto-drive mode and remain attentive. I think our psychology won't allow it. When driving, I find that I must be engaged, and on long trips I don't even enable cruise control because taking the accelerator input away from me is enough to cause my mind to wander. If I'm not in control of the accelerator and steering while simultaneously focused on threats, including friendly officers attempting to remind me of the speed limit, I space out fairly quickly. In observing how others drive, I don't think I'm alone. It's part of our nature. So then, how is it that you can have a car driving for you while you simultaneously remain attentive? I believe the two are so mutually exclusive as to make it ridiculous to claim that such a thing is possible.
Telling people to remain alert will do no good because it is simply asking too much. A person is supposed to sit still, not driving but maintaining their full attention, while being ready to automatically take over in an emergency. This will lead to the same failure of vigilance over time we see in guard duty, and it really can't be helped. Task-unrelated thoughts crop up more frequently for simpler tasks; with few stimuli, few surprises, and little to attend to, the brain will begin experience replay (this is actually more a machine learning term, but it's appropriate here).
Driving already leads to mind wandering states; it is overwhelmingly likely that the passive aspect of autopilot will lead to mind wandering at even higher rates. Asking a less practiced driver to shift attention from internal to external states very quickly and then make complex judgments is simply not fair. A simple physics and statistics based model will be far more reliable. If it isn't, then Google's strategy of shooting straight to level 4 makes sense.
> The public's perception and 'common sense' understanding of technology is sadly very limited
Common sense is contextual and one of the more complicated aspects of cognition; it depends on the level of detail in the model being used to make inferences. A model's sophistication is dictated by internal preferences and goals. If most people's understanding of a technology is limited, then they're going to be doing what looks like averaging over distinct possibilities to a more informed model.
It doesn't help that if you know nothing about technical uses of the term autopilot but do know something about words (which will be the case for most people), then in truth it's the aviation industry that coined the misnomer.
I agree, drivers who don't pay attention share the blame too, but it's not fair to blame only the drivers. The "autopilot" requires that the human driver maintain 100% attention, but is the system monitoring the driver properly? Is it trying its best to keep the driver engaged with the road? No, it doesn't do that, because it's not comfortable for the customer, so safety is sacrificed.
This has a strong possibility of making things worse by conditioning the human driver to react with "oh, it's just the auto-pilot testing me", and thus not reacting properly when it does actually malfunction.
That's absolutely wrong. A driver without assist will be fully engaged, while with assist they'll be tempted to divert their attention. Once attention is diverted they're no better than autopilot on its own.
I have several times been a passenger in a car where I wished the human driver would enable autosteer. It is disconcerting to have the lane warning alarm go off five times in a 30 minute span.
I don't think most people realize how erratically they drive.
But five warnings in 30 minutes could just be 15 or 20 seconds of inattention. So more than 99% full attention. A driver assist that just smooths things out only 0.1% of the time could save the day a lot.
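The arithmetic checks out; here's a quick sketch (the assumed drift duration per alarm is a guess, not a measurement):

```python
# Sanity check on the attention fraction.
# The per-lapse duration is an assumption for illustration only.
trip_s = 30 * 60   # 30-minute drive, in seconds
lapses = 5         # lane-warning alarms during the trip
lapse_s = 3        # assumed seconds of drift per alarm

inattentive_s = lapses * lapse_s
attention = 1 - inattentive_s / trip_s
print(f"inattentive: {inattentive_s}s of {trip_s}s "
      f"-> {attention:.1%} attentive")
# prints: inattentive: 15s of 1800s -> 99.2% attentive
```

So even an erratic-looking human driver is attentive more than 99% of the time under these assumptions; the assist only needs to smooth out a sliver of the trip to matter.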
There's a warning that makes it clear that you need to maintain your full attention on the road, when you enable Autopilot and every single time you take your hands off the wheel. If anything, giving users specific scenarios where Autopilot may fail would take away from that fact, and make it seem like there are other cases where it's ok to take your eyes off the road.