I have several times been a passenger in a car where I wished the human driver would enable autosteer. It is disconcerting to have the lane warning alarm go off five times in a 30 minute span.
I don't think most people realize how erratically they drive.
But five warnings in 30 minutes could just be 15 or 20 seconds of inattention out of 1,800 seconds, which is still more than 99% full attention. A driver assist that smooths things out just 0.1% of the time could save the day a lot.
I hate to put it this way, but I think it'll be easier to get people to use autopilot (that performs better than your average driver) than it will be to get people to get gud at driving.
You're an attentive driver, right? You've probably noticed people lane changing without looking, causing or almost causing collisions. Or jockeying for position fruitlessly. Or being drunk or getting sleepy. Or getting aggro. Back when I was driving a lot I would have been grateful for anything that meant I wouldn't have to drive around people like that.
I dunno where I'm going with this exactly. I suppose we've had similar experiences, and it makes me a little surprised that you'd be against automated systems bringing the skill floor up.
This is why many are worried about Level 3 autonomy vehicles -- that is, driver assist where the car can drive ~95% of the time, but the driver may need to intervene at any moment. So far all the studies I've seen indicate that the driver loses attentiveness whether they mean to or not, and the disengagement times are in multiple seconds.
Personally, I think a system needs to work well enough to drive itself the whole trip, or be dumb enough to act as cruise control with some crash prevention built in, but otherwise force the driver to be on task.
The problem isn't "you don't have to pay attention to X", it's "you don't have to pay attention to anything during normal operation". It's human nature, if not outright physiology, that it is incredibly difficult to keep someone engaged and paying attention to a task that they are not actually doing.
The problem with halfway solutions like autopilot isn't that they control the car some of the time, it's that they can, essentially without warning, stop controlling it.
Self driving cars will happen. But the current model is essentially "do enough of the driving to maximize the likelihood that the human 'driver' is not going to pay attention, but don't do enough to make their lack of attention safe".
My statement isn't "cars allow you not to pay attention", it is "systems like autopilot actively encourage a lack of attention". That's backed up by the need for alert systems that trigger on obvious signs that the driver isn't paying attention, like "no longer holding the wheel". If the system didn't, in general usage, result in a lack of attention, such alerts would not be necessary.
That's absolutely wrong. A driver without assist will be fully engaged, while with assist they'll be tempted to divert their attention. Once attention is diverted they're no better than autopilot on its own.
Even if you buy the known nonsense of the human driver being expected to pay 100% attention at all times and be able to instantaneously take control, why are you blaming the driver?
It’s called full self driving: even rudimentary crash avoidance from 10+ years ago could avoid this.
I really don't like autopilot. It's good enough to make drivers trust it and not pay attention, but it's not good enough to not kill people when they do that. And when there's an accident Tesla comes out and says "the system warned the driver to put their hands on the wheel" or something similar. Unless a car can 100% self-drive, driver aids should require the driver to have hands on the wheel and be paying active attention at all times.
Why do we need a feature to steer for the driver or press their brakes? What is the new acceptable % of driver involvement in the activity? Is it now 95%? I don't understand why everyone seems to be OK with this grey area of man/machine responsibility. I've heard far more stories of lane keep assist doing something unexpected and causing panic than I have of it saving someone's ass.
Still, you can have lane control without all the rest of alleged autopilot problems, as well as automatic speed control and emergency braking.
The system is supposed to immediately start slowing the car when it detects you're not responding.
Keeping hands on the wheel is not good enough; an active attention system similar to the ones used on trains should be employed, and a sharper one at that.
Something as simple as a steering wheel button that has to be pressed within half a second of a signal lighting up, or else the automatic emergency system engages.
Perhaps even in all cars, not just automated ones, though it's hard to implement safely without automatic steering and a lane-keeping system (rough sketch of the idea below).
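A minimal sketch of what that train-style vigilance check could look like, assuming a hypothetical vehicle interface (`is_moving`, `light_signal`, `button_pressed_within`, `engage_emergency_stop`); the half-second response window comes from the comment above, while the prompt interval is an assumption:

```python
import random
import time

RESPONSE_WINDOW_S = 0.5       # driver must press the wheel button within this window
CHECK_INTERVAL_S = (30, 90)   # assumed: randomized so the prompt can't be anticipated

def vigilance_loop(vehicle):
    """Periodically light a prompt and require a button press, dead-man's-switch style."""
    while vehicle.is_moving():
        time.sleep(random.uniform(*CHECK_INTERVAL_S))
        vehicle.light_signal()                            # hypothetical: turn on the prompt lamp
        if not vehicle.button_pressed_within(RESPONSE_WINDOW_S):
            vehicle.engage_emergency_stop()               # hypothetical: controlled braking to a stop
            break
```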
If you take the human factor away, a 5-second advance warning is quite a lot. The car could have slowed down to a possibly non-fatal speed. By asking the human to act, the autopilot is actually throwing away at least 2 of those precious 5 seconds.
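Rough numbers to illustrate, assuming about 8 m/s² of emergency deceleration (a typical dry-road figure) and an assumed starting speed of roughly 70 mph (~31 m/s):

```python
# Rough illustration: speed remaining if the car brakes for the whole warning
# window vs. only after ~2 s spent waiting for the human to react.
DECEL = 8.0   # m/s^2, assumed emergency braking on dry pavement
V0 = 31.0     # m/s, roughly 70 mph, assumed starting speed

for braking_time in (5.0, 3.0):   # full 5 s window vs. 3 s left after a 2 s human reaction
    v = max(0.0, V0 - DECEL * braking_time)
    print(f"braking for {braking_time:.0f} s: {V0:.0f} m/s -> {v:.0f} m/s")
```

Under those assumptions the car stops entirely if it brakes for the full 5 seconds, but is still doing about 7 m/s (~25 km/h) if 2 of those seconds are spent waiting on the human.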
That's some PR firm-level explaining. First, autopilot is not just lane assist, it's lane assist + adaptive cruise control + additional features (it can change lanes if you tell it to). Other OEMs have purposely shied away from enabling an autopilot-like feature set together, even though it's technically possible, because they are absolutely terrified of something exactly like this situation happening.
Second, you're completely ignoring that autopilot will very much entirely drive the car by itself for minutes at a time (reports are anywhere from 5 to 15 depending on software version), which will obviously encourage drivers to not pay attention to the road. I don't care about disclaimers, I don't care about nag screens or chimes, and I definitely don't care about some warranty text that flashes on the screen. If you make a car that can drive itself you encourage drivers to let it do just that and you should be prepared for that eventuality. Period.
I think having partially self-driving cars that require active driver engagement is a huge safety problem, because if drivers are able to just sit there and not do anything 90% of the time, they just aren't going to pay attention ever. Some driver aids like automatic braking are probably fine, but 'getting from A to B' is probably something that shouldn't be available until it's as reliable as a human driver.
These are just another “feature” to compensate for inattentive drivers, like adaptive cruise control, auto emergency braking, lane assist and so on. The problem is that the less a driver has to pay attention to, the more bored and distracted they will be.
This is the problem; once you have to pay attention at that level (and I agree that you do, to avoid accidents), what's the point of the automation? The taxing part of driving is not the physical act of slightly turning a steering wheel or pressing a brake pedal but the mental processing power it takes to know when.
This "driver aid" model itself is starting to sound like a problem to me. You either have safe, autonomous driving or you don't.
A model where a driver is assumed to disengage attention, etc. but is then expected to re-engage in a fraction of a second to respond to an anomalous event is fundamentally flawed, I think. It's like asking a human to drive and not drive at the same time. Most driving laws assume a driver should be alert and at the wheel; this is what...? Assuming you're not alert and at the wheel?
As you're pointing out, this leads to a convenient out legally for the manufacturer, who can just say "you weren't using it correctly."
I fail to see the point of autopilot at all if you're supposed to be able to correct it at any instant in real-world driving conditions.
To a point. There is an inflection point when the AI becomes a crutch that allows the human to stop paying attention. Tesla may be at/near this point. For instance: lane assist is great until it causes people to take their hands off the wheel and stop looking at where they are going. I'd rather see such systems implemented as enforcement mechanisms. Let the AI prevent a car from drifting out of its lane. Let it monitor the lane position and scream at the driver when they start to drift. Don't let the AI become a comfortable crutch that allows the driver to take their mind off the task.
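As a sketch of that enforcement idea, assuming hypothetical `lane_offset_m` and `sound_alarm` hooks and an assumed drift threshold: the system watches lane position and alerts the driver rather than quietly steering for them.

```python
import time

DRIFT_THRESHOLD_M = 0.4   # assumed: metres off lane centre before the driver is warned

def lane_drift_monitor(car):
    """Warn the driver about lane drift instead of correcting it for them."""
    while car.is_moving():
        if abs(car.lane_offset_m()) > DRIFT_THRESHOLD_M:   # hypothetical lane-position sensor
            car.sound_alarm()                              # alert only; never takes over steering
        time.sleep(0.1)
```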
The best solution is to keep the driver engaged, i.e. still holding the wheel and showing they are paying attention. It's how all the other cars with lane assist do it. You can't go more than 30 seconds without touching the wheel before it complains.
It's not perfect but it sure beats the driver being so used to it they start doing other things like watching movies on their laptop.
I agree, but the problem is, if you are not controlling the throttle and not controlling the steering wheel, what are you doing? I find it tedious, but not particularly hard, to drive for 12 hours in a row. On road trips most people I travel with fall asleep within a few hours if they are not driving.
It seems human nature to watch the car avoid 99 problems and fall into a false sense of security and stop paying attention. Sleep, cell phone, talking to a passenger, or just daydreaming seems ever more likely the longer you let the car drive.
It's 2 people so far that have lethally broadsided an 18-wheeler with a Tesla on autopilot?
So an attentive human + FSD sounds great. Not sure humans are going to be able to stay attentive though. Sure, at some point inattentive human + FSD will be safer than a standalone human, but it doesn't seem like that's happened yet.
Tesla seems a bit evasive with the safety data. Maybe it's because a human with autopilot off (but safety braking on) is just as safe as a human with autopilot on.