I agree, but the problem is: if you are not controlling the throttle and not controlling the steering wheel... what are you doing? I find it tedious, but not particularly hard, to drive for 12 hours in a row. On road trips, most people I travel with fall asleep within a few hours if they are not driving.
It seems like human nature to watch the car avoid 99 problems, fall into a false sense of security, and stop paying attention. Sleep, cell phone, talking to a passenger, or just daydreaming all seem ever more likely the longer you let the car drive.
Is it two people so far who have fatally broadsided an 18-wheeler in a Tesla on Autopilot?
So an attentive human + FSD sounds great. Not sure humans are going to be able to stay attentive, though. Sure, at some point inattentive human + FSD will be safer than a standalone human, but it doesn't seem like that's happened yet.
Tesla seems a bit evasive with the safety data. Maybe it's because a human with Autopilot off (but safety braking on) is just as safe as a human with Autopilot on.
I agree with you right now, but is that still an issue for self-driving cars? Ideally I want to get in, say "drive to work", and then take a nap. If I still need to pay attention, then it's not really a self-driving car, just an upgraded form of cruise control.
But IMHO it's not full self-driving if it requests the driver to take over even once.
If there's an insane storm or something, then it's OK for FSD to know it should disable, and then you have to drive with 100% control. The middle ground is more like assisted driving, which doesn't seem safe according to most HN comments.
Do people really stare at the license plate in front of them anyway, though? It was made quite clear during driver training courses many years ago not to do this, and that your glance pattern should always start from and return to near the horizon (whether that be hundreds of yards in a city, or miles away on a highway).
I've always had the opposite problem: if I'm not maintaining situational awareness to the horizon, I'll inevitably fall asleep after an hour or two out of sheer boredom. I prefer to drive long distances in manual-transmission vehicles (there's more to do!), and commercial vehicles with loads requiring careful management of engine RPM are even better. Those I can drive for 10-12 hours in a day without issue. I never saw myself as anything other than an ordinary driver.
In my case, Tesla's Autopilot is the worst possible compromise. It removes enough of the attention requirement that it feels much harder to stay attentive, but without automating the process completely. Other people I've talked to who operate large pieces of mechanical equipment have often said similar things about automation: it's best to be either a) fully manual or b) fully automated for a given stretch of time.
That is essentially what Tesla and other car manufacturers are doing, isn't it? However, this will always require a human to be paying attention. Highway driving is boring until it suddenly isn't. Driving perfectly on a highway is just as difficult as driving perfectly in a city.
As automation becomes better, humans will pay less and less attention to the road, making this technology somewhat dangerous in the interim. There has already been a fatal accident involving a Tesla on Autopilot where the driver was reportedly watching Harry Potter.
According to the TOS, and given Tesla's statements, the system needs to be supervised by a 100% attentive human 24/7. From what we know of human physiology, and what we have observed from previous self-driving crashes, this level of attentiveness is impossible to maintain... without being the one actively driving the car.
The best way to stay attentive behind the wheel is to actually be the one driving. There's virtually no way of being both attentive and passive for long periods of time. If a driver can't check their email or whatever on their phone while autopilot is on, then autopilot is not safe to put in cars. And while I'm okay with not-exactly-safe for most things people willingly consume or use, driving is not an area where it's okay to roll out a feature that may cause people to stop paying attention to the road.
From my point of view, the actual act of driving is not that difficult (after the first 50k miles or so). The issue is the mental effort required to continually pay attention to what's going on to drive safely.
Tesla's system still requires me to pay attention to what's happening to exactly the same degree as normal because I might need to intervene (and in fact makes it harder to be ready to do so). All it does is take away the (to me) trivial aspects of pressing a pedal and turning a steering wheel.
Whereas this system introduces a set of circumstances in which I don't need to drive at all. No need to pay attention at all. And it covers the most tedious form of driving there is: stop-start traffic on a motorway.
That's the goal of autonomous vehicles. Not Autopilot. Surprisingly, not "full self driving" either. In both of those you have to be fully alert because the car can do something very dumb at any moment.
FSD and Autopilot on highways are still bad enough that sober people are mostly going to be paying attention. I think this approach is doomed, though: there is going to be a dangerous period where the system is too good to pay sustained attention to, but not good enough that you don't have to pay attention. The way Cruise and Waymo are doing it seems much, much safer.
The problem isn't "you don't have to pay attention to X"; it is "you don't have to pay attention to anything during normal operation". It is human nature, if not outright physiology, that it is incredibly difficult to keep someone engaged and paying attention to a task that they are not actually doing.
The problem with halfway solutions like Autopilot isn't that they control the car some of the time; it's that they can, essentially without warning, stop controlling it.
Self-driving cars will happen. But the current model is essentially "do enough of the driving to maximize the likelihood that the human 'driver' is not going to pay attention, but don't do enough to make their lack of attention safe".
My statement isn't "cars allow you not to pay attention"; it is "systems like Autopilot actively encourage a lack of attention". That's backed up by the need for alert systems triggered by obvious signs that the driver isn't paying attention, like no longer holding the wheel. If the system didn't, in general usage, result in a lack of attention, then such alerts would not be necessary.
Telling people to remain alert will do no good because it is simply asking too much. A person is supposed to sit still, not driving but maintaining their full attention, while being ready to take over automatically in an emergency. This will lead to the same failure of vigilance over time we see in guard duty, and it really can't be helped. Task-unrelated thoughts crop up more frequently for simpler tasks; with few stimuli, few surprises, and little to attend to, the brain will begin experience replay (that's really a machine learning term, but it's appropriate here).
Driving already leads to mind-wandering states; it is overwhelmingly likely that the passive nature of Autopilot will lead to mind wandering at even higher rates. Asking a less practiced driver to shift attention from internal to external states very quickly, and then make complex judgments, is simply not fair. A simple physics- and statistics-based model will be far more reliable. If it isn't, then Google's strategy of shooting straight to Level 4 makes sense.
> The public's perception and 'common sense' understanding of technology is sadly very limited
Common sense is contextual and is one of the more complicated aspects of cognition; it depends on the level of detail in the model being used to make inferences, and a model's sophistication is dictated by internal preferences and goals. If most people's understanding of a technology is limited, then they're going to be doing what, to a more informed model, looks like averaging over distinct possibilities.
It doesn't help that if you know nothing about technical uses of the term "autopilot" but do know something about words (which will be the case for most people), then in truth it is the aviation industry that picked the misnomer.
Even if it was on, failing to act would be the problem, because you are supposed to pay attention while it is on. So it's by definition a human error.
And with some drivers being a danger to themselves and others, you might legitimately wonder whether FSD wouldn't be the better alternative.
People die every day on roads. Mostly because they are being people and get drunk, tired, distracted, reckless, etc. It's very likely the driver ticked one or more of these boxes. The simple explanations are usually the right ones.
The issue is that randomly requiring non-professional drivers to take over during a trip is an underestimated danger. People get accustomed to it working 99% of the time and get lazy; then, the 1% of the time it fails, the driver is asleep or distracted or watching a movie, and it becomes a major safety issue.
E.g., all the horror stories of distracted Tesla drivers using Autopilot like fully autonomous driving today.
I don't really care what it's named. I care about the implications of the system in the real world. Humans are notoriously bad at paying attention to boring tasks. This applies to pilots as well as ordinary drivers (thus the massive amount of training, redundant operators, and very detailed operating manuals/procedures that apply to ATPs). Tesla released a system that encourages people to operate their car with less than 100% focus. That's bad.
I'm not aware of similar reports of crashes/deaths with systems like Subaru's EyeSight or Honda's Sensing suite. Or, on the other end of the spectrum, Mercedes's Drive Pilot.
Exactly. Some of these cars do not even have 'Driver Monitoring', which means the car doesn't track whether the driver has their eyes on the road at all times, and that puts many other drivers at risk.
On top of that, FSD is still admittedly Level 2, not exactly 'Full Self-Driving'. And the controls can easily be tricked into thinking the driver has their 'hands on the wheel', which is not enough to determine driver attentiveness while FSD is switched on.
Anyone who drives a car is technically, legally, and ethically responsible for being capable of operating the machine and for doing so in a responsible manner.
Tens of thousands of people a year fail to do exactly that. Tens of thousands of fatalities every year, not to mention maimings and injuries, and countless billions of dollars in property damage and lost wages, because humans generally fail in their legal and ethical responsibility to operate their motor vehicles responsibly and in conditions they can handle.
There have been a few cases where Tesla drivers have made the same failing. Sometimes the Autopilot feature was engaged at the time. Sometimes they were just pushing the vehicle too hard and ended up on the wrong side of physics. The Tesla is stupidly fun to drive, and sometimes I drive more spiritedly than I should. Being too fun to drive can get people hurt or killed too (and it has).
I strongly believe a responsible driver is safer with an Autopilot feature than without, mostly based on my own personal experience, since the data point I used to cite is somewhat controversial.
Just like the amazing cornering and torque of the Tesla can be abused and even lead to fatalities when pushed too far, so can Autopilot.
I personally think it's a mistake to make the feature significantly less useful to a responsible driver in order to try to prevent these edge cases. If it were possible to make the attentiveness features entirely non-intrusive, then absolutely they should be added. But in reality the attentiveness features are already intrusive to a responsible driver and detract from the experience.
Interestingly, Volvo seems to be going in a different direction. They believe that, as the manufacturer, it's their responsibility to create a product in which even a human attempting to operate it irresponsibly or illegally is kept safe. They're adding hard speed limits well below the functional limit of the hardware, and even contemplating systems like breathalyzers and fatigue detection that would entirely disable the vehicle.
I personally don’t want to live in a world where every product I use is sizing me up and deciding how I should use it, whether it’s a chef’s knife, jet ski, automobile, or semi-automatic.
In the meantime, what I love most about Autopilot is how it can only get better over time, and every single car in the fleet benefits from that. That is, as long as the regulators don't fuck it up.
I'm in such a vehicle too, and I think you're crazy to want that. If they spend 6 hours unnecessarily staring at the road while autopilot is on, they're not going to be anywhere near peak driving condition when they do take over.
"I have no trouble staying alert this way when doing medium long drives. Long highway drives where autopilot is so good that it requires no manual interaction is where the trouble starts and I find it hard to keep paying attention."
Exactly!!!!!!
The trouble is when your attention wanes and you don't know exactly when you should be paying attention. What I've found is that the small amount of automation my car does is wonderful for keeping my exhaustion down. When I drove for 4 hours in my 2012 Prius, it was a chore; I had to do ALL the mental math myself. When I drive my 2023 CX-50, I have to pay attention at least 30% less, and that 30% is a massive difference; it feels like 90%. And any time my attention wanes, the car starts complaining because I am drifting or not responding quickly enough to the road. It becomes a "pay attention quickly" wake-up call that happens within seconds of attention waning.
The worry is when the car doesn't snap you back into that attention mode, and you just trust it, right up to the problem.
Circumventing the attention monitoring systems isn't the issue. Those exist purely to shift liability; they serve no other purpose.
We _know_ from innumerable studies, and from direct experience in aviation, that saying "this will be done automatically" while also requiring absolute attention is not something humans can do. The only people claiming the contrary are the self-driving car people, and only so that when their self-driving system fails they can say "the person in the driver's seat is the person actually driving, and it's their fault, not our faulty software".
This is without considering their other failure mode: disabling without warning, often immediately prior to crashing. Autopilots in aircraft can say "something's wrong, I'm disengaging" and warn the pilots, because any time the autopilot is engaged and the pilots are not actively flying the aircraft, they're at altitude and have a significant amount of time to react. The few events that do require immediate responses have extremely constrained responses: essentially push down or pull up (ground proximity, stall, or TCAS alerts). None of this is the case for road vehicles.
The time on a road vehicle between something going wrong and impact is generally less than 3 seconds (I recall Teslas giving 2 seconds of notice). In that time the now-driver has to become aware of the change in state, develop situational awareness, and then start to react. Then the actual correct course of action has to complete, which also takes time.
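To put rough numbers on that, here is a back-of-envelope sketch; the 65 mph speed and 1.5 s perception-reaction time are illustrative assumptions on my part, not figures from Tesla:

    # Back-of-envelope: distance covered during a 2-second handoff warning.
    # Speed and reaction time are illustrative assumptions, not measured data.
    MPH_TO_MS = 0.44704

    speed = 65 * MPH_TO_MS    # ~29 m/s at a typical highway speed
    warning = 2.0             # seconds of notice before impact (as recalled above)
    reaction = 1.5            # rough perception-reaction time, unprepared driver

    print(f"Distance covered during warning: {speed * warning:.0f} m")    # ~58 m
    print(f"Distance gone before driver acts: {speed * reaction:.0f} m")  # ~44 m
    print(f"Time left to actually maneuver: {warning - reaction:.1f} s")  # ~0.5 s

On those assumptions, the driver has roughly half a second of the warning window left to actually steer or brake, having covered most of the distance before even touching the controls.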
IMO, if a self-driving system hands off control of the vehicle less than 10 s prior to an accident that it is responsible for, the fault is the manufacturer's. Obviously a self-driving system can't be held responsible for an accident caused by another vehicle.
If the manufacturer's self-driving system is unable to recover from whatever is going on, it's reasonable to give up and offload to the human in the driver's seat, but it is not reasonable to then say that the human was responsible for an accident they were unable to recover from.
Again, all this nonsense about trying to avoid being in charge of the car is simply a symptom of the actual problem, which is that self-driving cars today are unsafe. It's obviously fairly awful of those people to knowingly circumvent the safety systems of something they are aware is already unsafe, but it's the manufacturer's fault for selling an unsafe product in the first place.