
Not at all, because I know that if the driving conditions change and I have a feeling the car won't do well, I take over. In other instances, where I know the car is going to do just fine, like driving on the freeway, I enable it.

FSD/autonomous driving isn't an all-or-nothing proposition for me. I use it when I deem it safe to be used, and drive manually the rest of the time.




Never. Either I drive, someone else in the car drives (aligned incentives), the car is partially autonomous (with me ready to take over), or it's fully autonomous.

I'd take all four of those options but never, ever remote drivers.


Exactly. The only driving-assistance feature I use is adaptive cruise control, and I have no plans to use anything more. If I trusted autonomous systems too much, I wouldn't be ready when it mattered.

The driver should know the capabilities of the car. If he doesn't, then he shouldn't drive it. And if he isn't 100% sure that the autonomous driving system will work as expected, then he should never enable that system.

Unfortunately for anyone relying on so-called "full self driving": even though it may be capable of doing much of the driving, you remain the driver.

It's not even a question of "taking back control". You're expected to be in control of the vehicle.


I'm strongly with you on the second point.

But IMHO it's not full self driving if it requests the driver to take over even once.

If there's an insane storm or something, then it's fine for FSD to recognize that it should disable itself, at which point you drive with 100% control. The middle ground is more like assisted driving, which doesn't seem safe according to most HN comments.


Well yes, it makes sense that you shouldn't be required to take over whenever the car decides, but I'd still like to be able to start up the car and drive it in manual mode if I so desire. I get a lot of pleasure out of driving, and I don't want to give that up even if I want to let my car pilot itself sometimes.

I suppose I would have to experience it first hand to really weigh in on whether I would use it or not, but my initial reaction is that I'd prefer an all-or-nothing approach: either fully automated driving or no help at all. I just can't imagine ever getting used to this type of selective assistance. Someone cutting you off doesn't necessarily mean you're headed for a crash, and no cars in front of you doesn't necessarily mean you can drive 80 mph without caution. If you grow to rely on a system like this, it only takes one incident where the vehicle doesn't believe it's about to crash when it really is for the system to fail you. Or worse yet, the car senses you may be about to crash and applies countermeasures that panic the driver, leading to a crash anyway through overcorrection or something similar.

Not at all. I'm sure that with practice I could outperform the car. But why bother? My view of the automation in my car is that together we both drive better than either of us does on our own.

If the automaker advertises a level of automation where I am not supposed to do anything while the car drives itself (which seems to be the case), of course I blame it in case of an accident: what could I have done to prevent it, except not use the feature at all?

If instead we talk about driving assistance and I am supposed to keep an eye on the car, I am fine with being responsible for any accident.


It's not just you. It's a well-researched phenomenon.

I do think there's the potential to enable full autonomy in well-defined use cases while requiring manual control elsewhere. (Still problematic--how do you enforce?)

But the idea that a driver can take control on short notice from an automated system is a non-starter. Maybe you can assume there's an adult with supervisory control in the vehicle but you certainly can't assume they're prepared to deal with urgent emergencies.


It might not actually matter. Since the car can already operate autonomously, the operator doesn't necessarily need to literally drive the car. They might simply need to step in to verify its actions in unusual situations.

I'm imagining a situation where a car comes across a parked truck on a one-way road (common in cities). A human operator comes into the loop to ensure that it's actually safe to switch lanes and pass, checking for things like emergency vehicles, unusual pedestrians, etc. They don't need to literally take the wheel, just confirm that the vehicle can take a specific action.
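Roughly what I have in mind, as a purely hypothetical sketch (every name below is invented, and it's Python only for illustration):

    # Hypothetical human-in-the-loop flow; nothing here reflects any real AV stack.
    from dataclasses import dataclass

    @dataclass
    class ProposedAction:
        description: str  # e.g. "cross the centerline to pass a parked truck"

    def operator_approves(action: ProposedAction) -> bool:
        # Stand-in for a remote operator reviewing camera feeds;
        # here it's just a console prompt.
        answer = input(f"Approve: {action.description}? [y/n] ")
        return answer.strip().lower() == "y"

    def handle_ambiguity(action: ProposedAction) -> None:
        # The vehicle, not the operator, still executes the maneuver.
        if operator_approves(action):
            print("Confirmed: executing maneuver autonomously.")
        else:
            print("Declined: holding position and re-planning.")

    handle_ambiguity(ProposedAction("cross the centerline to pass a parked truck"))

The key design point: the operator answers a yes/no question about one specific proposed action; the car still does all the actual driving.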


No, I mean what's the point of turning on autonomous driving mode if you have to keep your hands on the wheel and your eyes on the road at all times? That's worse than actually driving the car yourself.

Honest question: do you feel you can safely take back control of the car when driving on a motorway if the "auto pilot" feature makes a random decision you cannot predict?

I don't put any faith in it at all actually. That's why I pay attention while it drives, looking for novel situations, which would include self driving errors, so I can correct them.

You're assuming that you won't already be treated as the responsible party as the "driver" even in an autonomous vehicle.

I wouldn’t trust any driving automation to handle a situation that I couldn’t handle manually, especially if it relies only on sensors equivalent to what I see.

These automations are extremely useful for taking the load off during long, easy drives, not for handling difficult conditions better than a concentrated driver.


From the stories that go into detail about Google's self-driving cars, it really seems like at this point the driver-take-over situation is more like a breakpoint than an actual safety valve. It's there for when the computer has multiple reasonable options and the driver's judgement and awareness are needed to help resolve the ambiguity, not just for that ride but for future rides as well (as a sort of feedback).

I'd suspect that it's harder to take over and avoid a collision in a self-driving car than in a car that you're continuously in control of. You first have to recognize that the system is failing or about to fail, and that has to happen well in advance for you to take appropriate action. It doesn't seem like a reliable failover procedure, even if the person behind the wheel is paying attention.

No. You agree that you are in control of the car.
