
>What I am against is the premise that people are so dangerous and terrible at something they've been, mostly, able to do safely for about a century that their freedoms need to be taken away.

People are dangerous and terrible drivers. There are 30k vehicle accident deaths each year in the US (1.3 million worldwide), and it used to be much higher (especially per mile driven). The number of deaths per mile driven has been dropping drastically, in large part because technology has limited the driver's direct control of the vehicle.
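
For context, here is a rough conversion of that headline number into the per-mile rate used later in this thread. The annual vehicle-miles figure is my assumption (roughly the FHWA ballpark), not something from the comment:

    # Converting the headline death toll into a per-mile rate.
    us_deaths_per_year = 30_000        # the figure cited above
    us_vehicle_miles = 3.2e12          # assumed annual US VMT (FHWA ballpark)
    rate_per_100M = us_deaths_per_year / us_vehicle_miles * 100e6
    print(f"{rate_per_100M:.2f} deaths per 100M miles")   # ~0.94

That's the same ballpark as the 1.13 fatalities per 100 million miles cited further down the thread.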

> Autonomous cars are fine - trying to take for granted that personal freedom is dangerous and stupid and that people need to be controlled like herd animals is not necessarily a good thing.

ABS/traction control/drive-by-wire takes direct control away from the driver and gives it to a computer. Self driving cars are just the next layer of abstraction above this. You're still in control of the car, it's just that the interface is a touchscreen with a map instead of a steering wheel.




> On the subject of danger. Literally everything we do is dangerous to some degree.

I don't disagree with your take, and I'm a self-driving-car proponent, but I'm worried about the process we'll take to get there.

One thing I've taken away from the pandemic is that people seem to have no problem imposing their tolerance for risk on others. Seems like we are on a path to play this dynamic out again in how self-driving cars come to market unless that safety profile is really well controlled and understandable.

Even if, at a population level, self-driving is statistically slightly safer than human driving, there are enough edge cases to give me pause right now; at the individual level it may raise my risk as either a pedestrian or a driver, and it certainly changes what counts as predictable behavior [1].

[1]: https://arstechnica.com/cars/2021/04/why-its-so-hard-to-prov...


>> If we can have self-driving vehicles that are twice as safe as the average human, then they should be allowed on the streets.

Absolutely disagree with this approach. Self driving cars can't be merely slightly better than your average human - they need to be completely flawless. Just like an airplane autopilot can't be just a bit better than a human pilot, it needs to be 100% reliable.


> A lot of us are skeptical of self-driving cars because a lot of their advocates want the next step to be banning private citizens from driving, which you've just confirmed, thanks.

Lol, I think you're mistaking my (attempted) logical thinking for some kind of agenda to ban humans from driving cars. You seem far more agenda-driven, heh.

Just because I think we are extremely unqualified for driving does not mean I want to ban the act. A ban would be extremely improbable to achieve; the cost of upgrading the entire infrastructure alone makes your reading of my position seem like a half-hearted joke. Moreover, even in some magic realm where we could make every car self-driving and abolish human driving, I'm simply not in favor of removing human rights.

Yes, I would want to make it more difficult for you to drive, but for your own (and my!) safety. Our safety standards for driving tests are incredibly, and I mean incredibly, laughable. How rarely we enforce driving rules and laws is also laughable. Almost no one follows even the most basic rules, like speed limits, and the more extreme offenders routinely break sobriety laws.

Self-driving cars would give us a platform to (a) raise the bar for who can drive and what sort of training you need, and (b) be more willing to ban someone from driving if they put others at risk through speed, drugs, alcohol, or whatever else.

We have a hard time banning drivers these days because not being able to drive very negatively impacts your life. In a world with self-driving cars, though, you can still have a first-class life. Driving becomes a leisure activity.

Anyway, nice try at putting words in my mouth - next time make them a little less tinfoil though, please.


> I think the problem we have with self-driving cars is more social than technological at this point.

> it's a hypothetical so give me some leeway on this!

IMO you should not base (and broadcast) your opinions about safety on hypothetical statistics. I don't even believe it's true that the overall statistics show self-driving is safer than humans; IIRC, prior reports showed that companies were cherry-picking their safety statistics.


> I don't believe that partially automated driving, as shown by Autopilot is fit for human use. People will rely on it as if it's fully automated and more accidents like this will happen.

Keep in mind 125 people are killed on the roads every single day in the US. If all vehicles magically switched to this system overnight and the system only killed 100 people per day, that would be a huge win.
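
For scale, here is a back-of-envelope check of those numbers. The annual total is my assumption (roughly in line with recent NHTSA estimates), not a figure from the comment:

    # Sanity-checking the "125 deaths per day" figure.
    annual_us_road_deaths = 45_000     # assumed, ~NHTSA ballpark
    deaths_per_day = annual_us_road_deaths / 365
    print(round(deaths_per_day))       # ~123

    # The hypothetical system above that kills 100/day instead:
    lives_saved_per_year = (deaths_per_day - 100) * 365
    print(round(lives_saved_per_year)) # ~8,500 per year

Even that modest improvement would save thousands of lives annually.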

The fact some people will be killed by automated driving systems is not enough to ban them. If it was, we would ban all cars.

The only important statistic is whether those systems kill fewer people than the current state of affairs.


> The odds of me dying in a car crash is lower than the average as a result of precautions I take to be safe.

I'm not specifically accusing you of this, but consider that more people than is numerically possible believe that they're better/safer than the average driver. There are a lot of people who believe they are much safer drivers than they actually are.

Regardless, just because you believe that you personally will be a safer driver than a computer, we should scrap the whole thing? What about all the people who aren't better drivers than the computer? Let's assume for a moment that you actually are safer than the eventual self-driving systems that are approved for general use -- which is by no means a certain assumption to make -- then maybe you just don't use or ride in a self-driving car? It's your choice, after all (especially in a place like the US, where I imagine manual-drive car ownership in a self-driving world will end up nearly as closely protected as firearm ownership). And sure, maybe someone else's self-driving car might hit you and kill you, but someone else's human-driven car might do the same. And if self-driving cars are doing that at lower rates than humans are, it's still a net win.

I think many people are taking this weird view that even though a self-driving car might make fewer mistakes (and cause fewer deaths) overall, it's somehow a worse situation that they'll likely make different mistakes than a human would; that is, a self-driving car might kill you in a situation where a human driver would save you. And that somehow makes the whole thing not worth it. I just find that line of reasoning to be flat-out wrong. It's an emotional appeal to some illusion of control. (Of course, unfortunately, logic doesn't write laws when it comes to contentious issues... emotion does.)

> The focus on potential glitches is because it's something the driver has no control over.

This is pretty short-sighted, because there are a ton of things that you have no control over when you drive your own car, and yet you've decided (in many cases likely unconsciously) that those things are acceptable risks.

I'm not saying you should ignore the possible risk of glitches, but focusing on a number that we don't even know yet, and immediately assuming that it will be too high for your risk tolerance is... a bit weird?

And that's the thing: I don't expect self-driving systems that have equal or worse crash records than humans do will be approved for use. And if they are, people will (rightly!) reject them. So any approved, accepted self-driving system will end up causing fewer deaths. Some of those deaths will be caused by outright bugs, and others will be caused by situations that a human driver would not be able to recover from either. All deaths are tragic, but fewer deaths overall is what we should be -- must be -- aiming for. Not playing games with control illusions. Not arbitrarily deciding that certain failure modes are somehow less acceptable than others when they cause the same (or even fewer!) deaths.

My position -- and what I believe to be the only logical, community minded position -- is that the glitch rate does not matter one bit. The only thing that matters is the overall death rate, and if self-driving cars have a lower death rate than human drivers, that should be enough. And if they don't, they should not be approved for use, and people will rightly reject them anyway.

I do agree with you that companies building self-driving systems need to be liable for mistakes and negligence to the same degree as human drivers are. Unfortunately that's harder to prove, but it's a necessary thing to figure out.


> You still have to stay on the roads and use a seatbelt and use your turn signals 100% of the time, etc.

No, you don't. If you're not on a road, there's no need for any of those. You don't even need a license. Should those use cases be taken away?

> They would be building self driving cars if they cared about safety.

Imagine, for a moment, a car with all of that data that it collects distilled down into a HUD for the driver. Can you honestly tell me that it wouldn't be better than the status quo? Should we not ask for such a thing, since it doesn't remove the human from the equation?

There will never be a 100% safe automatic pilot - putting off adding safety features to cars today until we get that perfect system is, to me, really quite silly.


> My gut feeling is we're perfectly happy with self-driving cars that are even substantially worse than human-piloted ones, so long as they crash the way humans do. Rear ending in the dark, okay. Barreling full speed into a highway divider [1] or dragging pedestrians to get out of traffic [2], not.

No. I’m very much not OK with this. Many human drivers are terrible. Most collisions are entirely avoidable, and I certainly don’t want more of the same from computers. If incompetent driverless cars keep putting my life at risk more often, I’m going to be pissed! Maybe people who spend their time in cars agree with you, but as a cyclist and pedestrian, the common accident modes are the ones that pose the greatest risk to me. Pinball off the Jersey barriers all you want, but stop passing bikes too close and plowing through crosswalks just because you can get away with it.

I’ve been hit twice on my bike, both times 100% the driver’s fault. The stakes are life and limb for me, so I absolutely do not accept a higher accident rate.


>It seems to me that self-driving cars can't be just as safe as the status quo, but have to be far far better. It's an unreasonable need, but human nature.

What would be the point of self-driving cars which were no safer than the status quo? That's just removing freedom from drivers and adding more complexity to infrastructure for no added benefit.

It's also exactly the narrative that proponents of self-driving cars have been driving (pun intended): that self-driving cars would eliminate nearly all, if not all, accidents and fatalities.


> In their current state self driving cars would kill far more people than humans if all cars used self driving 100% of the time, and no one knows if we will ever move past that.

Really? No one knows whether self-driving cars will ever improve on today's technology? I'm going to stick my neck out and say: yes, I know we will move past that.


> There’s a strange amount of anti-car propaganda that has gotten people worried about this, but I look forward to a driverless future in which cars are cheap, clean, safe, and available to all.

It’s not propaganda but jumbled concerns which are often poorly expressed. I think the strongest arguments are:

1. Self-driving cars don’t change pollution: even EVs, while better for local air quality, still cause massive carbon emissions and unchanged or worse tire particulates, and the extra mileage from taxi fleets may even make local pollution worse.

2. Self-driving cars only modestly improve congestion, and then only to the extent that they can coordinate and non-AI drivers can be banned from certain chokepoints at certain times. The form factor unavoidably needs far more space per passenger than any alternative.

3. Self-driving cars don’t really help with affordability: even if prices come closer to parity with today’s, that’s a financial stress for many people (e.g. in the region where I live, the average family spends as much on vehicles as they do on food).

4. Self-driving safety needs a different relationship with the manufacturer. There are many areas where these cars can be safer, but failures can also be correlated, so we really need companies to share liability and accept rigorous safety oversight.

As a pedestrian, I’m fairly bullish on the concept, given how much more dangerous the average driver is now compared to 20 years ago, but I worry that a lot of politicians will ignore the other issues because those require hard choices, whereas it’s very compatible with American culture to say you can solve major problems by making an expensive purchase. These shouldn’t be opposing issues, of course, and I’d really like to combine them, because autonomous vehicles should soon be, if they aren’t already, much better about following speed limits, staying out of bus lanes, etc. Making advanced automatic braking a requirement to enter a city could save thousands of lives every year.


> all self-driving cars have to do is be a little bit better for net safety to improve.

That's true abstractly but ignores several important real-world factors about the adoption of self-driving cars.

On the one hand, autonomous cars have to be a lot better than humans to avoid these sorts of PR trash-can fires, or they won't be given the opportunity to improve net safety.

On the other hand, people are so bad that we're liable to soon live in a world of autonomous cars, regardless of the effect on net safety.

I hope they can be made safe, because it's vital for the future of our car-obsessed culture. But I don't have as much faith as you.


> How many people are killed every day by human drivers? Would it be better if all motor vehicles switched to autonomous and the death by automobile accident rate drastically lowered, but was still greater than zero?

The problem with this is the assumption that autonomous vehicles are good enough that switching everyone over would even be a net win. Accidents are already rare per mile, so evaluating safety is hard. And the makers of autonomous vehicles are incentivized to fudge their data, as it's potentially a massive market.

And that's to say nothing of the fact that there's no clear quality control on what counts as autonomous. As an extreme example, a shitty Arduino app hooked up to servos and a single webcam could count, and it almost certainly would not be safer than humans.
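
As a sketch of why that evaluation is so hard, consider how many miles you'd need to drive just to observe enough fatal events for a stable estimate. The baseline rate matches the figure cited elsewhere in this thread; the 100-event threshold is an arbitrary illustration, not a formal power analysis:

    # Miles of driving needed to observe N fatal events at the human rate.
    human_fatality_rate = 1.13 / 100e6   # fatalities per mile
    events_needed = 100                  # crude threshold for a stable estimate
    miles_needed = events_needed / human_fatality_rate
    print(f"{miles_needed / 1e9:.1f} billion miles")   # ~8.8 billion

No autonomous fleet has driven anywhere near that many miles, which is exactly why fudged or cherry-picked statistics are hard to catch.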


> If you're going to go down that route (no pun intended), why not eventually ban everything but officially certified driverless cars from public roads?

I agree. That's the way it's going to go. Humans kill 40K people a year in the US simply by driving, and injure or maim hundreds of thousands more. There's no way self-driving cars aren't better than that.

Want to build your vroom vroom car? Own the entire stack down to the atoms? You'll get to drive it on track days, at a track, not on a public road.

> Personally, I prefer the latter even if it means I could get killed at any moment because the risk is all part of the experience; not only of driving but really just life itself.

Agree, but that sentiment will die a slow death over the next few decades, just as those fond of the horse and buggy are no longer with us.


>> it's unacceptable for automation to produce a worse result than a human in the same situation, with the same information

> I don't think anyone disagrees on that.

I disagree on that. If there's an autonomous vehicle that is better than a human in most situations and worse in a few, such that the overall accident/death rate is lower, and there is no reasonable way to identify the rare dangerous situations in time to disable the autopilot, I would want to drive that car and would advise others to do so.

In fact, if there were an autonomous vehicle that was almost exactly as safe as a human but slightly more dangerous (say, a 10% higher death/accident rate), I would frequently use it, because the large benefits outweigh the minor statistical costs. (Indeed, I use a car at all because of its benefits over walking, busing, or staying at home, despite the higher rate of death.) If other people understood the risks, I would suggest that they do likewise.
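
To put that "10% higher" figure in absolute terms, here is a quick illustrative calculation. The baseline fatality rate and annual mileage are my assumptions, not the commenter's:

    # Absolute extra risk from a hypothetical 10%-worse system.
    human_rate = 1.13 / 100e6          # deaths per mile (assumed baseline)
    av_rate = human_rate * 1.10        # hypothetical 10% worse
    miles_per_year = 12_000            # assumed annual mileage
    extra_annual_risk = (av_rate - human_rate) * miles_per_year
    print(f"{extra_annual_risk:.1e}")  # ~1.4e-05, about 1 in 74,000 per year

That extra risk is a tenth of the baseline risk the commenter already accepts by driving at all.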


> I have trouble fathoming a position of skepticism around the benefits of self-driving vehicles.

I don't have trouble being skeptical, in fact. I don't believe it will ever work for city streets.

Here's why:

The computer would have to be 99.99999% reliable to do that, and we don't have computers that reliable.

The numbers:

The accident rate is around 74 per 100 million miles (and the fatality rate is 1.13).

It's unclear exactly how to turn that into a percentage, but no matter how you do it it's quite high.

Say an accident takes 5 minutes and people drive 30 miles per hour. Then that works out to 99.999% reliability for humans. If you use the fatality numbers instead, it's 99.99999%.

I.e., 99.99999% of the time, taken as a whole across all [US] humans, people drive in a way that does not cause a fatality.

That's the bar computers have to cross in order to save any lives at all.
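
Reproducing that arithmetic explicitly, under the same assumptions stated above (5 minutes per accident, 30 mph average speed, rates per 100 million miles):

    # Time-based "reliability" of human driving, per the parent's framing.
    minutes_per_100M_miles = 100e6 / 30 * 60   # ~200 million minutes of driving
    for label, rate_per_100M in [("accidents", 74), ("fatalities", 1.13)]:
        bad_minutes = rate_per_100M * 5
        reliability = 1 - bad_minutes / minutes_per_100M_miles
        print(f"{label}: {reliability:.7%}")
    # accidents:  99.9998150%  (about five nines)
    # fatalities: 99.9999972%  (about seven nines)

So the seven-nines bar follows directly from the fatality rate under those assumptions.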


> You know what driverless cars can't do? Redirect a passenger to the emergency room due to acute problems that occur in the car.

You state this as a fact with absolute confidence. That's interesting. Not only do I respectfully disagree, but I'm actually surprised you are so sure of yourself. It seems (to me) rather obvious that self-driving cars are not a mature technology yet. They aren't even deployed widely; in that regard, of course they aren't mature!

So I'd really appreciate it if you could help me understand how you arrived at that conclusion. Being so incredibly confident, you must have a rock-solid set of arguments on this topic, and I'd love to hear them.


> I am not sure if an autonomous car will be safer. I mean, can they even do panic braking right now (for example, if someone or something jumps in front of the car)? You know, safety is often not about having the fastest reaction time, but also about having good anticipation that any decent human driver will develop in a short time... I think any hopes that cars will be better than humans at it are quite naive...

Even if we assume that cars won't be as good at stopping when people jump in front of them, I would bet the number of accidents from something like that is significantly smaller than the number caused by human error: texting while driving, driving drunk, or just not paying attention due to fatigue or whatever else.

Not to mention, how can you really even tell the difference between someone who's walking towards your car but will stop and someone who isn't? Not even humans can do that since we can't read minds.


> This "kind of autonomous driving" is absolutely deadly, by definition. It's never going to work well.

Can you quantify this statement? All of the statistics suggest that it is in the same ballpark of safety as fully manual driving.

> I think it is cognitively harder to use autopilot and pay attention, than to simply drive safely.

Everyone is entitled to their opinion, but don't you think we should look at what the actual data says about autopilot safety (including for the various flavours of autopilot tech)?

