If someone drives around with loose crap in the back of their truck that flies out, the truck driver is at fault, not the self-driving car. This is a danger we already have on the road.
Humans make the wrong call on this sort of thing all the time. They also make stupid passes, yield the right of way at the wrong time, drive the wrong way down one way streets, cut each other off, fall asleep at the wheel, drive drunk, drive without their glasses on, get road rage, etc etc etc.
There is this sort of one-way lens when it comes to self-driving cars. People want to throw up red flags about all the things they might do wrong while ignoring the millions of stupid things that humans do to kill each other with cars every single day.
The problem is, a human driver wouldn't have had any trouble avoiding that accident -- the truck was visible for some distance.
It's not just about the overall fatality rate per mile. Self-driving cars also need to not have any accidents that humans would have easily avoided. Until they demonstrate that, they're not going to be widely accepted.
I can't say I'm a fan of self-driving cars, but these days, with everyone texting and talking on their phones while driving, I'm not a fan of people-driven cars either.
Of course there are going to be mistakes with self-driving cars, and they will kill people. But at some point - and maybe we're already there - self-driving cars are likely to be safer overall than distracted people attempting to drive. If a self-driving car kills or injures someone, it is all over the news as a major problem. In that same news story, they should also tell us how many human-driven cars cause death and injury because of human driver mistakes.
This is what people don't appreciate when quoting those statistics about how self-driving cars are safer than humans: when a human driver causes an accident, it was because that particular person did something wrong. When a self-driving car handles a situation wrongly, that's a big issue, because all the self-driving cars run the same software, so the same mistake will be repeated across the entire fleet.
The problem is that with humans it's fairly clear most of the time who made the error: it's either human 1 or human 2 99% of the time. With self-driving cars it becomes software 1 or software 2, and at that point it's extremely difficult to know where to point the blame.
The comment that humans are sometimes wrong too is absolutely ridiculous. I always think about the self-driving car incident in China where the car mistook a car in an advertisement above the road for a real car and hit the brakes, causing a chain crash. Even the shittiest driver would never do that.
I don't buy the idea that in a world where self-driving cars are safer, we should have 'less safe' human drivers just because we can blame them when they kill someone.
Car manufacturers shouldn't have to show that something is perfectly safe; IMO they should just have to show that it is reasonably safe (e.g. as safe as or safer than a human driver in the same conditions).
There isn't a requirement that someone has to take legal responsibility for every accident that happens in the world, just that people act with reasonable care and attention within the law.
And several people died in human-caused wrecks while you were typing that sentence. Nobody seems too hung up on that. Not scary enough, I guess, and you can't blame one or two large monolithic corporations for it.
Self-driving cars are one of those instances where painstaking adherence to the precautionary principle is going to get a lot of people killed due to poor risk modeling. This article is a great example, where the reporter has engaged in elaborate rhetorical gymnastics to paint an incident arising from a human driver's poor judgement as the fault of a "rogue engineer."
The problem, though, is that as soon as a self-driving car crashes and kills someone, people will declare self-driving cars to be evil and unwanted -- even if they're actually safer than human-driven ones.
Assuming people are ignorant en masse says more about your understanding of people than about the people themselves.
Not all self-driving car accidents make the news; the ones that any human driver could have easily avoided do. Statistically safer something something miles driven means nothing to the family of someone who died in a completely avoidable accident.
Until self-driving cars reach the point of handling emergency situations (storms, ice, obstacles, children, other accidents, sudden lane changes, tire blowouts) better than humans, and stop punting anything less than ideal conditions back to the driver, it's pointless to talk about safety.
People expect self-driving cars to be better than human drivers, not just statistically better. Any accident a self-driving car gets into that most humans would have avoided should be considered a fault of the self-driving system. There shouldn't be wholly new ways to get killed on the highway that we just shrug off with "yeah, but fewer people are dying overall." That's not gonna fly.
If you don't trust the abilities of other people, who do you think is currently driving cars? You are trusting everyone else on the road with your life every time you get in a car.
You ever look at other drivers when you are stuck in traffic? You will see people looking at their phones. You will see people eating. You will see people shaving. You will see people doing their makeup. You will see people reading books. You will see people with dogs on their laps. And that is just the people who might accidentally kill you due to carelessness. Vehicular suicide is more common than most people expect and there is no telling if you might be collateral damage in one of those collisions.
Self driving cars don't have to be perfect to be dramatically safer than human drivers.
Yup. Humans are allowed to make mistakes and fail fatally. If self-driving cars make a single fatal mistake, then watch out for negative press coverage for weeks. Why? Because every failure can be traced back to a root cause. Every root cause (other than something like another driver literally trying to kill you) looks like an incompetent screw-up after peeling back all the layers. In principle, self-driving cars are basically perfectly safe (as a system). So every failure causes righteous fingerpointing.
...which is also partly the motivation to improve.
I expect self-driving cars long-term to be a lot like air travel. It'll have a reputation for being super unsafe, with lots of people paranoid of it, for decades after it has far eclipsed the safety of regular human car driving. The paranoid hyper-focus and ridiculous press coverage is also what will enable it to become essentially perfect (like passenger air travel is today in the developed world).
Likewise, human drivers manage to avoid accidents every single day by paying attention, reading traffic signals, and reacting to situations as they develop.
Computers have to be 100% safe because self-driving cars take control away from the user. And where does the blame lie in an accident? With the owner of the vehicle, the 'operator' at the time of the accident, the 3rd-party developers who were contracted by Tesla/Nissan/BMW, or the car company themselves? Even today, getting customer service and a dealership to admit responsibility for a parts failure is near impossible.
It's fine for people to be pro self-driving cars, but it is wrong to dismiss any and every criticism of an emerging technology on that basis alone.
It's really, really annoying that even smart people parrot "self-driving cars are safer than human ones."
Self-driving cars are orders of magnitude more dangerous than human drivers. It's absurd to say otherwise, and to do so requires a level of stupidity that can only be explained by dogma.
Given how bad humans are at driving cars safely, it is unfortunate that we are holding self-driving cars to a higher standard than we hold humans. It will cause needless loss of life.
I am also a human driver and I feel safer around "dumb" cars. I don't ever intend to ride in a self-driving car, and will do my best to avoid being near one.
Aside from being vulnerable to malicious actors, self-driving cars make "cold-hearted" mistakes that don't make sense; mistakes that humans would never make. Sure, the total number of mistakes they make will be lower than the equivalent number of accidents caused by humans. However, having human life lost to the type of mistakes self-driven cars make is, in my opinion, an unacceptable price to pay.
If I must lose my life to a car-related incident, I want it to be caused by a human, and not by some quirk or edge case in an algorithm or neural net.
If you, as a human, do stupid things behind the wheel you are personally liable for the outcome.
Human drivers are orders of magnitude more capable AND liable than this autopilot joke.
If a machine advertised as SELF DRIVING does stupid things behind the wheel who do you hold accountable? Should this even be allowed when we know that the machine cannot handle all situations?