If that were the case, self-driving cars wouldn't be on the road. I don't think we should aim for perfection; perfection will come. We should be looking for cars that make fewer errors on average than humans. Once you have that, you can start putting cars on the road and use data from the fleet to correct the remaining errors.
> So how do we decide when the automated car is 'good enough'?
This is actually a really interesting point. I don't think people appreciate how far accident rates have dropped for modern cars even without self-driving. Even at a million-cars-per-year sales rate, you would need years of data to prove with high statistical confidence that a single self-driving software+hardware combo is better than humans. Your development cycles would be decades long, like in aviation, if you wanted to be sure you were actually improving.
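To make that concrete, here's a back-of-envelope sketch, assuming a human baseline of roughly 1.1 fatalities per 100 million vehicle-miles (an approximate US figure; the exact number varies by year and country) and using the statistical "rule of three" for a zero-event confidence bound:

```python
# Back-of-envelope: how many fatality-free miles must a self-driving
# fleet log before we can claim, at ~95% confidence, that its fatality
# rate beats the human baseline?
# Assumption: ~1.1 fatalities per 100M vehicle-miles (rough US figure).
human_rate = 1.1 / 100_000_000  # fatalities per mile

# "Rule of three": with zero events observed over n miles, the 95%
# upper confidence bound on the true rate is approximately 3 / n.
# We need that bound to fall below the human baseline: 3 / n < human_rate.
miles_needed = 3 / human_rate
print(f"~{miles_needed / 1e6:.0f} million fatality-free miles")
# prints "~273 million fatality-free miles"
```

And that's the easy case: zero fatalities on one metric. Demonstrating a modest improvement when the system also has occasional fatalities, or re-proving safety after every software update, multiplies the required mileage, which is where the decades-long cycles come from.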
> But must the standard for safety be higher than existing human drivers?
> We're net ahead already with auto-driving.
We don't really have enough data to know that. Really, we haven't even scratched the surface of figuring out if that is true. E.g. we have no idea what kind of emergent phenomena will arise when roads full of self-driving cars interact with each other.
History is littered with examples of engineers thinking they were 90% of the way there when they were actually like 10% of the way there.
>Most of us are going to expect nothing short of perfection from these machines to really trust them.
Now, expecting perfection, and especially giving up 'better than what we have' in exchange for perfection, is a bad thing. But I do think it's completely reasonable to expect self-driving cars to be way better than the average human, even better than the best humans, simply because we lose so many people to auto accidents every year.
I think it's completely reasonable to want to reduce the risks associated with transport, and I think the only politically possible way to do this is with self-driving vehicles. It's not politically feasible to remove the worst half of drivers from the road without offering them similar levels of mobility.
> This is why I am afraid of the hype behind self-driving vehicles.
Well, they don't have to be absolutely correct. Just a lot more correct than human drivers across a broad range of driving scenarios to justify replacing human drivers.
I'm not saying you're wrong because of that. I just wonder how far from "ready" we are, and how much of a gamble manufacturers are taking, and how much risk that presents for not just their customers, but everyone else their customers may drive near.
>> If we can have self-driving vehicles that are twice as safe as the average human, then they should be allowed on the streets.
Absolutely disagree with this approach. Self driving cars can't be merely slightly better than your average human - they need to be completely flawless. Just like an airplane autopilot can't be just a bit better than a human pilot, it needs to be 100% reliable.
> It doesn't need to be perfect. It only needs to be better than humans on average.
Sadly, I don't think this will be true. The problem is that if a person causes an accident, only that driver is at fault. If a self-driving car causes an accident, it follows that all similar cars would have behaved the same way, so they are all considered to be at fault. That will only be mitigated if they drive better than humans.
> It seems to me that self-driving cars can't be just as safe as the status quo, but have to be far far better. It's an unreasonable need, but human nature.
They need to be safer than me, not the average driver. I suspect the lower 50% of drivers cause > 50% of the collisions.
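That skew matters for what "average" even means. A toy simulation (the lognormal risk parameters below are made up purely for illustration) shows how a long-tailed risk distribution pushes the mean well above the median, so the worst half of drivers accounts for far more than half of collisions:

```python
import random

random.seed(0)

# Illustrative only: model each driver's collision rate as a draw from
# a lognormal distribution (heavy right tail of risky drivers).
# The parameters (mu=0, sigma=1.0) are arbitrary, not fitted to data.
rates = [random.lognormvariate(0, 1.0) for _ in range(100_000)]
rates.sort()

mean_rate = sum(rates) / len(rates)
median_rate = rates[len(rates) // 2]

# Share of total collisions attributable to the riskier half of drivers.
worst_half_share = sum(rates[len(rates) // 2:]) / sum(rates)

print(f"mean/median risk ratio: {mean_rate / median_rate:.2f}")
print(f"worst 50% of drivers cause {worst_half_share:.0%} of collisions")
```

Under these made-up parameters the mean driver is noticeably riskier than the median one, so a car that merely matches the "average" (mean) collision rate would still be worse than the typical driver.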
I've had conversations where people told me that self-driving cars will need to be 100% perfect before they should be used. Ironically, one of those people was an ex-gf of mine who caused two car accidents because she was putting on makeup while driving.
Anyway, based on Google's extensive test results, I'm pretty sure self-driving cars are already advanced enough to be less of a risk than humans. Right now, the sensors seem to be the limiting factor.
> Fully self driving cars are far further off in the future than most in the sector would publicly admit.
Full self-driving has NOT been solved even for highways in a desert.
This is literally the easiest real-world scenario imaginable: a highly and strictly regulated environment, with no complications like snow. I don't think there is any hope we will get anywhere close to full self-driving with current technology.
Is the sudden object in front of me a block of concrete that fell onto the road, or a stray newspaper page floating by on a gust of wind? Hard brake, or ignore it?
We can't possibly capture all the edge cases of real life. You need systems with cognitive capabilities to deal with real life.
Or, alternatively, a different type of transportation that is not so complex to navigate and understand. There are already fully automated metro lines. Maybe the solution is to change the problem itself.
The problem is that will not be the expectation. It will need to be perfect or near perfect.
Full self-driving cars can drive a million miles with no accident, but if on mile 1,000,001 there is an accident with a death, it is all over the news, with people proclaiming the technology terrible, unsafe, and not ready. In those same 1,000,001 miles, humans have caused far more damage and death, but that is just "normal", so...
This is key: with human drivers there's an expectation, and some wiggle room, built around predictable human failure. Humans fuck up predictably, and experienced drivers (usually) know how to avoid getting into incidents when this happens.
Self-driving cars are weird to drive around. They will absolutely stop in situations where no human would think to stop. I think about this as a motorcycle rider: what if I'm committed to a corner I can't see around, and the software on a self-driving car ahead decides it should just stop in the middle of the road after the apex? A human driver could do this too, but many will know this is a dangerous place to stop and will try to put the car on the shoulder, or at least minimize the time it's stuck there.
I don't know if this is something where we need to tolerate a temporarily increased incident rate as people get used to self-driving cars being on the road, or if we need to make the software drive more like humans (which may mean making its behavior sloppier than it can actually handle, so that the software's faster reaction time doesn't cause humans with slower reaction times to slam into it).
> I agree that self-driving has been overhyped over the past few years. The problem is harder than many people realize.
The current road infrastructure (markings, signs) was designed for humans. Once it has been modernized to better aid self-driving systems, we probably won't need "perfect" AI.
> I feel self driving cars need to do everything humans do and then 10x better to succeed
I hope that's not the case. Even if self-driving cars are only as good as the median human driver, a universal rollout of that tech would save countless lives. I really hope that when the first self-driving fatality happens the media or public fear doesn't blow it out of proportion.