As long as the U.S. personal-injury legal system remains in place (reinforced by media coverage of fatal crashes), it won't be enough for self-driving systems to have a lower aggregate death rate per mile. They will need to match or exceed human performance in every subcategory of driving. That's really hard.

To wit: it's not enough to say, "Our self-driving cars avoid 14,000 drunk-driving, texting, and asleep-at-the-wheel fatalities," if it's also true that "Our self-driving cars hit and kill 30 errant pedestrians a year that a human driver would have noticed."

I know that 14,000 > 30. But the specter of roadway martyrs being murdered by killing machines needs only a few examples to sustain itself.

Tough problem to solve.


They don't need to match human performance in every category; they just need to match it in one category and then detect when they're in that situation.

It will be a gradual transition as they learn how to handle more and more situations.

