
Unfortunately you can't augment a level 5 self-driving car with human resources in an analogous way.



This resonates with me. The kind of company that makes a good human-driven car is so far from the kind of company that can develop a level 5 autonomous system that trying to start up both at the same time would simply multiply an already low chance of success by another low chance of success.

How about developing the world's leading level 5 system and then licensing it to all the auto companies?


Level 4 autonomous vehicles are still years away. Level 1-3 cars can't legitimately be considered "self-driving" in any real sense.

I don't think the company that built Apple maps can build a level 5 autonomous car.

Better technology can't patch human issues unless you're able to remove the human. Self driving cars are still not here.

But in reality we don't even know how far we are from a level 5 self-driving car.

Which can't be fixed before lvl 5 self-driving.

The thing is, I see no way to have full self driving without AGI. And I don't think humanity is anywhere close to developing AGI. Without AGI you can have level 4++ self driving, maybe, but not level 5.

Even if the computers are good enough, they don't have the software to make it work. Realistically they're still many years away from software that could legitimately deliver Level 4 autonomous driving on every road in the US.

No one working on autonomous vehicles is trying to build something where, long term, you need a role like that. There's no benefit to that over just hiring a driver.

It’s funny how unpopular this argument was. There is no way level 5 self-driving works unless the system can handle anomalies beyond what its models are prepared for.

That's the situation right now, and that's why companies have to provide level 5 autonomy or there won't be any customers. Yet level 5 for good-weather days in urban environments might come within a decade, and having to drive only half of the time is still better than having to drive all of the time. The long tail of problems that supposedly plagues self-driving systems exists for humans as well - how would you react if a bridge or building in front of you started collapsing, or if somebody started overtaking right in front of you?

This is a silly argument, because humans are autonomous and cannot drive in all environments safely. Level 5 means matching human performance, so even if we reach level 5 the cars can still get stuck and require a tow.

I am using the word autonomous as defined by the SAE levels, which are described as “levels of autonomy”. You are yak shaving and contorting the words to your liking.

Also, Tesla is level 3, not level 2. It also doesn't require hands on the wheel 100% of the time, although they say it does.


Level 5 doesn't mean the driver can't take over - just that they should never have to. The driver will always have the ability to take manual control; something that won't change within our lifetimes. Even if a company manages to legally release a "level 5" car in the next 50 years, it will really only be "level 4.5" - 100% automation under all conditions isn't in the cards for the very near future.

It astounds me how naive and over-optimistic self-driving car enthusiasts are. The industry is in its infancy, but people talk about it as if our roads had been filled exclusively with self-driving cars for decades and every challenge and bug had been conquered. Surely all the people who talk about these cars as if it were a solved domain can't be prospecting, bandwagon investors.


Well, they wouldn't be the world's best self-driving engineers in that case, would they?

The issue is that building fully autonomous cars (no driver needed at all) is way more difficult than building self-driving cars, which is pretty difficult by itself.

Self-driving cars aren't yet at ordinary human levels of competence.

Humans are a pretty poor baseline.

A fully autonomous driving system that was only as good as a human would not gain much traction and the company would be on the hook for some serious legal damages if it became popular.


Level 5 isn’t only about safety, but also about universality.

At level 5, one expects a self-driving car to handle gravel roads, park in highly temporary parking spots, spot police officers and follow their orders, drive short distances on non-roads (e.g. to drive around a car pile-up), etc.

A car that recognizes those cases, stops, and tells its passenger “please help me out for a few meters” would be a fantastic accomplishment and very, very successful, but wouldn’t qualify as level 5 autonomous.


It is not clear that this particular autonomous driving system is better than a human driver.
