
Lots of people are arguing about 'when full self-driving cars will happen' without giving a precise definition (and therefore without having to think too hard about the problem).

The article describes Level 5 as 'full computer control of the vehicle with zero limitations' but what exactly does that mean?

(a) Does it mean a vehicle available for sale to the general public that can drive at night when it is snowing on a new road that has not first been mapped by a human?

or

(b) Does it mean that there is at least 1 city of a million residents where at least 50% of vehicle journeys did not require any occupant to have a driver's licence?

To me (and Volkswagen) it's far more useful to forecast when the second criterion will be met than the first.




Volkswagen doesn't develop self-driving tech. They use Mobileye's solution like everyone except Tesla, Waymo and GM/Cruise. They aren't really in a position to make that argument.

Level 5 means that the car can drive autonomously > 95% of the time, and it will absolutely be possible. The Level 4/5 distinction doesn't really make any sense once the autonomy gets beyond a certain percentage. It doesn't mean that it's level 4 until it hits 100% (which is impossible).

A Level 5 car is designed to drive in all conditions, but there are always statistically unlikely corner cases or situations that require high-level decision making, which the car can't handle by itself. A single driver may never hit such a case, and for them the experience is full self-driving.


There seems to be an implicit assumption here that Level 5 = human = 100% of drivers. I honestly don't think that's the case at all. If I am being charitable, I'd say half of drivers would meet the implicit level 5 criteria discussed in these threads.

For example, there's a snowstorm out here today. Unless they really need to, people aren't going out to display their incredible skill at navigating through snow squalls with centimeters of snow on the ground. They just stay home.

What will determine the success of self-driving cars is not philosophical musings but their usefulness in day-to-day life. And if you can spend 10k on a system that'll work most of the time but refuses to go out in snow squalls, it'll sell very well. I'd buy it.


Level 5 doesn't mean the driver can't take over - just that they should never have to. The driver will always have the ability to take manual control; something that won't change within our lifetimes. Even if a company manages to legally release a "level 5" car in the next 50 years, it will really only be "level 4.5" - 100% automation under all conditions isn't in the cards for the very near future.

It astounds me how naive and over-optimistic enthusiasts of self-driving cars can be. The industry is in its infancy, but people talk about it like our roads have been exclusively filled with these cars for decades and every challenge and bug has been conquered. Surely everyone who talks about these cars like it's a solved domain can't just be a prospecting, bandwagon investor.


They won't be fully autonomous until we reach level 5, that is what level 5 is by definition. Whether or not a car that isn't fully autonomous but can handle a variety of common situations is completely useless to someone is an individual assessment. I personally look forward to the extra safety.

That's the problem with all of the fatuous interpretations floating around of "level 5" self-driving.

"It has to be able to handle any possible conceivable scenario without human assistance" so people ask things like "will a self-driving car be able to change its own tyre in case of a flat" and "will a self-driving car be able to defend the Earth from an extraterrestrial invasion in order to get to its destination".

They need to update the official definition of level 5 to "must be able to handle any situation that an average human driver could reasonably handle without getting out of the vehicle."

(Although the "level 1" - "level 5" scale is a terrible way to describe autonomous vehicles in any case and needs to be replaced with a measure of how long it's safe for the vehicle to operate without human supervision.)


That's the situation right now, and that's why companies have to provide level 5 autonomy or there won't be any customers. Yet for days with good weather in urban environments, level 5 might come within a decade; having to drive only half of the time is still better than having to drive all of the time. The long tail of problems that supposedly plagues self-driving systems exists with human drivers as well - how would you react if a bridge or building in front of you started collapsing, or somebody started overtaking right in front of you?

Level 5 autonomy is defined by the fact that the car can handle those new and extreme situations. I think what you mention would match level 4 autonomy.

For example, definitions from https://www.techrepublic.com/article/autonomous-driving-leve...:

> Level 4: This is what is meant by "fully autonomous." Level 4 vehicles are "designed to perform all safety-critical driving functions and monitor roadway conditions for an entire trip." However, it's important to note that this is limited to the "operational design domain (ODD)" of the vehicle--meaning it does not cover every driving scenario.

> Level 5: This refers to a fully-autonomous system that expects the vehicle's performance to equal that of a human driver, in every driving scenario--including extreme environments like dirt roads that are unlikely to be navigated by driverless vehicles in the near future.


Level 5 isn’t only about safety, but also about universality.

At level 5, one expects a self-driving car to ride gravel roads, park in highly temporary parking spots, spot police officers and follow their orders, drive short distances on non-roads (e.g. to drive around a car pile-up), etc.

A car that recognizes those cases, stops, and tells its passenger “please help me out for a few meters” would be a fantastic accomplishment and very, very successful, but wouldn’t qualify as level 5 autonomous.


It very much depends on what exactly we mean by 'self-driving car'. The Wikipedia description of the levels of autonomy doesn't help much. Level 5, steering wheel optional, gives the example of a self-driving taxi, but we pretty much already have those in highly controlled environments. Assumptions about the situations in which such a car would be expected to be autonomous are critical to clear communication and analysis of this issue.

An autonomous-car expert saying we are 30 to 50 years from fully autonomous vehicles doesn't by itself mean much when we already have fully autonomous vehicles on some roads, for some tasks[0]. Equally, Musk wasn't saying we would have level 5 driving capability in all situations by 2020; he was just talking about highway cruising. The exact parameters of the driving situations we are talking about make all the difference. Both Musk and the autonomous-vehicle expert can be right, depending on the context.

[0] https://www.reuters.com/article/us-einride-autonomous-sweden...


I think the key thing people need to realize from the SAE definition [1] of the levels is that they represent designs of the system rather than abilities of the system. I could slap a camera on my dashboard, tell the car to go when it sees green pixels in the top half of its field of view and stop when it sees red pixels. Then I could get out of the car and turn it on, and for the 5 seconds it took for that car to kill a pedestrian and crash into a tree, that would be level 5 self-driving.
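To make the point concrete, that deliberately absurd controller could be sketched in a few lines (everything here is invented for illustration): it is "designed" to operate with no human in the loop, which is all the level describes; it says nothing about competence.

```python
import numpy as np

def naive_controller(frame: np.ndarray) -> str:
    """Decide 'go' or 'stop' from the top half of an RGB frame.

    A parody of a 'level 5 by design' system: no human in the
    loop, no restrictions on operating conditions, and no safety.
    """
    top = frame[: frame.shape[0] // 2]  # only look at the top half
    # Count pixels that are strongly green vs. strongly red.
    green = np.count_nonzero((top[..., 1] > 128) & (top[..., 0] < 100))
    red = np.count_nonzero((top[..., 0] > 128) & (top[..., 1] < 100))
    return "go" if green > red else "stop"

# A frame whose top half is painted green: the "car" drives.
frame = np.zeros((100, 100, 3), dtype=np.uint8)
frame[:50, :, 1] = 200  # green channel of the top half
print(naive_controller(frame))  # go
```

The design meets the letter of "no driver required"; whether the tree agrees is a separate question, which is exactly the gap between design intent and capability.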

So when people talk about a particular company "achieving" level 4 or level 5, I don't know what they mean. Maybe they mean achieving it "safely" which is murky, since any system can crash. Maybe they mean achieving it legally on public roads, in which case, it's a legal achievement (although depending on what regulatory hoops they had to go through, maybe they had to make technical achievements as well).

[1] : https://web.archive.org/web/20161120142825/http://www.sae.or...


No, you said that I changed the “actual definition” of self-driving from Level 5 to something else. I’m asking what makes Level 5 the “actual definition” and not just an arbitrary point that you’ve personally decided means self-driving?

I don't agree with the categorization of “full self-driving” as "LEVEL 2: EXAGGERATED CLAIMS".

There is no proof that self driving cars will ever be able to navigate in e.g. European cities with narrow streets and without lines on them.

Full self-driving would mean driving autonomously everywhere, and there is a big difference between a highway with visible signs and lane lines and streets without any markings.

It's more like "LEVEL 3: UTOPIAN FUTURES", because the potential is there but no current system can handle difficult situations on streets without clear markings.


It's actually debatable that you ever need level 5, because once you reach level 4 it's probably only a matter of time until most road infrastructure is converted for L4 self-driving vehicles (once you can statistically prove a self-driving road kills 1000x fewer people than one with human drivers).

We never really wondered about self-driving trains, essentially. It just became a fact once we could do remotely supervised L4, and that novelty was short-lived too.


I'm talking about fully autonomous vehicles (level 4/5).

Part of the problem is that I'm not sure the defined levels are a particularly good framework for thinking about the level of automation going forward.

-- What's a sufficient time margin for someone to take control?

-- How commonly might someone need to take control?

-- Can someone take control?

-- How broad is the use case?
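Those axes could be captured more directly than a single 1-5 number. A hypothetical multi-axis profile (all field names invented here, purely as a sketch of the idea) might look like:

```python
from dataclasses import dataclass

@dataclass
class AutonomyProfile:
    """Hypothetical multi-axis alternative to the 1-5 levels."""
    takeover_margin_s: float           # warning time before a human must take control
    interventions_per_1000_km: float   # how often a human needs to take control
    human_can_override: bool           # can an occupant take control at all?
    odd_description: str               # breadth of the operational design domain

# A plausible-looking profile for today's highway driver-assist systems
# (the numbers are illustrative, not measured).
highway_assist = AutonomyProfile(
    takeover_margin_s=2.0,
    interventions_per_1000_km=5.0,
    human_can_override=True,
    odd_description="divided highways, daylight, clear weather",
)
print(highway_assist.odd_description)
```

Two systems with the same SAE level can differ wildly on every one of these axes, which is the complaint the questions above are getting at.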

It's pretty clear that we're pretty much at the point where a car can tool down at least certain highways in good weather with a (theoretically) attentive driver ready to take over if needed.

What's less clear is what the intermediate stages between that and "don't need a steering wheel in any weather on a maintained road and can park in an unmarked spot" look like.


The NHTSA originally designated a 0-4 system for classifying autonomous vehicles. Then the SAE (engineers love adding features) added Level 5. When the NHTSA adopted the SAE system, they reworded the description of Level 5 to what you've quoted. And seriously, it's a meaningless designation. Building a driverless car that performs as well as an expert human driver in all conditions will take decades; it may require something approximating sci-fi-grade AGI to pull it off. There are situations when driving that require creativity and generalist knowledge completely outside the bounds of what AI can do.

There's no point in discussing L5 realistically, nobody is anywhere close to pulling it off.


It doesn't need level 5 autonomy, and level 4 autonomy in managed conditions seems quite reachable. Also, in this situation, it's okay if the car decides that a problem is unsolvable as long as it stops safely - if it encounters weather or road conditions that it can't safely handle, the driver will handle it; if it needs interaction with cargo, police or refueling - the driver will handle it, etc.

Level 5 is so distant because it requires solving many problems that we haven't fully acknowledged yet, but Level 4 is different. Major car companies (I recall a quote from Ford, and probably Volvo as well) have stated that they don't ever intend to produce a level 3 autonomous car - they want to go straight from level 2 driving assistance to level 4, since expecting a driver to monitor the situation all the time and be able to quickly take over (as level 3 requires) isn't realistic from a safety perspective; many drivers simply won't or can't do it.


If it comes down to ambiguity, it will be on the minutiae of what "Level 5" (full autonomy / no human in the loop) actually means in practice. Because in the world we live in, all these vehicles will be networked regardless. The marginal cost of having a human available for whatever fallbacks are needed approaches zero, which means that even if L5 were technically achievable there's little value in actually deploying anything like that.

But for sure count me with Carmack. My car is already driving me around my western suburban environment without trouble; zero-intervention drives are the rule now, not the exception. And it's been a long, long time since I've had to intervene for anything reasonably interpretable as a "safety" concern (mostly it's about stuff like navigation decisions, or the car is being a jerk at a merge and I don't want to be honked at, or it's being too slow pulling into an intersection and there's a lot of traffic, etc...).

Don't be fooled by the memery regarding Tesla. There are 60k+ FSD beta cars on the roads now, all of them with eager drivers wanting to post interesting failures to youtube. If these things were genuinely having trouble, we'd know.

It's not "done" for sure (in particular a lot of tuning needs to be done still), but it's at an effective level 3 already at least in my areas. And some work with remote recovery interfaces (something that Tesla doesn't seem to be focusing on) would get it to L4 in the kind of constrained environments Waymo and Cruise are operating in reasonably quickly, IMHO.


But in reality we don't even know how far we are from a level 5 self-driving car.
