
SAE level 5 is unlikely to happen, since autonomous systems will be developed and optimized for the common case: a mostly urban environment with reasonably clear weather, plenty of signs, and full mobile and GNSS coverage. The odd cases will simply not be a priority.

Can one take an "autonomous car" up a barely visible, car-wide path (not a real road) that squiggles through a forest up to a cottage?

Can one make an autonomous car understand a free-form textual sign when there are roadworks or an accident?

Drive in a place without marked roads?

There are plenty of edge cases and difficult situations.

Hitzinger says that level 4 might be achievable. I agree: it is conceivable that some day a motorway lane could be reserved for semi-autonomous vehicles. Those vehicles could communicate with each other and be allowed to drive much faster (say, 250 km/h), since the road is mostly straight, computers have faster reaction times, and they can get early warnings from cars ahead in the chain.
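
A rough back-of-the-envelope sketch of why reaction time and early warnings matter at those speeds. The reaction times and deceleration below are illustrative assumptions, not figures from the comment:

    # Back-of-the-envelope stopping distances at motorway speed.
    # Reaction times and deceleration are illustrative assumptions.

    def stopping_distance(speed_kmh: float, reaction_s: float, decel_ms2: float) -> float:
        """Distance covered during the reaction delay plus braking to a stop."""
        v = speed_kmh / 3.6                      # km/h -> m/s
        reaction_dist = v * reaction_s           # constant speed while reacting
        braking_dist = v * v / (2 * decel_ms2)   # v^2 / (2a), uniform deceleration
        return reaction_dist + braking_dist

    speed = 250.0  # km/h, the figure from the comment

    # Assumed: ~1.5 s for an attentive human, ~0.1 s for a computer that was
    # already warned by the car ahead.
    human = stopping_distance(speed, reaction_s=1.5, decel_ms2=7.0)
    machine = stopping_distance(speed, reaction_s=0.1, decel_ms2=7.0)

    print(f"human:   {human:.0f} m")   # roughly 450 m
    print(f"machine: {machine:.0f} m") # roughly 350 m

Most of the distance is braking either way; the bigger gain is that a chain of communicating cars can all start braking at once instead of reacting one after another.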




It doesn't need level 5 autonomy, and level 4 autonomy in managed conditions seems quite reachable. Also, in this situation, it's okay if the car decides that a problem is unsolvable as long as it stops safely - if it encounters weather or road conditions that it can't safely handle, the driver will handle it; if it needs interaction with cargo, police or refueling - the driver will handle it, etc.

Level 5 is so distant because it requires solving many problems that we haven't fully acknowledged yet, but Level 4 is different. Major car companies (I recall a quote from Ford, and probably Volvo as well) have stated that they don't ever intend to produce a level 3 autonomous car; they want to go straight from level 2 driving assistance to level 4, since expecting a driver to monitor the situation at all times and take over quickly (as level 3 requires) isn't realistic from a safety perspective. Many drivers simply won't or can't do it.


Not just that: getting self-driving approved by regulators is a major challenge. Also, to achieve level 5 we need not just algorithm improvements; our physical infrastructure needs to be improved as well. At an intersection, two human drivers wave at each other to decide who goes first. How should the human driver communicate with the machine? What should the machine do when there is a dog standing in the middle of the street? Move on and let a human driver take care of it? Isn't that now a bystander effect? These are just some examples that I'm not sure the textbook definition of level 5 will address.

Going really extreme: in a car accident in a somewhat remote area, the driver might be able to get an injured passenger out if the passenger is stuck. In a fully autonomous car, there is only the passenger. Yes, this is extreme and perhaps only a dramatic scene, but it is one of those edge cases a self-driving car can't cover.


Level 5 isn’t only about safety, but also about universality.

At level 5, one expects a self-driving car to drive on gravel roads, park in highly temporary parking spots, spot police officers and follow their orders, drive short distances on non-roads (e.g. around a car pile-up), etc.

A car that recognizes those cases, stops, and tells its passenger "please help me out for a few meters" would be a fantastic accomplishment and very, very successful, but it wouldn't qualify as level 5 autonomous.


I just recently left the auto industry, and I should lead with the fact that I was never directly involved in the autonomous driving development. That said, I don't think anyone is remotely near level 5, and I'm frankly doubtful that level 4 will ever really ship. I don't doubt that there are some really smart people working on this, but I think this is only ever going to be something that works on clear days with well-marked roads.

We need a level between Level 4 (the vehicle can operate autonomously in almost all conditions) and Level 5 (no manual control): Level 4.5, wherein the vehicle can operate autonomously in almost all conditions and can identify and safely stop in other conditions.

If there's a severe blizzard and my car can't see the road in front of it, and we're in a 4G dead zone and it can't get an accurate enough GPS fix to figure out where it's going, I'd be perfectly happy if it safely pulled over to the side of the road and stopped. Heck, I'd prefer to have it safely pull over rather than trying to continue based on some premise that "this vehicle must be able to navigate under all possible conditions".
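
A minimal sketch of the decision logic that "Level 4.5" implies. The condition names and thresholds here are invented for illustration, not any real vehicle's API:

    from dataclasses import dataclass
    from enum import Enum, auto

    class Action(Enum):
        CONTINUE = auto()
        PULL_OVER_AND_STOP = auto()   # minimal-risk maneuver, then wait/alert the occupants

    @dataclass
    class Conditions:
        # Hypothetical summary of what the vehicle currently knows; illustration only.
        lane_visibility: float       # 0.0 (blizzard, no markings visible) .. 1.0
        localization_error_m: float  # estimated position uncertainty in meters
        has_connectivity: bool       # e.g. cellular data available

    def decide(c: Conditions) -> Action:
        """Continue only while the vehicle is confident it is inside its
        operational design domain; otherwise stop safely instead of guessing."""
        if c.lane_visibility < 0.3:
            return Action.PULL_OVER_AND_STOP
        if c.localization_error_m > 5.0 and not c.has_connectivity:
            return Action.PULL_OVER_AND_STOP
        return Action.CONTINUE

    # The blizzard + 4G dead zone + bad GPS fix case from the comment:
    print(decide(Conditions(lane_visibility=0.1,
                            localization_error_m=30.0,
                            has_connectivity=False)))  # Action.PULL_OVER_AND_STOP

The hard part isn't the branching, of course; it's producing trustworthy estimates of quantities like lane visibility in the first place, and then executing the pull-over safely.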


Autonomous driving systems are classified at various levels of autonomy.

Level 0 is no automation; level 1 is just dumb cruise control; level 2 is radar adaptive cruise control plus lane keeping (which is where most production systems like Tesla Autopilot and GM Supercruise currently are). Level 2 still requires full human supervision: if you engaged it on the road above, it would either fail to engage or you'd crash, and it would be your fault. Level 3 is the same plus the ability to handle some common driving tasks, like changing lanes to pass a slower vehicle.

Level 4 is where it gets really interesting, because it's supposed to handle everything involved in navigating from Point A to Point B. It's supposed to stop itself in the event of encountering something it can't handle, so you could theoretically take a nap while it drove.

However, an important limitation is that Level 4 autonomy is geofenced: it's only allowed in certain areas on certain roads. Also, it can disable itself in certain conditions, like construction or weather that inhibits visibility. Waymo vehicles like these are ostensibly level 4; if you tell them to drive through a back road in the snow, they'll simply refuse to do so. It's only useful in reasonably good conditions in a few big cities.

Level 5 is considered to be Point A to Point B, for any two navigable points, in any conditions that the vehicle can traverse. You could build a Level 5 vehicle without a driver's seat, much less an alert driver. I kind of think this will require something much closer to artificial general intelligence; level 4 is just really difficult conventional programming.
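
For quick reference, the levels described above, paraphrased as a small lookup table (this is the thread's informal summary, not the official SAE J3016 wording):

    # Informal paraphrase of the levels as described above; not official SAE wording.
    SAE_LEVELS = {
        0: "No automation",
        1: "A single assist, e.g. plain cruise control",
        2: "Adaptive cruise + lane keeping; human must supervise at all times",
        3: "Also handles common tasks (e.g. overtaking); human must be ready to take over",
        4: "Full trip autonomy, but only inside a geofenced operational design domain",
        5: "Full trip autonomy on any navigable route, in any conditions the car can traverse",
    }

    for level, summary in SAE_LEVELS.items():
        print(f"Level {level}: {summary}")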


The NHTSA originally designated a 0-4 system for classifying autonomous vehicles. Then the SAE (engineers love adding features) added Level 5. When the NHTSA adopted the SAE system, they reworded the description of Level 5 to what you've quoted. And seriously, it's a meaningless designation. A driverless car that performs as well as an expert human driver in all conditions will take decades, and it may require something approximating sci-fi grade AGI to pull it off. There are situations when driving that require creativity and generalist knowledge that are completely outside the bounds of what AI can do.

There's no point in discussing L5 realistically, nobody is anywhere close to pulling it off.


I'm not involved in autonomous vehicles at all but as an engineer, I'm pretty confident we are a long way away from level 4/5.

Without significantly reworking current infrastructure, the challenges presented by city driving are too fuzzy to guarantee safety. For example, traffic lights can be obscured, out of order, or spoofed; unless they are modified to support AVs, I don't think this is solvable.

Although autonomous vehicles will require a safety driver for a while, interim benefits still exist and services like this will potentially provide the momentum needed to make those infrastructure changes.


Volkswagen doesn't develop self-driving tech. They use Mobileye's solution, like everyone except Tesla, Waymo and GM/Cruise. They aren't really in a position to make that argument.

Level 5 means that the car can drive autonomously > 95% of the time, and it will absolutely be possible. The level 4/5 distinction doesn't really make any sense once the autonomy gets beyond a certain percentage. It doesn't mean that it's level 4 until it hits 100% (which is impossible).

A level 5 car is designed to drive in all conditions, but there are always statistically unlikely corner cases or situations that require high-level decision making, which the car can't handle by itself. A single driver may never hit such a case, and for them the experience is full self-driving.


Level 4 autonomous vehicles are still years away. Level 1 - 3 cars can't legitimately be considered "self-driving" in any real sense.

I'm talking about fully autonomous vehicles (level 4/5).

It's actually debatable whether you ever need level 5, because once you reach level 4 it's probably only a matter of time until most road infrastructure is converted for L4 self-driving vehicles (once you can statistically prove a self-driving road kills 1000x fewer people than one with human drivers).

It's like how we never wondered about self-driving trains, essentially. It just became a fact once we could do remotely supervised L4, and that novelty was short-lived too.


They won't be fully autonomous until we reach level 5, that is what level 5 is by definition. Whether or not a car that isn't fully autonomous but can handle a variety of common situations is completely useless to someone is an individual assessment. I personally look forward to the extra safety.

I think the key thing people need to realize from the SAE definition [1] of the levels is that they represent designs of the system rather than abilities of the system. I could slap a camera on my dashboard, tell the car to go when it sees green pixels in the top half of its field of view and stop when it sees red pixels. Then I could get out of the car and turn it on, and for the 5 seconds it took for that car to kill a pedestrian and crash into a tree, that would be level 5 self-driving.
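
A sketch of that deliberately absurd controller, just to underline the point that the level definitions, read as design intent, don't rule it out (hypothetical code; the color thresholds are arbitrary):

    import numpy as np

    def toy_decision(frame: np.ndarray) -> str:
        """The joke controller from the comment: look only at the top half of an
        RGB frame, 'go' on green-ish pixels, 'stop' on red-ish pixels."""
        top = frame[: frame.shape[0] // 2]
        r = top[..., 0].astype(int)
        g = top[..., 1].astype(int)
        b = top[..., 2].astype(int)
        greenish = np.count_nonzero((g > 120) & (g > r + 40) & (g > b + 40))
        reddish = np.count_nonzero((r > 120) & (r > g + 40) & (r > b + 40))
        if reddish > greenish:
            return "stop"
        return "go" if greenish > 0 else "stop"

    # A frame whose top half is green foliage/sky -> "go", regardless of what's
    # actually on the road below. "Designed" to drive anywhere; lethal in practice.
    fake_frame = np.zeros((480, 640, 3), dtype=np.uint8)
    fake_frame[..., 1] = 200
    print(toy_decision(fake_frame))  # "go"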

So when people talk about a particular company "achieving" level 4 or level 5, I don't know what they mean. Maybe they mean achieving it "safely" which is murky, since any system can crash. Maybe they mean achieving it legally on public roads, in which case, it's a legal achievement (although depending on what regulatory hoops they had to go through, maybe they had to make technical achievements as well).

[1] : https://web.archive.org/web/20161120142825/http://www.sae.or...


Level 5 autonomy is defined by the fact that the car can handle those new and extreme situations. I think what you mention would match level 4 autonomy.

For example, definitions from https://www.techrepublic.com/article/autonomous-driving-leve...:

> Level 4: This is what is meant by "fully autonomous." Level 4 vehicles are "designed to perform all safety-critical driving functions and monitor roadway conditions for an entire trip." However, it's important to note that this is limited to the "operational design domain (ODD)" of the vehicle--meaning it does not cover every driving scenario.

> Level 5: This refers to a fully-autonomous system that expects the vehicle's performance to equal that of a human driver, in every driving scenario--including extreme environments like dirt roads that are unlikely to be navigated by driverless vehicles in the near future.


> Level 5 isn't the only safe level. Level 4 is safe too.

I agree that, by definition, this is necessarily true.

The catch I see is that the same definition is predicated on the vehicle being able to safely end the journey before entering any unsupported situation, without requiring any driver interaction. I'm not aware that we have any known strategy for solving that problem in the general case that would not achieve level 5 anyway.

I acknowledge that in specific situations like geofencing, where a vehicle does effectively operate at level 5 but only under predetermined conditions, that would be level 4 according to the scale. However, it's the ability to operate fully autonomously, albeit within those boundaries, that makes the vehicle safe in this scenario.

So, what happens if external conditions (for example, directions by a police officer, or some sort of road accident or severe weather) mean that the vehicle cannot safely remain within the area where it can operate autonomously? Unlike a vehicle with a human driver, it cannot adapt and safely leave that area either.

In short, unless perhaps we're also going to have a new set of rules and possibly some separated infrastructure for use with level 4 vehicles, I'm not sure they can ever fully match the safety of a human driver without necessarily reaching level 5.


Agreed, level 5 autonomy is an ever-shrinking set of edge cases. Human drivers also have edge cases, like being drunk or otherwise incapacitated by strokes, heart attacks, phone calls, age, tiredness, etc. We still allow them on the road despite this, and despite those things being among the root causes of the many deadly accidents each year. Once autonomous cars are obviously safer than that, they will become the norm. We're not that far off. First we'll see mass deployment of levels 3 & 4 with safety drivers, and when it becomes clear that those are a liability at best, also without. From there to level 5 is a matter of semantics, since we'll basically have vehicles driving themselves most of the time.

Whoever came up with these levels in autonomous driving (L1-L5) must be a marketing genius. Ideally it should have been a yes-or-no answer to "does your car support autonomous driving?", just like alive or dead. Right now, it is just another entry in the list of causes of road accident deaths, until it is perfected.

That's the situation right now, and that's why companies have to provide level 5 autonomy or there won't be any customers. Yet for days with good weather in urban environments, level 5 might come within a decade; it's still better to have to drive only half of the time instead of all of the time. The long tail of problems that supposedly plagues self-driving systems exists with humans as well: how would you react if a bridge or building in front of you started collapsing, or if somebody started overtaking right in front of you?