
Level 4 self-driving seems the second-least desirable after Level 0: the driver is entirely passive yet still has to maintain full active attention. It's a ripe situation for distraction, something Tesla cars attempt to combat with the "wheel-squeeze" nag.
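For what it's worth, that "wheel-squeeze" nag boils down to a simple attention-enforcement loop. Here's a minimal sketch of the idea in Python; the interval, escalation steps, and function names are my own assumptions for illustration, not Tesla's actual implementation:

```python
import time

# Hypothetical thresholds and escalation steps, for illustration only.
NAG_INTERVAL_S = 30
ESCALATION = ["visual warning", "audible warning", "disengage"]

def steering_torque_detected() -> bool:
    """Stub: a real system would read the steering-wheel torque sensor here."""
    return False

def attention_monitor():
    last_torque = time.monotonic()
    step = 0
    while True:
        if steering_torque_detected():
            last_torque = time.monotonic()
            step = 0                                   # driver responded: reset escalation
        elif time.monotonic() - last_torque > NAG_INTERVAL_S * (step + 1):
            action = ESCALATION[min(step, len(ESCALATION) - 1)]
            print(f"attention enforcement: {action}")
            if action == "disengage":
                break                                  # hand control back to the driver
            step += 1
        time.sleep(0.1)
```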

From a cynical perspective, it's easy to see a future where we constantly approach, but never quite reach, full Level 5 autonomy - or at least not with existing technology. Maybe Level 5 will prove to be computationally intractable - an NP-hard problem, in effect. Marketing teams will likely promote terms like "Level 4.1 self-driving" or "Level 4.5 self-driving" or "Level 4.9 self-driving" as edge cases are slowly resolved.




I worked in the automotive industry for 7 years until late last year, though never directly on autonomous driving.

It is my belief (opinions are solely my own) that level 4 driving is 10 years away at best, and true level 5 will never ever happen with any manufacturer. I think Tesla has done well in the space of demonstrating self-driving, but Musk’s continued promises of full self-driving have gone from eyebrow-raising to eye-rolling.

I just sincerely doubt the magical future we’re hoping for will arrive.


Level 5 self-driving will never be delivered on the models Tesla sells today.

Their cars are attractive, but they will lack the hardware Level 5 requires when it finally arrives, and in any case will most likely be past their end-of-life by then.


I should hope that "all the hardware is there for Level 5" also means it operates as a very good Level 2 until Tesla enables full autonomy - which happens when the software and regulatory approvals are ready - which may take a decade.

If they sell something now that has the hardware for Level 5 but neither the software nor the approval for Level 5, yet weasel-market it as "Autopilot" and let users enable a mode where they can let go of the wheel - then they are just criminal.


We were promised 'Level 5' self-driving cars numerous times last year:

> Tesla 'very close' to level 5 autonomous driving technology - Musk says [0]

> Level 5 Self Driving Cars this year (2020) [1]

Now it is admitted to be Level 2. So it's not 'Full Self-Driving' as promised.

When will the Elon fanatics stop listening to his bullsh*t?

[0] https://www.reuters.com/article/us-tesla-autonomous-idUSKBN2...

[1] https://www.forbes.com/sites/johnkoetsier/2020/07/09/elon-mu...


There are several levels of driving automation, from 0 (warning systems only) to 5 (a completely robotic taxi). But in exec speak it goes: self driving, to truly self driving, to really truly self driving, to seriously self driving, to this time it's self driving for realz, and so on...
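For readers who haven't memorized the scale, the SAE J3016 levels being paraphrased here break down roughly as follows; the one-line summaries in this quick Python sketch are my own paraphrase, not the standard's exact wording:

```python
# Rough paraphrase of the SAE J3016 levels of driving automation.
SAE_LEVELS = {
    0: "No automation beyond momentary warnings/assists (e.g. emergency braking)",
    1: "Driver assistance: steering OR speed control, driver does the rest",
    2: "Partial automation: steering AND speed, driver must supervise constantly",
    3: "Conditional automation: system drives, driver must take over when asked",
    4: "High automation: no driver needed, but only within a defined domain",
    5: "Full automation: drives anywhere a human could, no driver ever needed",
}

for level, summary in SAE_LEVELS.items():
    print(f"Level {level}: {summary}")
```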

Teslas are considered level 2/3, and since he said the car will be able to pick you up in a parking lot... I guess he means level 5!? I'm doubtful they could make such a qualitative jump in one year.


As someone who owns a Tesla, I think a lot of people want Level 3 even if they don't realize it. Obviously if Level 4 were available people would go for it, but perfect is the enemy of good, and even Tesla's L2 technology makes driving significantly easier. Merely staying ready to take over, versus actively controlling the car, is already a huge leap.

Tesla is marketing it as a Level 5 system. They don't say that directly, of course, but they strongly imply it is Level 5, and that is enough.

It doesn’t seem at all guaranteed that Tesla can ever achieve Level 5 autonomy, on any timeline, using the hardware deployed in the Tesla vehicles sold today with “Full Self Driving” (FSD) as the expensive upgrade option.

I doled out ~$5000 for FSD on a Model 3 Performance bought in 2018, which has since had a new computer fitted (at no cost to me). I always considered FSD pretty much a “bet”, and it is not, IMHO, a complete loss, since the HW3 upgrade got me more Level 2 features on a 2018 model-year vehicle - something quite unprecedented in the auto industry.

I suspect Elon may be wrong about LIDAR: for any manufacturer to get to Level 5, it may be required, and it may not be entirely replaceable by cameras and computer vision. LIDAR is not present on any Tesla today and cannot sanely be retrofitted. So, barring what feels like a miracle to me, the “defects per million miles” (e.g. disengagements, plus accidents leading to property damage or non-fatal injury) and the “fatalities per million miles” (where a single fatality was enough to tank other autonomous driving programs) will probably remain too high for the public and regulators to accept that Tesla has attained Level 5 autonomy with their current vehicles and current hardware.
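To make those per-million-mile metrics concrete, here's a tiny sketch of how such a rate would be computed and compared against an acceptance bar; the counts, mileage, and threshold below are placeholders for illustration, not real Tesla or regulator figures:

```python
def events_per_million_miles(event_count: int, total_miles: float) -> float:
    """Normalize a raw event count (disengagements, injuries, fatalities)
    to a per-million-mile rate so fleets of different sizes are comparable."""
    return event_count / (total_miles / 1_000_000)

# Placeholder figures for illustration only.
disengagements = 1_200
fleet_miles = 50_000_000          # 50 million miles driven under automation

rate = events_per_million_miles(disengagements, fleet_miles)
print(f"{rate:.1f} disengagements per million miles")   # -> 24.0

# A regulator or insurer would compare this against some acceptance bar.
ACCEPTABLE_RATE = 1.0
print("acceptable" if rate <= ACCEPTABLE_RATE else "too high")
```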


One thing you can bet on is that virtually every carmaker is over-hyping their technology. Expect "Level 5" tech to be more like Level 4, "Level 4" to be more like Level 3, and so on.

I don't even know how they can claim Level 5 (looking at you, Nvidia) when their systems don't involve real-time training, only inference. I'm talking about machine-learning training, not "it's gathering data as it drives" training. You know what that means: the cars' autonomous systems need to have been trained for the roads on which they are driving, or at least for those conditions, ahead of time, somewhere in the cloud.

But Level 5 is supposed to work on ANY ROAD, ANY TIME. I doubt they have already trained their systems for any road, any time, anywhere in the world. That seems very unlikely to me. In other words, Nvidia, Tesla, and whoever else is promising Level 5 autonomy is lying to you.
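The split being described here - weights get updated offline on fleet data, while the car only ever runs a frozen model forward - looks roughly like the generic sketch below. The class and function names are invented for illustration and don't reflect any vendor's actual pipeline:

```python
import numpy as np

class FrozenPolicy:
    """What ships in the car: fixed weights, forward pass only."""
    def __init__(self, weights: np.ndarray):
        self.weights = weights            # set at the factory or via an OTA update

    def infer(self, sensor_features: np.ndarray) -> np.ndarray:
        # Inference only: no gradients, no weight updates on the vehicle.
        return np.tanh(sensor_features @ self.weights)

def train_in_cloud(fleet_logs: list, labels: list) -> np.ndarray:
    """What happens in the data center: fit new weights on logged fleet data,
    then push them back out. The car never performs this step itself."""
    X = np.vstack(fleet_logs)
    Y = np.vstack(labels)
    # Toy least-squares fit standing in for real large-scale training.
    weights, *_ = np.linalg.lstsq(X, Y, rcond=None)
    return weights
```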

And because of this stupidity and greed, the government will probably have to come up with Level 6 and Level 7 in the future, so that Level 7 ends up meaning what Level 5 should have meant from the beginning. Think about how the US carriers started abusing the "4G" label in the early years. I very much expect the same to happen with self-driving cars, all for the sake of misleading marketing.

I also believe none of these players takes software security that seriously. Some will at least implement best practices but won't go beyond that; most probably won't even do that (Google how stupid carmakers are being about remotely unlocking their doors over the internet, for instance...). Unfortunately, this won't be noticed until maybe 7-10 years after millions of these cars are deployed. And then the hacks will start coming (including terrorist attacks using them).


Hardware performance is not the problem for Level 5 autonomy - the software is. If Tesla insists on deploying full self-driving capability in the next couple of years, they will be litigated out of existence. We are a few decades away from an autopilot that "understands" what it is doing; right now it is just parroting the most common scenarios. That may be as good as or slightly better than the average driver, but it will still result in many deaths if deployed in hundreds of thousands of cars. Unless Tesla somehow shields itself from legal liability, it will be sued into oblivion.

I don't even think Level 5 at the individual-car level is needed. Tesla can operate their fleet the way Amazon does with its warehouse robots.

I keep hearing this, but at the same time other manufacturers have officially rolled out Level 3 and Level 4 autonomy, while Tesla is officially stuck at Level 2 (that's from Tesla, not from me). I won't buy it until Tesla officially rolls out a Level 4 system.

Does anyone believe the currently available Tesla vehicles out on the road will be level-5 "fully autonomous" without major hardware upgrades?

None. And this is exactly the problem: Tesla might have a nice piece of silicon there, but that doesn't change the fact that they are still far, far away from having a working algorithm for Level 5 self-driving. They have a supercomputer in their cars but can only run Minesweeper on it, so to speak.

Their hardware (maybe, at best) solves one of the easy pieces of the self-driving puzzle. It doesn't get them any closer to solving the hard parts. But it sure helps from a marketing perspective, which is kind of important if you want to keep selling a $6000 feature that is essentially just a promise for the future and thus requires buyers to "believe".


I have [said this before](https://news.ycombinator.com/item?id=19446043): Tesla is not on track to deliver Level 5 with their current hardware and apparent software strategy. Piloting a car is a very hard problem, but I would say Tesla is still struggling with the easy parts of it. They should have solved the "never hit any obstacle" problem to superhuman levels by now; I think that is pretty comfortably solvable with present technology and algorithms. Instead, a Tesla feels about as good in that respect as a careful student driver.

The harder parts of machine piloting involve interpreting the world (lane-finding in the absence of striping; distinguishing hard obstacles from ephemeral objects like paper, leaves, weather, reflections, and so on) and participating in a social environment (which driving postures indicate which intents? What do the local humans consider rude or polite? What are they expecting?).

(A truly advanced autopilot would seem almost magical to a human— it would be able to back-solve diffuse and specular reflections in the environment, and accurately deduce the presence of oncoming cars around corners that would be totally invisible to a human driver. It could predict accidents several seconds before they begin to play out, and the sudden braking/maneuvering in the absence of any visible threat would seem strange and confusing to the humans inside).

Tesla is nowhere near cracking these things; they're still on the ground floor. I still predict that current pre-buyers of so-called "full self-driving" are not going to get what they were promised, and going back on that will probably be bad both for Tesla's reputation and their wallet. If I had to guess, I'd say that if Tesla delivers on Level 5, it will be with a hardware platform that doesn't exist yet, and a software platform that is close to "complete re-write".

As it stands, their misleading marketing on self-driving capability is both dangerous and borderline fraudulent.

All that being said, I'm long on self-driving as a technology; just short on Tesla's current approach to it. It's a shame because I suspect both Waymo and Cruise are doing significantly better technologically, but I trust in their ability to execute a product significantly less. I'm not sure who will actually crack it.


Watching recent Tesla FSD beta drives made me understand how stressful/dangerous it is to take a ride in such a vehicle.

If they make a 1000x improvement, it becomes even more dangerous, because it can make you feel safe and unaware of the sword of Damocles hanging over your head.

I hope I will be proven wrong, but I don’t think any of the current approaches will deliver level 5 autonomy.


And it won't come close to Theranos. Tesla makes real products that are class-leading.

Class-leading in what sense(s)?

Even if Tesla can't reach Level 5, it will be damn close.

But that's the problem with self-driving cars. Damn close isn't good enough. A miss is as good as a mile.

The problem with the self-driving/automation scale is that anything around levels 2-4 probably shouldn't be allowed on public roads, at least not yet.

Basic driver aids, where the driver is always fully engaged but the system can help to avoid mistakes, are proven to improve safety. This is what you get at level 1, and such technologies are already widespread in the industry.

If we can ever make a fully autonomous vehicle that can genuinely cope with any driving conditions, so you don't need any driver or controls in the vehicle any more, then obviously this has the potential to beat human drivers. This is level 5. But we don't know how to do this yet, and I have seen absolutely no evidence so far that anyone will know how to do it any time soon either.

In between, we have several variations where a human driver is required for some of the monitoring and control of the vehicle but not all. This has some horrible safety implications, particularly around the transitions between human- and vehicle-controlled modes of operation, and around creating a false sense of security for the human driver. The legal small print will probably say that they must remain fully alert and able to take over immediately at any time, but whether it is within human capability to actually do that effectively is an entirely different question.

> and make driving 10-100x safer than just a human.

I've been driving for more than 25 years, and racked up hundreds of thousands of miles behind the wheel. I've never caused an accident, as far as I'm aware. I've never had a ticket. I try to be courteous to my fellow road users and give a comfortable ride to any passengers I have with me. What, in your opinion, would driving 10-100x safer than mine look like?

Humans certainly aren't perfect drivers and we have plenty of variation in ability. Things can go wrong, and I'm sure we'd all be happy to see fewer tragedies on our roads. But given the vast amounts of travel we undertake and how many of us do drive, autonomous vehicles will need an extremely good record -- far better than they have so far -- to justify the sort of claim you're making here.
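As a back-of-envelope check on what "10-100x safer" would even mean, here's a short sketch. It assumes a human baseline of roughly 1 fatality per 100 million vehicle miles (about the US average) and uses the statistical "rule of three" to estimate how many fatality-free miles would be needed to demonstrate a given rate with ~95% confidence; both the baseline figure and the method are simplifying assumptions on my part:

```python
# Assumed human baseline: ~1 fatality per 100 million vehicle miles.
HUMAN_FATALITY_RATE = 1 / 100_000_000

for factor in (10, 100):
    target_rate = HUMAN_FATALITY_RATE / factor
    # Rule of three: with zero observed events in N miles, the 95% upper
    # bound on the true rate is roughly 3 / N, so N ≈ 3 / target_rate.
    miles_needed = 3 / target_rate
    print(f"{factor}x safer -> below {target_rate:.2e} fatalities/mile, "
          f"needs ~{miles_needed / 1e9:.0f} billion fatality-free miles to demonstrate")
```

Which is one way of putting the parent's point: the record needed to substantiate that kind of claim is measured in billions of miles, not in demo drives.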


With respect to the Bolt, I haven't heard anything about level 4 autonomous driving capabilities. That's a feature I imagine many future Model 3 owners are willing to wait for.

Tesla cars, with the sensor packages they currently have, do not have a path forward to Level 5, despite the marketing claims of Tesla. Claiming that their cars have all the hardware required for Level 5 is a flagrant lie, which they can get away with for now because they've given themselves plenty of outs to avoid ever having to deliver Level 5 capabilities (when they fail they can blame inadequate software or an unsuitable regulatory environment.)

A word on that software: As it currently stands, they are delivering software that enables Level 2 capabilities. This is sometimes called "hands on", as in your hands should remain on the steering wheel and your eyes on the road, ready to take control in an instant. According to Tesla, drivers are to keep their hands on the wheel and pay attention to the road; if the driver fails to do so, they are at fault for any accident. However, Elon Musk contradicts this company policy and has promoted the system as hands-off on national television. Why would he do something so irresponsible? Because misrepresenting the hardware and software capabilities of his cars helps him sell cars. He knows the hype for self-driving cars is at a fever pitch, and stretching the truth helps him profit from that hype.

https://www.businessinsider.com/elon-musk-breaks-tesla-autop...

