If Tesla isn't careful with this, drivers of other vehicles will have serious reservations about being anywhere near a Tesla. I have to say, I already do.
I will not stay behind or next to a Tesla if I can avoid it. I'll avoid being in front of one if the distance is such that I cannot react if the thing decides to suddenly accelerate or, while stopping, not brake enough or at all.
In other words, I have no interest in risking my life and that of my family on decisions made by Tesla drivers (engaging driver assist while not paying attention, sleeping, etc.) or by Tesla engineering.
Will this sentiment change? Over time. Sure. If we do the right things. My gut feeling is that a program similar to crash-safety testing will need to be instituted at some point.
A qualified government agency needs to come up with a serious "torture" test for self-driving cars. Cars must pass a required range of scenario-response tests and will be graded on the results of running the full suite. And, of course, the suite needs to evaluate scenario response under various failure modes (sensor damage, impairment, or disablement, and computing-system issues).
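To make that concrete, here is a toy sketch of what a graded scenario suite could look like. Everything in it (the scenario names, failure modes, weights, and grading scheme) is invented for illustration; it is not any real NHTSA or Euro NCAP protocol:

    # Toy sketch only: scenarios, failure modes, weights, and the
    # grading scheme are all invented for illustration.
    from dataclasses import dataclass
    from enum import Enum, auto

    class FailureMode(Enum):
        NONE = auto()              # nominal operation
        CAMERA_OCCLUDED = auto()   # sensor damage/impairment
        RADAR_DISABLED = auto()    # sensor disablement
        COMPUTE_DEGRADED = auto()  # computing-system issues

    @dataclass
    class Scenario:
        name: str
        failure_mode: FailureMode
        weight: float              # contribution to the overall grade

    @dataclass
    class Result:
        scenario: Scenario
        passed: bool

    def grade(results):
        """Weighted pass rate across the whole suite, 0.0 to 1.0."""
        total = sum(r.scenario.weight for r in results)
        earned = sum(r.scenario.weight for r in results if r.passed)
        return earned / total if total else 0.0

    suite = [
        Scenario("child runs into road", FailureMode.NONE, 3.0),
        Scenario("stopped truck across lane", FailureMode.NONE, 2.0),
        Scenario("child runs into road", FailureMode.CAMERA_OCCLUDED, 3.0),
        Scenario("stopped truck across lane", FailureMode.COMPUTE_DEGRADED, 2.0),
    ]

    # Pretend this car only passes when nothing is degraded:
    results = [Result(s, s.failure_mode is FailureMode.NONE) for s in suite]
    print(f"grade: {grade(results):.2f}")  # 0.50 for this toy run

The point of grading rather than a bare pass/fail is that a car which only behaves under nominal conditions should score visibly worse than one that degrades gracefully.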
I am not for greatly expanded government regulation over everything in our lives. However, something like this would, in my opinion, more than justify it. This isn't much different from aircraft and aircraft system certification or medical device testing and licensing.
I don't think the issue is that Tesla's cars are dangerous. The issue people are raising is that Tesla pretends, at least by implication, that its cars can safely drive for you.
Tesla is also not doing any kind of super-special research into self-driving cars. The system their cars use is (afaik) an OEM component (provided by Mobileye) that powers the driver-assist features of many other brands, too.
Instead of actually improving the technology, they have chosen to loosen the safety parameters to make the system seem more capable than it actually is.
The thing I fear now is fearmongering by people who don't drive Teslas, haven't spent time with Autosteer/Autopilot, and ignore that this was a freak accident that probably would have killed the driver regardless of whether Autopilot was engaged.
The Level 2 driving that Tesla is pushing seems like a worst-case scenario to me. Requiring the driver to be awake and alert while not requiring them to actually do anything for long stretches of time is a recipe for disaster.
Neither the driver nor the manufacturer will have clear responsibility when there is an accident. The driver will blame the system for failing, and the manufacturer will blame the driver for not paying sufficient attention. It's lose-lose for everyone: the company, the drivers, the insurance companies, and other people on the road.
The problem with Tesla's "self-driving" is that people assume they have to provide no oversight or input beyond engaging the system. It's a recipe for disaster. Couple that with the rapid acceleration and speeds a Tesla is capable of and you dramatically increase the risk. I have seen people shave, apply makeup, work on laptops, and even sleep while these cars were "driving." Conversely, I have seen people accelerate rapidly and unsafely, carrying the aggressive habits they had in gas cars to a whole new level.
The cars need governors to keep other motorists safe, and I hesitate to say "other motorists," since I suspect most of them found their driver's license as the toy in a Happy Meal.
You're pushing back against the wrong part of my comment. You don't believe Tesla is meeting some sort of bar (re: your reference to Subaru), but regulators have approved Tesla's continued approach to driver awareness (a periodic steering-wheel-torque dead-man's switch rather than required eye tracking). So what evidence do you have that your approach should be required and would be superior? Drowsy and intoxicated people still drive every day, killing innocent people, without any driver-assist system (roughly 100k accidents a year involve drowsy driving, and about 10k people a year are killed by drunk drivers).
You cannot mitigate away personal-responsibility failures with technology band-aids. At some point, the buck must stop with the operator. (The large shop equipment I let others use in my workshop carries big stickers that loudly proclaim, "Not only will this kill you, it will hurt the whole time you are dying.")
We should compare accident rates during testing against the average before having knee-jerk reactions and implementing unnecessary regulations.
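For what it's worth, that comparison is basically a per-mile rate test. Here is a rough sketch in Python; every number in it is a made-up placeholder, and a real analysis would need audited mileage and crash counts matched for road type and driver demographics:

    from math import sqrt

    def rate_per_million(crashes, miles):
        """Crashes per million miles driven."""
        return crashes / (miles / 1_000_000)

    def poisson_ci(crashes, miles, z=1.96):
        """Rough 95% interval for a Poisson rate (normal approximation)."""
        millions = miles / 1_000_000
        rate = crashes / millions
        half = z * sqrt(crashes) / millions
        return rate - half, rate + half

    # Placeholder inputs, NOT real Tesla or national data:
    assisted_crashes, assisted_miles = 120, 400_000_000
    baseline_crashes, baseline_miles = 2_000, 5_000_000_000

    print(rate_per_million(assisted_crashes, assisted_miles))    # 0.30
    print(rate_per_million(baseline_crashes, baseline_miles))    # 0.40
    print(poisson_ci(assisted_crashes, assisted_miles))          # (~0.25, ~0.35)

The biggest trap is confounding: driver-assist miles skew toward highways, which have lower per-mile crash rates, so a naive comparison can flatter the system.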
In the YouTube videos I've seen of customers testing it, the testers seem very responsible. To be honest, I think it's a great idea in general, and I'm usually very impressed with Tesla's progress when watching those unfiltered videos.
It really doesn't matter whether the driver should or should not be disengaging; there are many, many studies categorically proving that "allowing the driver to be mostly relaxed and not required to act, only to demand immediate intervention in dangerous situations" is absolutely, empirically less safe. You can't just whitewash it away with "oh well, it will get better." When? And don't mention a word about Elon's opinion on when; the guy has been promising "this year" every single year for nine years now. More realistic estimates put this a decade, or two, away at the very earliest, and even then I have huge doubts Tesla will be anywhere near it. Their phantom-braking fiasco shows just how horrific Tesla's approach to testing is: throwing multiple releases out into the wild with less than 72 hours between them, for safety-critical features. Anyone who claims those releases were subject to any rigor in testing whatsoever is deluded, and anyone claiming that testing on public roads is somehow acceptable is equally deluded.
I am very, very well aware of exactly what causes traffic fatalities. According to the software at my work, I have personally responded to 378 fatal MVAs as a paramedic. Please don't assume everyone is ignorant of the realities; we are not blindered, able to recognize and respond to only one danger at a time.
Would we try to stop it?
A lot of companies would like to stop it, that’s for sure.
There's no doubt that a computer will match a human, then exceed human capability: 24x7, never tired, never distracted.
For the moment, every driver has to acknowledge that they are responsible, need to keep control, and have to stay vigilant every time they use self-driving. It seems to me that takes precedence over a product brochure or a random comment in a tweet from a few years ago.
Of course this comment is going to be heavily downvoted, like everything positive about Tesla here. I wonder why?
I can guarantee you that if Tesla's auto-driving starts killing people, there will be near-instant regulation. For that reason alone, supporters of auto-driving technology should be aghast at what Tesla is doing.
I see a lot of people arguing from intuition (or from experience in other fields) that what Tesla is doing with its driver-assistance features is dangerous. There are millions of Teslas on the road; shouldn't the first step be to check whether they actually crash more often in practice? If not, then what's the big deal?
Depends on the design of the vehicle. It would be silly to assume that Tesla, which has historically built some of the safest cars in their class, would completely ignore driver safety.
You should not. I have Tesla FSD and without constant monitoring by a very cautious human driver it is a murder machine. The choice is not “human vs. Tesla” it is “human vs. dangerous Tesla mitigated by humans of varying degrees of competence.”
Society has decided how to judge whether I am safe to drive, and I have passed the tests.
I worry that the laws on the books are insufficient to judge whether a computer is safe to drive. If Tesla already has plans to increase safety, I'd like to know how they judge it, where they think they are, where they'd like to improve, etc.
What's wrong with the test? Does it not prove that Tesla's self-driving will happily drive over a child, making zero effort to avoid it, and continue on its merry way?