What they were producing 8 years ago and what they produce nowadays is an order of magnitude different. It’s like comparing your PC from 8 years ago to today’s PCs.
True, but a few years ago they had a huge problem with Model 3 and Model Y quality: you had to be very careful and inspect the whole car for serious problems before accepting delivery.
lol, if you look at any Tesla forum, this is still very much the case with every car they make. The refreshed Model X seems to be hot garbage from a quality-control standpoint.
Except it's not. If there were no defects, there would be no defects reported on the forums. The number of cars sold is public.
The parent isn't suggesting looking at the ratio of happy to sad posters, which would definitely be selection bias; they are saying to look at the rate and severity of complaints on its own.
I'm not sure that the differences are that huge. AP is more capable, but also has more phantom braking issues.
The cars themselves have more range and charge faster, but it's not an order of magnitude. Maybe charge times on a road trip have dropped by ~50%.
Service has generally gotten worse as well, as the company has really struggled to scale. They've added a lot of mobile service and service centers, but they still can't keep up with the pace of growth. The good news is that most of the newer cars don't need much service, but when they do, it can be problematic.
Tesla ranked 27th out of 28 car companies on Consumer Reports 2021 reliability rankings. They also ranked 30th out of 33 in JD Power's measurement of how many first-year issues new cars had.
I can see that "Autopilot" is arguably a marketing term. I don't like it, I feel like it's irresponsible, but I can see the argument that it's not false advertising.
It's hard to see ambiguity about "Full Self Driving". What's next, will they argue that "The Car Drives Itself, You Can Take a Nap" isn't meant to be taken literally?
“Full self driving” isn’t just ambiguous, it’s fraudulent.
The car can in no way fully ‘drive’ itself unless you define ‘driving’ as not including many things that are universally thought of as necessary skills to operate a car.
[edit: rereading your comment, I see we’re actually in agreement!]
On Tesla's website under the Autopilot page, there is a video that opens up with "The car is doing all the driving. The driver is just there for legal purposes"
The absurdity of that point becomes obvious when you look at it from a business perspective: Tesla’s marketing department wants that there because it sounds better than what their competitors offer. They wouldn’t make it so prominent if they didn’t want you to take exactly that impression.
Buying FSD now is really just buying a promise that FSD will be delivered in the future.
Whether Tesla will actually achieve FSD is debatable. Personally, I think they will. Yes, even with vision-only. It's going to be several more years, but I think they'll get there eventually.
But what do you call it when you're selling that feature to customers today?
Especially if it’s tied to the vehicle. How many people who paid for FSD originally will have sold their car or had it reach the end of its service life by the time anything like the original promise is delivered?
Who cares what they call it? If you buy a software package thinking it's FSD in 2022 with zero research, that's on you. I feel like half the downvoters in this chain are fighting the good fight for .00001% of Tesla owners, who happen to be some of the dumbest consumers on planet earth.
Do you think all Tesla buyers are skilled software engineers or ML researchers? I guarantee you that a lot of people believed Tesla's, and especially Elon's, claims that this was right around the corner, and don't have the experience to understand that multiple fundamental improvements are needed before this will actually deliver. I know multiple people who are quite smart but didn't have the experience to understand that it's not like, say, buying a game at launch and expecting it to be patched in a few months; it's more like a bunch of people doing PhD-level research which likely won't ship within the life of their car.
Keep going: you do that, and you find a Tesla fanboy’s reaction saying the problems are overblown and will be fixed soon with a software update. Who’s right?
Being able to answer that confidently requires some understanding of the problem domain which not everyone has. This is especially true for these big claims where people might think that a large company wouldn’t rationally make claims like this which would expose them to liability unless it really was almost done, not realizing the degree to which Elon will in fact take that risk just to cast himself as the hero of a 90s sci-fi novel’s back story.
> Being able to answer that confidently requires some understanding of the problem domain which not everyone has
Not really.
All it really takes is a brief look at history to determine that Elon Musk has been saying FSD is less than 6 months away for well over 5 years. Some basic pattern matching skill should instill the idea that FSD is not actually coming anytime soon, and that it's all marketing bullshit.
Spend time on a Tesla owner's Facebook group and you'll see that yes, there are definitely Tesla owners that think that. They buy a Tesla with FSD and then get upset when they find out that FSD isn't actually FSD.
But can you really get upset when people ignore the numerous warnings from the car, which beeps at you if you take your hands off the wheel or look away from the road for too long? I find I drive better with it, because the car literally forces me to pay attention.
I have no issue with the term autopilot. Boeing sells planes with autopilot, no one expects them to be fully autonomous.
The FSD name makes more sense for the software they've got in public beta than their current offering. Interestingly the only difference between "enhanced autopilot" and FSD right now is the ability to stop at stop lights.
Boeing doesn't sell to consumers, their customers are large companies. There's quite a difference in the level of knowledge/competence to be expected between those two markets. So "this is fine in B2B" does not automatically translate into "this is fine when targeting John Q Public".
No, but they actually should. And every company either already has an equivalent or is close to one, so it makes sense to train drivers on what will soon be ubiquitous car software. Remember, the goal here is to make driving safer, and there's no evidence Tesla's system isn't doing that. People crash and die without Autopilot every single day.
I think you are being dismissive of the claim, as if to say: who would ever think "autopilot" means everything is done for you?
Comparing a pilot to a regular car driver is completely beside the point. You are basically throwing out all the training pilots have to do just to make your point.
Words have context and get interpreted by the person reading or hearing them. Advertising is only misleading if people get misled, which is subjective and involves judgment calls both by the reader and by the legal system deciding whether people were misled.
Pilots use technical terms that they learn both in training and in the culture around pilots. Words used by pilots in that context can have a different meaning than the literal interpretation of those words by non-pilots outside that context. The jargon is different, and thus advertising aimed at pilots using that jargon will be interpreted in that context. The word "autopilot" directed at pilots will be interpreted accordingly, meaning they will expect the technology to work as training and culture have led them to expect.
Driver training is very different from pilot training, and the jargon and lingo differ from pilots'. A car with an autopilot limited by the same restrictions as an airplane autopilot, i.e., one that only works at the airport or while high in the air, would likely not be what the driver expected. The same word, but a different context.
I don't have a problem with any of the terms personally, because I feel I have a good understanding of what it is actually capable of.
What I think is more important is understanding what the general population of buyers of these cars think it means. Whether they are wrong or right, what they think it means affects how they drive with Autopilot on, and whether they are looking ahead or at their phone.
The only real way to know this would be to measure it, and I doubt that's something Tesla wants to know.
As a side note, a Boeing pilot will spend a great deal of time looking at charts or their company laptop once they are at cruise with the autopilot on.
Before Tesla came out with Autopilot, Google had done extensive user research on "autopilot" cars and found that no matter how many times you tell users they must be fully attentive, people WILL go on their phones or stop paying attention. It's the entire reason Google didn't go to market with their technology before Tesla did.
Then Musk came along and completely ignored the research (which he must have been well aware of, considering Google was in negotiations to buy Tesla) and even took the advertising a step further. He KNEW people wouldn't pay attention but sold the cars with that marketing anyway. IMO the blood of the 15 people who have died due to Autopilot is directly on his hands, as well as the thousands of other injuries from Autopilot accidents.
Teslas still have problems. They accounted for 273 accidents involving driver-assistance systems in 2021, 70% of all such accidents.
I really hope FSD isn't heading down the path of setting a dangerous precedent that tech advancements must be perfect before society utilizes them. We have a technology that we know will reduce death and injury and likely by a drastic margin, but we're not utilizing it because it isn't perfect yet?
Those statistics are for Autopilot only. They hide FSD data.
That data is also super misleading, as they compare Autopilot highway crash rates with statistics for an average car (which is older and has fewer safety features than newer cars) that include city driving (which is much less safe than highway driving). It's designed to advertise Autopilot as safer than it really is. In reality, it's about as safe as any new car with standard safety features (AEB, collision avoidance, lane keeping, etc.).
I have to drive with Teslas every day on my way to work.
Teslas drive like drunk drivers, whether using AP or FSD.
Tesla fans might feel like they are safer, but from the perspective of someone else on the road, they are objectively less safe, unless you consider swerving, randomly slamming on the brakes, and tailgating to be examples of safe driving.
And Tesla's AP system has more recorded fatalities than all of their competitors, combined. By the numbers, NHTSA data shows that Teslas are 50x more likely to get into an accident using AP or FSD than a Ford or Toyota; and the nearest legacy automaker is still 1/3rd as likely to get into an accident as a Tesla using AP/FSD.
Is that an AP/FSD thing? I ask only because just this weekend there was a Tesla on my rear bumper in a 30 mph zone, and I had the cruise set to 34 mph. I did wonder whether it was the computer, or if the driver was just an ass. Regardless, after a number of miles I grew tired of it; let's give 'em a healthy brake tap. Just enough to get the hint, I don't want anyone spilling coffee on themselves. So I gave the brakes a good jab and immediately backed off.
Next thing I know, I hear the tire chirp of ABS kicking in, and I look in the rearview to see the Tesla stopped in the middle of the road. Oops, sorry (not sorry?), not exactly what I was going for. OTOH, maybe you or the car will quit riding people's asses now. And I'll admit I got a good chuckle out of it.
We can't know; Tesla is secretive about the real data.
From what we see in videos where people are testing FSD, the human had to save the day many times in 30 minutes. So for just that one car, you would have had one death if the human had not been there.
If we had the number of times the driver would have been hurt had he not taken control, we could better guess how many kilometers a Tesla would self-drive before it kills or hurts you.
Tesla isn't hiding data; they simply only have Autopilot data on highways, and that data suggests it's safer than human drivers on highways. Which to me suggests we should be putting social effort behind making autopilot (on the highway, at least) the norm.
The issue is that you have fanboys who are idiots, and they will conclude this:
Human + Autopilot > Human => (implies) AI is better than a human
FSD > Autopilot
Conclusion: FSD is better than humans.
Nobody said that installing a good driver assistant is less safe; the issue is with idiots using these stats to imply the AI itself is better than human drivers, which is a bad conclusion to draw from them.
To prove FSD is better than humans, send a Tesla from A to B and note how many kilometers it goes until it screws up (see the sketch below). Repeat 1000 times, then do the same experiment with 1000 humans. Don't forget to repeat after each update.
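Something like this, as a minimal sketch in Python with entirely made-up failure rates (the simulate_runs helper and both mean-distance numbers are assumptions for illustration, not real data):

    # Hypothetical experiment: km driven before a safety-critical
    # event, over N = 1000 runs per condition. All numbers invented.
    import random

    random.seed(0)

    def simulate_runs(mean_km_between_failures, n=1000):
        # Distance between rare failures is commonly modeled as
        # exponentially distributed.
        return [random.expovariate(1 / mean_km_between_failures)
                for _ in range(n)]

    fsd_runs = simulate_runs(50)         # assumed: FSD fails every ~50 km
    human_runs = simulate_runs(500_000)  # assumed: humans, every ~500,000 km

    print(f"FSD:   ~{sum(fsd_runs) / len(fsd_runs):,.0f} km between failures")
    print(f"Human: ~{sum(human_runs) / len(human_runs):,.0f} km between failures")

Whichever mean distance is larger after enough runs wins; the point is the protocol, not the invented numbers.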
They already have a safety score you have to maintain in order to qualify for FSD beta, so this seems like an imagined non-issue. Simply don't allow FSD for people who drive recklessly.
You seem to be one of the people who don't understand the statistics: how can you answer whether FSD is safe if the only data comes from FSD being babysat by a human with a high safety score?
Go to the Tesla subreddit, where all the hardcore fans are, and read about this topic; you will see many examples of users having issues with this feature. FSD is not safe: it will kill you if you don't pay attention, and it might kill some innocent person because some random dude is alpha-testing updates, maybe while live-streaming it for fun.
I REPEAT: Tesla is not sharing the data, so if someone tells you that the AI alone is better than a human, they are lying; there is no data to show this.
I would bet Autopilot/FSD is unsafe alone (without the human), and the issue then arises because of the marketing: someone rents a Tesla, say, believes Elon, doesn't pay attention, and kills someone. I've seen even Tesla fanboys acknowledge that FSD is not safe, and I just read a comment where one such fanboy had to explain to a relative that FSD is a scam and not to buy it. There's also a post on the Tesla subreddit right now from someone who doesn't trust a relative to drive his car with Autopilot, because that person fails to pay attention when using the AI; the poster was asking how to make it impossible for the relative who shares the car to use it.
No, they are. They won't break out FSD data, and they 100% could.
> their autopilot data suggests it's safer than human pilots on highways
Their autopilot data compared to every single type of car, with every single type of driver, only on highways.
So we're comparing every 1994 Corolla with a teenage driver to every 2021 EV with a 30 yr old driver.
The reality is AP should really be compared to modern cars, which tend to come with LKAS and AEB, and the stats should be adjusted to reflect the differing driver populations.
But Tesla would rather throw up comparisons that make them look good, which only pass muster if you don't think about them too much.
> Their autopilot data compared to every single type of car, with every single type of driver, only on highways.
Is there new data I missed? Last I was aware, Tesla had published data comparing Autopilot to human drivers, but not adjusting for road type, weather, road conditions, construction, and any number of other factors in addition to the ones you mentioned.
So basically Tesla determined that Autopilot on easy-mode driving (highway, conditions clear enough for Autopilot to be willing to operate), with a human backup, has fewer crashes per mile than humans across the full range of conditions (including those where Autopilot will opt out), with no backup.
If that's not an example of "lies, damn lies, and statistics" I don't know what is.
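To make the skew concrete, here's a back-of-the-envelope sketch; every rate below is invented purely to show the direction of the bias, not measured:

    # Invented per-mile crash rates, for illustration only.
    # Highways are much safer per mile than city streets, so a system
    # that only runs on highways inherits that advantage for free.
    HIGHWAY_CRASHES_PER_M_MILES = 0.5  # assumed
    CITY_CRASHES_PER_M_MILES = 2.0     # assumed

    # Fleet average over an assumed 50/50 highway/city mileage split:
    fleet_avg = 0.5 * HIGHWAY_CRASHES_PER_M_MILES + 0.5 * CITY_CRASHES_PER_M_MILES

    # An "autopilot" exactly as safe as an ordinary highway driver:
    autopilot = HIGHWAY_CRASHES_PER_M_MILES

    print(f"fleet average: {fleet_avg:.2f} crashes per million miles")
    print(f"autopilot:     {autopilot:.2f} crashes per million miles")
    # The autopilot "looks" 2.5x safer while adding zero actual safety.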
I think you're misunderstanding the quoted section, my exact point is they're comparing a narrow section of drivers on highways (people in Teslas who enabled AP) to all drivers on highways
The narrow section excludes older vehicles, skews away from the most at risk demographics of drivers, removes the toughest highway situations from the equation since AP will disengage, the list goes on.
Maybe you want to make your question more clear: do you mean a Tesla with FSD, or only with Autopilot, or with Autopilot off, or on US highways? More lethal than all cars (old or new, cheap or expensive), motorcycles, trucks? If you compared with other luxury cars with similar safety systems you might be disappointed, and if you were then honest about safety you would have to stop propagating the fake stats; otherwise Tesla shares might fall and there would be a lot of suffering for the fans.
If your question was whether self-driving is safer than humans, the answer is probably no; if it were even close, Tesla would have shown the numbers, not kept them hidden.
Yes. With 21 known fatalities in the U.S. alone, Tesla has more advanced driving system deaths than every other carmaker in the world, combined. (For the record, the combined total for every other manufacturer is 1: the Uber pedestrian fatality that doomed Uber's self-driving program.)
Okay, but entirely meaningless unless you adjust for miles driven with an advanced driving system. More people are killed with pistols than assault rifles. Are pistols more deadly?
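For instance (a sketch where the death counts echo this thread but the ADAS-mile denominators are pure placeholders, not real figures):

    # Normalizing by exposure. Death counts are from the thread above;
    # the mileage denominators are invented placeholders.
    tesla = {"deaths": 21, "adas_miles": 5_000_000_000}  # assumed miles
    others = {"deaths": 1, "adas_miles": 100_000_000}    # assumed miles

    for name, d in (("Tesla", tesla), ("everyone else", others)):
        rate = d["deaths"] / d["adas_miles"] * 1e9
        print(f"{name}: {rate:.1f} deaths per billion ADAS miles")
    # Depending on the denominators, the ranking from raw counts can flip.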
By any adjustment Tesla leads the world in terms of fatalities related to its advanced driving system. All major automakers have released hundreds of thousands of cars with advanced driving features over the past 5 or 6 years, and Tesla is the only manufacturer worldwide to have recorded double-digit fatalities.
And that doesn't include the thousands of accidents related to Tesla's AP/FSD systems.
Tesla's advanced driving systems are objectively less safe than teenage or drunk drivers, by miles driven, and the sooner AP/FSD is banned, the better.
Google had hands-free driving working on highways a decade ago, but decided they didn't want to take the incremental approach of L2 -> L3 -> L4 exactly for the reasons you mentioned and jumped straight to making L4 (fully driverless) work.
GM has taken the incremental approach and has been selling the L2 Super Cruise feature for several years. So far it seems to have a good safety record. They plan to start selling an "Ultra Cruise" system next year with autonomous driving on a wider range of roads, but it will still be L2.
Mercedes-Benz had hands-free driving working on highways in 1987 (before Google even existed). However, their approach relied on expensive specialized hardware and wasn't commercially practical.
IIRC, Super Cruise hasn’t been around for long and is only available in certain GM vehicles. Their usage is certainly not at the level of Autopilot in terms of miles. I think when they hit that scale GM will also find out L2 systems are only as good as the driver’s attention span.
Yup, and Super Cruise was essentially the same as Tesla's initial "Autopilot", but they didn't wanna deal with getting sued and being held liable for deaths. Something that Tesla, until recently, seems to have been immune from.
As tempted as I am to agree, this isn't the direction that anyone in the industry is going. There are lots of lane keeping assistant systems coming onto the market now with similar capabilities to AP.
After watching thorough video reviews, the differences between the best systems are fairly minor.
Making it all about Tesla will do the industry a real disservice. Thankfully NHTSA seems to be aware of the issues and is taking their time to understand fully before making adjustments to regulations.
A commenter above pointed out GM's "Super Cruise" feature, a much safer choice of language for the same thing Tesla was initially offering.
Tesla chose to go with "autopilot" because they sell people what they wanna hear. It's irresponsible and unsafe.
> This is another version of the "guns don't kill people, people do" argument.
And no, actually, this is quite the opposite argument.
> The data shows that people are more irresponsible with the autopilot technology. Does this fault fall on the drivers or manufacturer.
Just as those tools are specifically built in a way that makes them much more likely to be used to kill people, the way Teslas are made and marketed makes them much more likely to lead to people being irresponsible.
I think you got the arguments backwards, buddy. You're exactly the one who's making a version of the "guns don't kill people, people do" argument.
I'm just saying it is the same argument, not that you are taking a given position on it.
The question of whether Tesla sells a more dangerous product is separate from the false-advertising claim. Both could be true, false, or mixed.
I personally think FSD is false advertising, but the name doesn't make the product any more dangerous than it would have been if called "Super Cruise", "driver assist", or whatever, with identical features.
The features themselves may also be more dangerous, but the question of whether that puts blood on their hands is the gun debate all over again.
Fast cars and cheap cars are also more dangerous, etc.
Nobody thinks it's okay for the pilot to take a nap on their flight, despite the term autopilot originating in flight software. The reality is people just don't fear or respect driving the way they should. But how many millions of people have died because they text and drive without Autopilot? Or drive drunk without Autopilot? Do alcohol companies have blood on their hands? Bars? Cellphone makers like Apple and Google? What about the people who let Google Maps instruct them to drive off cliffs or into rivers? Does Google have blood on their hands?
Other companies do not fully collect data on crashes while autosteer-type features are active, or on how many miles those features are driven. Comparison is difficult.
NHTSA is making headway, but it's going to take time to reach accurate conclusions.
The car was fun to drive, but the build quality and service were terrible. And the continuous upsell to FSD, even 8 years ago, was such a hassle.