FSD is really good -- but as with human drivers, it's not perfect (though much better than humans on average). If you pin the blame on Tesla whenever a driver fails to supervise FSD, Tesla will just be forced to cancel FSD. I'm OK with the bargain where they make FSD available subject to the understanding that I need to remain mindful of the system at all times, and that if I fail it's on me, not Tesla. The alternative, as we've seen with other manufacturers, is that they just won't have FSD. They'll call it that, but they'll make you keep your hand on the wheel and your eyes on the road -- which isn't FSD at all.
I genuinely fear that the US's culture of safety-ism, which informed much of the COVID response, will totally preclude development of awesome technologies that have very safe, but not perfect, records.
Tesla's system requires less supervision than other manufacturers'. And that's where the rub is. They're saying -- hey, we'll make this tool available to you, and it really will function autonomously, but you have to know Tesla's not going to take the hit if there's an accident. In America's litigious society, I think that's the only way we'll ever get these tools. Otherwise, the plaintiffs' lawyers will destroy Tesla. I want the technology, and I'm comfortable assuming the risk if there's a fuck-up. If Musk needs a way to prove I wasn't watching and that the liability transfers to me -- that's fine. I want the option. And Ford/GM/BMW et al won't give it to me.
If Tesla really believes it "really will function autonomously" then they shouldn't have a problem assuming liability. Further, if it needs to be supervised, then it's not autonomous, is it?
Congratulations on being so accepting of being sold a bill of goods, though.
> Tesla's system requires less supervision than other manufacturers.
> I need to remain mindful of the system at all times
> and it really will function autonomously
To me this is a contradiction. It's said to be both autonomous and to require constant supervision. And that's why I think it's marketed incorrectly, despite it being an interesting piece of technology.
> I'm comfortable assuming the risk if there's a fuck-up.
Well, here's the rub. The risk is not just ours. It also involves others.
If said "fuck-up" is spilling some tomato sauce on the carpet, then sure, we can say "my bad" and take out our checkbook. It's fairly certain that the other person the risk exploded on will accept our amends.
However, if it is running over a child, I don't think the checkbook thing will work.
> I want the technology, and I'm comfortable assuming the risk if there's a fuck-up. If Musk needs a way to prove I wasn't watching and that the liability transfers to me -- that's fine. I want the option. And Ford/GM/BMW et al won't give it to me.
It isn't just about you, it's about everyone else on the road as well. You don't automatically deserve to operate a less safe system on public roads just because you are willing to accept liability. Other manufacturers, at least with regard to autonomy, recognize that fact and design products that mitigate risks with a proper safety lifecycle and design domain.
There are serious statistical issues with Tesla's claimed rates, but even so, Autopilot != FSD, and Tesla FSD is, imo, currently benefiting from the left-hand side of this chart:
Their disengagement rate is so high that, as it stands, it keeps drivers vigilant. But as the system improves, driver vigilance WILL fade, and without robust mitigations FSD will become less safe than a human for a considerable portion of its development.
There are easier, faster, and cheaper ways to reduce traffic accidents than relying on hopium AI to solve all our driving problems, starting with reducing driving and investing in more transit.
I'll start listening to the mass transit hopium when it can take me all the way from my house to my work whenever I want. And, no, I'm not moving to accommodate your collectivist vision of transit.
> The alternative, as we've seen with other manufacturers, is that they just won't have FSD. They'll call it that,
1) No one else calls it FSD or anything like that. Every other car manufacturer with driver-assistance tech calls it what it is.
2) Other manufacturers define safe constraints on their tech and then enforce those constraints.
> , but they'll make you keep your hand on the wheel and your eyes on the road -- which isn't FSD at all.
Super Cruise and BlueCruise are both hands-free. Mercedes Drive Pilot is licensed in California as Level 3 and won't require the driver to pay attention.