I think of it this way:
Most car manufacturers release their products onto the public year after year, knowing full well that a decent percentage of the people who drive away from the dealership will be killed by the thing they just bought.
Tesla is merely trying to take the next step in reducing that percentage.
Their strategy is sound, and so far we have not come up with any alternative that stands a remote chance of improving safety as much as self-driving does. Even if they are largely unsuccessful, they are genuinely trying to protect the public.
> Most car manufacturers release their products onto the public year after year, knowing full well that a decent percentage of the people who drive away from the dealership will be killed by the thing they just bought.
Nope, and that's the whole difference. They will be killed by their own actions, their choices, their inattention, or those of other drivers. They won't be killed by the machine.
With autopilot / pseudo-autopilot, they will be killed by the machine.
It is a huge difference, both in terms of the regulation of passenger transportation safety and in terms of human psychology, which draws a sharp line between being in control of a situation and not being in control of it.
I agree that there is a psychological difference, but our behavior as a society suggests a recognition that the machine and its makers play a role in whether or not people die in cars.
This is why we demand safer cars and sue car makers when their designs fail, or do not meet our expectations, in protecting the occupants.
I can agree with the notion that the machine killed the person in cases where the machine does not include any controls for the person.
As a society, we currently recognize that the causes of accidents and the probability of occupant death depend on multiple factors, one of which is the car and its safety features (or lack thereof). https://en.wikipedia.org/wiki/Traffic_collision#Causes
We also already have a significant level of automation in almost all cars, yet we are rarely tempted to say that having cruise-control, automatic transmissions, electronic throttle control, or computer controlled fuel injection means we are not in control and therefore the machine is totally at fault in every accident.
Operating a car was much harder to get right before these things existed, and the difference can still be observed by comparison with small-aircraft operations.
Then and now we still blame some accidents on "driver/pilot error" while others are blamed on "engine failure", "structural failure", or "environmental factors".
I think having steering assistance or even true autopilot will not change this. In airplanes, the pilots have to know when and how to use an autopilot if the plane has one.
If the pilot switches on the autopilot and it tries to crash the plane, the pilot is expected to override and re-program it; failure to do so would be considered pilot error.
Similarly, drivers will have to know when and how to use cruise-control/steering-assist and should be expected to override it when it doesn't do the right thing.