
Folks are glossing over this quote too easily:

> “Historically human beings have shown zero tolerance for injury or death caused by flaws in a machine,”

It doesn't matter if it's objectively safer; people have no issue doing dangerous things so long as they can maintain the illusion of control.




This doesn't seem hard to understand at all to me.

Someone getting hurt or killed by a machine in a factory likely wasn't following the proper and ideal safety protocols for one reason or another.

Lots of people are careless and don't care enough about safety rules, in my experience. So it's not hard to end up with a lower number of total incidents when robots are put in charge.

But for a person who is not careless and who follows safety protocols religiously, a robot causing them harm could be entirely outside their own control, no matter how careful they are.

Can you see how that could be a problem?


Yeah, it's a different story when dealing with safety.

With industrial machines you have to assume people will do the dumbest thing possible. Because someone will find a way to get crushed in a moving part if they can.

You have safety fence after safety fence and regularly test that your lockouts work.

If you don't do all of this and someone dies you can face very harsh legal penalties.

We don't do this with software, where the cost of failure is so much lower, but we should still understand that smart humans will make mistakes.


When my parents were children, everyone heated their houses with an open fire, and if you did something wrong on any day it would kill you. When I was a child, we had a boiler which was perfectly safe so long as you paid someone to service it once a year. Pretty much all the things my parents grew up with had no safety features, pretty much all the things I grew up with did.

We have moved to a society where things are expected to be safe except for a number of well known exceptions, and this is good:

"It is a profoundly erroneous truism, repeated by all copy-books and by eminent people when they are making speeches, that we should cultivate the habit of thinking of what we are doing. The precise opposite is the case. Civilization advances by extending the number of important operations which we can perform without thinking about them. Operations of thought are like cavalry charges in a battle — they are strictly limited in number, they require fresh horses, and must only be made at decisive moments." - Alfred North Whitehead

As technologists it is our business to know and understand the consequences of the mechanisms we build. We value greater understanding and perception. But it is a mistake to project this value system onto society as a whole, since, as the quote shows, the worth of what we build is precisely that it allows others to avoid this burden. That applies in safety as much as anything else. As technologists we can see that if a device has a 500W motor in it, it bears thinking about whether there are any safety issues. But consumers rightly expect any safety issues to be pointed out to them.

What worries me is when companies are run by people who don't understand that all our safety is the result of a lot of work, and so don't realise the amount of diligence required.


All true, and that same machine should be produced safely, because the danger is known and how to mitigate it is also known.

They either were incompetent or did not give a fuck, neither is acceptable.

Secondly, because these dangers and their mitigations are known, out there, and expected, ordinary people may actually face more danger than they realize, because the expectations that have been set do not match reality.

The reality being this machine is a bigger risk than may be expected, and it does not have to be. Should not be.

They can afford for it not to be too.

Any competent product safety people would have required the basics, which would take this story off the table.

This company does not have those people, and they should.

They can totally afford it.


I'm just saying that the average person has contact with dangerous machines on a regular basis, compared with the past.

The language "accident" (and the associated story) implies that one cannot control whether they injure someone with their 2000lb death machines. That is not true.

Why is safety bad? That's a weird claim...

Rossmann mentioned that in the focus groups he ran, people tended to believe that a manufacturer was more competent at repairs by default, even without evidence. But it's worth saying:

Show me any actual data that people being able to see human readable error codes would result in more injuries.

Even if that were somehow the case, demonstrate that the status quo would be less dangerous than putting the company in a position where it had more of an incentive to make ALL repairs safer. If a repair process is hazardous enough that the machines are barely safe enough to use when maintained by authorized repairmen, then that status quo discourages the company from making the repair process safer in general.


> When people know that there's no assurance that the things…are "safe", … they tend to be a lot more careful. This is a good thing.

No, it's not. I shouldn't have to check the piping of my gas heating system for leaks, or the wiring of my lights for shorts, or my car not to fall apart on the highway, or the food I bought to poison me. If I, and everyone else, had to do those things, I wouldn't have time to do anything else, so I wouldn't bother with those things and I'd live in a stone hut.

You could argue that safety standards aren't strict enough, or aren't enforced (yes, we should address that litany of chemicals!), but there's no question that safety standards significantly improve our collective quality of life.

P.S. Add "computer security" to that list of things I shouldn't need to care about.


Safety is important. Forcing a design of a product to your objectively high standards is not.

Sometimes the user is to blame.

We can't eliminate all possible danger and coat the world in foam rubber because some people are accident prone.

That's not user-driven logic, that's bureaucrat logic, and pushing for education as a way to mitigate the dangers isn't some kind of 'geek logic' that mainstream people couldn't be bothered with, it's very simple logic.

There are infinite threats. We all have our own whitelists and blacklists.

I don't want to live in a world without electricity, cars, swimming pools, stairs, and knives just because some people may hurt themselves, or because I may even hurt myself.

Potentially getting fucked over should be in the TOS for human life. Click here to agree or sit in the corner and make collages with non-toxic glue, magazines full of approved harmless imagery, and safety scissors.


I noted that the criticism in the article attempts to frame the issue on safety grounds.

I thought a lot of safety thinking was built around acknowledging that humans don't do everything perfectly all the time.

No, it's like saying an adult professional operating a deadly machine should learn to do it safely.

Being responsible if you screw something up is how the world normally works. Just because something can be unsafe if done wrong doesn't mean the thing you are doing is fundamentally unsafe when done properly. Many people don't seem to understand this.

The innocent bystander is indeed the reason. People working with machinery accept certain risks.

If I were working in a factory, I would probably consider it less safe to strap myself to a powerful piece of machinery that may (or may not) be under my control. That said, I'm sure they are taking safety very seriously.

"zero tolerance for safety hazards."

Well everything has some risk. There should be some small tolerance, or else we wouldn't do anything.


> heavy handed and naive

It's a good thing that the "safety measure" is the way it is - an afterthought. It means that those ideologues haven't yet had influence on the model itself.

