
> In highly regulated industries like finance, bad software can cost you billions in fines from regulators.

You're assuming the regulators operate at a low enough level to identify a flawed system. A lot of regulation is applied at the policy level (a.k.a. the process level), and auditors (internal or external) are not always able to identify severe gaps between what's documented in a policy and the reality of day-to-day operations. Good people and technology matter way more than process.




A good process is supposed to protect you from not having good people (or at least from people who can't be good every day).

Sure, if we hired perfect people there would be no reason to even have a process. In reality, when that ideal breaks down, you have to rely on your process to close the gaps. But, like you said, a policy/process is only as good as the paper it's printed on if nobody follows it.


Edit: I'm using security as an example of a regulated aspect of software/IT systems; I'm aware there are others.

100% agree. My point was more that regulators tend to accept policy and process artifacts as proof of security compliance. The idea that more heavily regulated/audited IT sectors are inherently better at security is probably false: the audits just aren't stringent or detailed enough, and there aren't enough incentives for executives and boards to do more than the minimum. Which is why the values, opinions, and ethics of those decision makers become key.

The most secure 'company' I worked for was a large research university, because security was a priority for the director and we were constantly being probed from overseas IP addresses. On the other hand, the worst company I worked at in terms of security was in the payment processing world. We were PCI DSS compliant and audited almost continuously by various interested parties (banks, card brands, third parties, etc.).

I don't think any of our people were bad or inept, but at the end of the day our policy artifacts made us sound/appear far more compliant than we were. The problem was that upper management didn't care about closing the gaps in implementation (wouldn't give us the time or $$) unless we failed audits or pen testing. We didn't fail, but no one in IT thought we were doing enough. We didn't have enough auditing, and we regularly had inexplicable 'bugs' reported that honestly could have been evidence of network intrusion, but we had no proof either way.

