I feel like this confuses a lot of things by assuming an extremely sophisticated end user. Sure, if you're a double-threat dev/sysadmin using linux, then when some vulnerability gets discovered in some dynamically linked library on your system, you have the capacity to (a) receive information about that fact, and (b) update it.
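For instance, here's a rough Python sketch of the kind of check that double-threat user can actually run on Linux: list which running processes currently have the vulnerable library mapped, and therefore need a restart once the library is updated. (The library name is just a placeholder for whatever an advisory would name.)

    # Sketch: find running processes with a given shared library mapped.
    # Linux-only; VULNERABLE_LIB is a placeholder, not a real advisory.
    import glob

    VULNERABLE_LIB = "libssl"

    for maps_path in glob.glob("/proc/[0-9]*/maps"):
        pid = maps_path.split("/")[2]
        try:
            with open(maps_path) as maps:
                if any(VULNERABLE_LIB in line for line in maps):
                    with open(f"/proc/{pid}/comm") as comm:  # short process name
                        print(pid, comm.read().strip())
        except OSError:
            continue  # process exited or permission denied; skip it

An ordinary user doesn't even know this is a question they could ask.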
But now suppose you're an ordinary person. You use software. Maybe you even have a Windows machine.[1] Which is more likely to actually get a security update to you?
(a) You have to update a single piece of software, which you know you've installed, through a recognized distribution channel like an app store or something, and all its dependencies come with it.
(b) You have to either learn what a DLL is, learn how to update it, and then hope that nothing you rely on breaks in some mysterious way because of some dependency on a dependency on a dependency on a dependency. Or you have to accept a whole operating system update, assuming the OS update even includes the DLL fix, and hence accept all of the other crap that comes with operating system updates from Microsoft (and Apple): bugginess from complex updates, incompatibilities between new versions of operating systems and the software (or hardware) you rely on, new obnoxious security rules that you might not agree to (looking at you, Cupertino), and massive disruptions to your ability to actually use your computer to do your work.
No thanks.
[1] Maybe this article is specifically aimed at the Linux ecosystem? If so, perhaps this issue is ameliorated somewhat, but it still puts a fairly substantial burden on end users, which seems inconsistent with actually letting non-experts use Linux OSes.
On the other hand, users are generally pretty poor at managing software themselves, and as long as it works they'll happily, and probably ignorantly, keep running something that is already insecure and needs an update.
This does not surprise me. Updating software takes time and effort, and it's risky. It's also not fun. The result is that a lot of people ignore it, even though it means their software can be easily hacked. Note that I think people should keep their dependencies up to date. Unfortunately, I also know human nature, and that means I know many won't.
You see a similar problem with obsolete computers, operating systems, phones, routers, etc. People keep them connected to the Internet even though they have known vulnerabilities. People who do this will even claim they have not been hacked.
It may be true that "normal" users don't understand security or take it seriously enough, but in my opinion just blaming them isn't fair.
Imagine your car being repainted in new colors and the handles in the cockpit being rearranged in unpredictable ways every time you have it serviced.
That's basically what software updates often do to users.
We constantly force users to re-learn how to use a piece of software, very often without a good enough reason. Additionally, updates at some point force them to buy newer hardware, even though they probably wished for neither the changes in the software nor the new hardware.
That's why I totally understand casual PC users who're not gonna stop using Windows XP as long as it lets them do what they use their PC for.
In my opinion commercial software should be regulated to either provide security updates (distinct from feature updates) or be open sourced.
[Non-experts] mistakenly worry that software updates are a security risk.
I think this betrays a lack of thought about the risks to non-experts. Tons of malware masquerades as legitimate updates, and non-experts don't always have the knowledge to distinguish legitimate updates from malicious ones. Therefore, to non-experts software updates are a security risk.
Edit: And this is why Chrome's policy of updating automatically and completely silently is the right thing to do, and everyone else (Adobe, Oracle, Microsoft, looking at you) is doing it wrong.
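Part of what silent auto-updates do is take verification off the user's plate entirely. As a sketch of what we're otherwise implicitly asking non-experts to do by hand, here's roughly what checking a downloaded update against a vendor-published SHA-256 looks like (the file name and hash below are made-up placeholders):

    # Sketch: verify a downloaded update against a published SHA-256.
    # UPDATE_FILE and EXPECTED_SHA256 are hypothetical placeholders.
    import hashlib

    UPDATE_FILE = "update-installer.exe"
    EXPECTED_SHA256 = "0123...placeholder..."  # from the vendor's release notes

    digest = hashlib.sha256()
    with open(UPDATE_FILE, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            digest.update(chunk)

    if digest.hexdigest() != EXPECTED_SHA256:
        raise SystemExit("Checksum mismatch: do not install this update")
    print("Checksum OK")

No non-expert is ever going to do that, which is exactly why building it into the update channel is the right call.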
Alternatively, realise that the problems in (b) are largely solved for reasonable OSes where the vendor takes responsibility both for automatically getting you security patches and for keeping you working without major disruptions.
I'm not sure where you've got the idea that updates are all-or-nothing, but some of us have been living, for decades now, the life you seem to think can't exist.
The user is not usually in a position to decide whether they need updates, or to judge whether they are putting themselves at risk by not updating the machine.
That's a terrible analogy. I have been shaming people for bad security practices and self-righteous ignorance for many years (even before 9/11, ironically!).
I've seen too many people be wrong about this and suffer a bad outcome, including complete data loss and, in one case, a livelihood wiped out entirely. This isn't a random assertion from thin air. You can't trust people to look after their computers.
Were people not updating to more modern OSes because they didn't want new features or because they didn't want to spend the money on new licenses and testing software compatibility?
And how sure are we that their failure to install security updates wasn't just sheer laziness or hubris?
People who run systems that store sensitive information should take computer security even more seriously than the people on Hacker News do. I would never allow my smartphone, let alone my computers and servers, to run unpatched software. Why is this acceptable for people who have critical systems and data?
Again, this is conflating security patches with more general updates.
As a personal anecdote, the only serious malware that has ever hit any system I run, as far as I'm aware, was a zero day exploit. The system was fully patched when it was hit. In contrast, the amount of productive time I have spent over the past few years recovering from problems caused by non-security-related software updates that I didn't particularly want but couldn't avoid if I wanted to keep the security patches is probably measured in weeks by now.
I'm all for keeping systems secure, but when updates start to take priority over keeping systems useful, you have a problem. Most security patches are fairly low risk and have few if any unrelated side effects anyway, but that is certainly not the case with modern software updates more generally. Just look at the frustration of browser users with Mozilla constantly rearranging the UI or Google actively removing functionality from Chrome, or of course the number of users who never moved from Windows XP to Vista or from 7 to 8 because the changes weren't considered improvements.
In the brave new world of Windows 10, the average individual user will be stuck with all the updates, security or otherwise, whether they want them or not. There's really no excuse for that, even in a consumer-focussed OS. Install updates by default, so less technical users get what they probably want? Sure. Block even knowledgeable users from choosing whether to install specific updates? The only time that makes a difference is if Microsoft want to force an update that the user does not want.
Well, maybe. On the other hand, forcing someone to update often reveals that updating does not in fact break the world, particularly in Linux-land, where API updates are supposed not to break userland. Many times, users who don't understand the risk incorrectly assume the worst-case scenario.
There's this weird category of people who think that
- Updates are so important for security you have to install them the second they come out (some are, but most are not), don't you dare even THINK about using outdated/unsupported software that works just fine
- You have no right to mess with the software running on the hardware you own
And? How do updates help with any of this? Firewalls are a thing. Memory-safe languages are a thing. Unit tests are a thing. Fuzzing is a thing. And it is not an OS's job to protect users from themselves (i.e., social engineering). If you've installed malware, you deserve the consequences, and you'll be more careful next time. It's okay for powerful technologies to require a minimum level of education.
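For what it's worth, "fuzzing is a thing" is easy to illustrate: throw randomly mutated inputs at a function and flag any exception type you didn't expect. A toy sketch, where parse_record is a made-up stand-in for whatever you want to harden:

    # Toy fuzzing loop: mutate a valid input and watch for unexpected
    # exception types. parse_record is a hypothetical example parser.
    import random

    def parse_record(data: bytes):
        name, age = data.split(b":")            # ValueError if not one ":"
        return name.decode("ascii"), int(age)   # ValueError on bad input

    seed = b"alice:30"
    for _ in range(10_000):
        mutated = bytearray(seed)
        for _ in range(random.randint(1, 4)):
            mutated[random.randrange(len(mutated))] = random.randrange(256)
        try:
            parse_record(bytes(mutated))
        except ValueError:
            pass  # clean, expected failure mode
        except Exception as exc:
            print(f"unexpected crash on {bytes(mutated)!r}: {exc!r}")

Real fuzzers are far smarter about coverage, but that's the core idea.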
What's a security update? I don't expect them to be responsible for my security - that's my burden as the user to make sure that I'm using software and files I personally trust.
True, but I don't think that justifies the practice at all.
At the very least, software needs to do what it used to do: make security updates separate from all other updates so users can just get the security bits.
You act as if people should prioritize downloading security updates over actually being able to use their computer reliably. Many people want to use their machine first and worry about security second.
And as another sibling commenter mentioned, what about crunch times? Are those the best times to find out what works and what doesn't?
If programmers could write secure code from the start, then security updates wouldn't be needed.
Years of experience have shown that programmers can't write secure code from the start. Maybe some day we'll have languages & tools that allow for network-connected programs that never need security updates, but today is not that day.
Attackers often exploit vulnerabilities reasonably quickly, so updates have to happen soon after vulnerabilities are discovered.
So we need timely updates, and we keep needing updates. Unless we define "updating software" as not contributing to maintenance, I'd say my point stands.
The maintenance is made even harder for distributions like Debian that want to backport security fixes without pulling in feature changes or other refactorings. That produces a lot of extra work for the maintainers, and those maintainers aren't usually as familiar with the code as the original authors, further increasing the maintenance burden.
It's easier to do an update with a single security fix than an update that rolls in a ton of new functionality and ends up breaking your device. I've seen this time and time again with OS/dependency updates.