Why do only *nix folks pull their hair out about this? On Windows, programs bundle their dependencies all the time (sometimes even as shared libraries! but without the independent update benefits) and hardly anybody loses sleep over it. Heck, users actually like the fact that it minimizes friction. Nobody claims it's rock-solid security, but does it need to be?

Actually, now that I wrote it above, I think I might have found an answer to my own question: while of course infosec experts will freak out about any vulnerability existing anywhere for even a shred of a nanosecond, in the real world this is really only a big deal for servers, not clients. And I'm guessing this issue affects Linux folks more because servers tend to run on Linux, with security-sensitive software exposed to the open internet all the time. Maybe we need to realize & embrace this trade-off instead of fighting it till the end of time?

(I suppose one could even argue servers don't need this either if all the packages are kept up-to-date anyway; I guess that might also be debatable, but it's beside my point.)



It's an easy solution for lazy developers. Maintaining dependencies is a hassle, and with proprietary software users can't even do it themselves if the developers decide that an 8-year-old SSL lib is perfectly fine. The distributions probably won't keep that lib around either, so the easiest solution for proprietary vendors who don't care about dependency updates is to ship it all in a self-contained bundle.

It also makes it easier to clean up when uninstalling, and it makes for better damage control by limiting what can be broken if done right, but as the article said, this is not what it's actually all about, since no one seems to care about these points.


This is why we need stuff like Nix or Guix, where you get the best of both worlds. Packages can depend on different versions of libraries, there is still good accountability of what each package depends on (critical for fixing urgent security bugs), if the same library is used it is shared, things can be easily sandboxed, and builds are reproducible.

They have the time and money to research this issue on every system, and they typically do bundle libraries.

I once saw a comparison with LibreOffice showing that the package Debian itself provided was 20% of the size of the package LibreOffice provided targeting Debian. The upstream package would not receive the same benefit of security bugfixes to libraries, but of course it also would not suffer the problems that often arise on Debian when they arrogantly patch libraries they barely understand and create their own unique security problems.


Maybe bundling is fine for your in-house proprietary software, but it's absolutely not OK for free software where users and administrators need to keep on top of things like security updates. When projects bundle their dependencies, users become dependent on that project to provide critical updates to software that the project didn't even write. This multiplies for each piece of software that bundles their dependencies. It's simply unsustainable and irresponsible.

Most people don't package their software for distros.

Smart distros keep things close to upstream so the burden is small.

This is a weird argument.

Now you want each developer to manage dependency updates for their application? I don't trust them to develop their own app let alone apply security patches properly.


It's not a problem on Linux whatsoever. With traditional package managers it just requires some thought from the maintainer. Newer ones like Flatpak make it even simpler.

Windows, on the other hand, just lets you bundle everything with your application, including libraries that might have vulnerabilities, without any form of sandboxing and with no formal mechanism under which they can be updated. On Linux you can rely on the distro to ship patched versions of system libraries, while the application itself is much more limited in terms of permissions.


Having a system-wide package manager where nearly all libraries are dynamically linked also has its drawbacks.

A seemingly minor update might cause a huge cascade of dependency updates, which pushes common Linux distributions toward one of two extreme solutions: either fix all packages in place and freeze their version numbers, or just "give up" and update everything all the time. Both solutions feel like compromises to me.

Other end-user OSes don't act like this. On Android/iOS/macOS/Windows, I can have the latest third-party software without having to deal with intrusive updates to the OS infrastructure all the time. The BSDs handle this better, and maybe something like Ubuntu LTS + Nix on top of it might be a way around this.


The problem starts when you consider things like kerberos, where MIT Kerberos and Heimdal both implement the same basic kerberos api, but have mutually incompatible portions of the libraries -- there are functions that take different arguments depending on which library you're talking about, etc. So, programs that want to implement kerberos authorization need to either put in a bunch of ifdefs and the like to handle this, say that their program's just not going to work with one or the other, or bundle their own known good version. Or look at sqlite, where everyone and their brother ships their own version because there are a lot of defaults and settings which can affect behaviors that may be beneficial to an application.
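
A rough sketch of the ifdef juggling described above (the names here, acquire_ticket and auth_ctx, are hypothetical stand-ins rather than the real MIT/Heimdal APIs; the actual divergences differ in detail, but the shape is the same: one implementation wants an extra argument, so portable callers end up wrapping the call):

    /* Build with -DUSE_HEIMDAL to get the "other" variant. */
    #include <stdio.h>

    struct auth_ctx { int unused; };

    #ifdef USE_HEIMDAL
    /* hypothetical: this flavor of the API takes an extra flags argument */
    static int acquire_ticket(struct auth_ctx *ctx, const char *principal, unsigned flags)
    { (void)ctx; (void)flags; printf("heimdal-style: %s\n", principal); return 0; }
    #else
    /* hypothetical: MIT-style variant without the flags argument */
    static int acquire_ticket(struct auth_ctx *ctx, const char *principal)
    { (void)ctx; printf("mit-style: %s\n", principal); return 0; }
    #endif

    /* One wrapper hides the difference from the rest of the program. */
    static int app_acquire_ticket(struct auth_ctx *ctx, const char *principal)
    {
    #ifdef USE_HEIMDAL
        return acquire_ticket(ctx, principal, 0);
    #else
        return acquire_ticket(ctx, principal);
    #endif
    }

    int main(void)
    {
        struct auth_ctx ctx = {0};
        return app_acquire_ticket(&ctx, "user@EXAMPLE.COM");
    }

Multiply that by every divergent call, and bundling a known-good version starts to look tempting.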

Bundling is a necessity because software interactions are complex, and sometimes developers get tired of having to field support requests because packagers build programs with silly options. Including a known version of a library allows a developer to pin down the behavior a lot more, which lessens the burden on them because they don't need to worry about how Debian or Guix is going to screw up their programs.


It depends who your end users are. If the people responsible for running the code do it as their full time job, then yes, bundling the dependencies is more practical because you have a team of full-time engineers to handle security updates, rollbacks, etc. But if you're pushing it out to people who expect to just be able to fire and forget, not relying on your distro to manage dependencies for you is going to result in a world of hurt.

And then every package comes with its own libraries, which don't get updated and end up with duplicates everywhere. It's the same reason that Linux (the kernel) emphatically refuses to support out-of-tree drivers. It means that you have to make the effort to package it, yes, but once you've done that you get dependencies essentially for free. And as the end user, I can update EVERYTHING on my system with one command, rather than the Windows hell of a dozen updaters running in the background constantly.

I think either approach makes sense depending on who you are.

If your goal is to distribute software across multiple distros and operating systems, bundling dependencies makes sense.

If your goal is to maintain a distro, shared libraries that you can apply a security patch to exactly once is obviously better.

But these are two different people with two different goals.


Sadly, it's not a great solution.

It's not very user-friendly. Having stand-alone applications and double-clicking them is nice, having to extract and add your own shortcut to the executable isn't.

Distributing directories over the web sucks.

Both of these problems result in installers which encourage other undesirable behaviours (think: shared libraries, using the registry).

Users can't be sure that applications are standalone, meaning end users don't get most of the advantages of bundles because you can't rely on them working.

Without top-down guidance or enforcement, developers will do whatever the hell is convenient for them without regard to what's good for the end user. The Old New Thing has detailed this principle again and again over the years.

On top of that, many developers don't even like to accept that the shared dependency issue has, in practice, mostly been solved by not sharing dependencies, and that the reasons for sharing libraries generally aren't compelling anymore. I'm still grumbling that the developers of Haiku think that implementing a package manager with dependency resolution is a good idea when everyone else figured out years ago that it isn't.

(Note: Linux is an exception. Package management with dependency resolution is necessary because Linux as an operating system is ultimately a web of interweaving dependencies rather than a coherent whole. It's a necessary, if not ideal, workaround for the problem.)


> Security is one big reason. Avoiding link time or runtime conflicts is another.

I mean, most Linux distributions do the same thing - anything that's "vendored" as part of some project is supposed to be patched out from the source and pointed to the single system-wide version of that dependency. And Debian is packaging lots of Rust crates and Golang modules as build-time dependencies within their archive.


> most of them are not even able to install two versions of the same library

That's entirely by design.

Distributions exist to provide a set of packages that are well tested, and work reliably, together.

And then guarantee that such set will stay the same and receive timely security updates for 3 or 5 or more years so that people can reliably use it in production.


Remember, it's basically only Linux distros that have the capability to upgrade a third-party library like that (ironically, distro maintainers actually have the source code required to recompile everything if they wanted!).

Mac and Windows applications ship bundled versions of third party libraries all the time. Managing a complex web of name+version based dependencies is much harder in a decentralized software ecosystem, so bundling starts to look attractive.

For our purposes, the benefit of a system where software is more reliable, predictable, and accountable is greater than the cost of asking developers to recompile in unusual circumstances.


Don't forget about trade-offs. Bundled packages like these have worse security than distro-packaged ones, where dependencies keep getting patches and fixes, because most developers won't ever bother patching their bundled dependencies.

So know what you are paying with.


The sad thing is that this isn't an either-or choice. Applications could provide a precise list of dependencies and ranges that they need. Distribution package managers could try to respect those as much as possible, but still have the authority to override a library or two due to security issues, in a centralized manner.

Everyone could be reasonably happy if both sides did some work to understand each other and support each other.


Thanks, this confirms my impression - the distributions are prone to think "we want to organize the software, but keeping up-to-date upstream software is too much work, so we'll use shared libraries and freeze everything to older versions".

Which I understand as a pragmatic solution, but it is far from clear that this gives users more security.


Every distribution has the exact same problems, and they are impossible to prevent: they come from the inherent limitations of how software is developed and run in operating systems today. All this talk about 'solving dependency hell' and dependency paths tied up with Merkle trees doesn't change the fact that you can't use two incompatible versions of the same library in one application at the same time, nor shell out to two incompatible versions of a binary from the same application. It is only possible if the software itself is designed to do that, using dynamic-linker features that allow linking in multiple versions of the same function and specifying which version you're calling from which call site. We don't actually need a new package manager or distribution at all, because they can't solve one of the most fundamental problems - only developing software differently can do that.
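
For what it's worth, glibc does expose the machinery being alluded to here: symbol versioning, and separate link-map namespaces via dlmopen(). A minimal sketch of the latter, assuming two hypothetical, ABI-incompatible builds of the same library (libfrob.so.1 and libfrob.so.2, each exporting a frobnicate() function) sit next to the binary:

    /* Compile with: gcc -o frobdemo frobdemo.c -ldl */
    #define _GNU_SOURCE
    #include <dlfcn.h>
    #include <stdio.h>

    typedef int (*frobnicate_fn)(int);

    static frobnicate_fn load_frobnicate(const char *path)
    {
        /* LM_ID_NEWLM puts the library (and its own dependencies) into a
           fresh namespace, so its symbols can't collide with another copy. */
        void *handle = dlmopen(LM_ID_NEWLM, path, RTLD_NOW | RTLD_LOCAL);
        if (!handle) {
            fprintf(stderr, "dlmopen failed: %s\n", dlerror());
            return NULL;
        }
        return (frobnicate_fn)dlsym(handle, "frobnicate");
    }

    int main(void)
    {
        /* Hypothetical paths to two incompatible builds of one library. */
        frobnicate_fn v1 = load_frobnicate("./libfrob.so.1");
        frobnicate_fn v2 = load_frobnicate("./libfrob.so.2");

        if (v1 && v2)
            printf("v1 says %d, v2 says %d\n", v1(42), v2(42));
        return 0;
    }

Which only reinforces the point: the application has to be written to do this deliberately; no package manager can bolt it on from the outside.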

Sorry, but Nix is a total waste of time.