It really doesn't. I've worked on upstream projects where a significant fraction of all reported bugs were created by distributions botching packaging and patching in ways they weren't qualified to understand, frequently leading to non-obvious failures.
When we tried to work with them to fix this, about half the time they flamed us and quoted distro 'policy' as a reason not to fix their bugs, so we just refused to accept bug reports from anyone using those packages anymore.
The fact is that a lot of old-school Linux distributions are built by people who have only a vague understanding of the software they're packaging, and who are often closer to the sysadmin side of things than to large-scale software development. It makes the relationship very frustrating, and that's why proprietary software vendors invariably opt out of distro packaging. Even with apps statically linked as far as possible, Linux users generate disproportionate levels of support tickets due to the general flakiness of the distros they use, so allowing distros to modify tested software even further is a losing proposition.
Basically the whole concept of a Linux distribution is obsolete, fading away and irretrievably broken. Hence the proliferation of containers.
I see your point... but realize that 99 percent of such bugs are upstream. So it's a matter of locking packages, or adding hacky workarounds to a shell script that runs at install time. In a lot of ways, it feels at times like a distro maintainer is a secretary between the end user and upstream packages, since we end up opening the bugs.
So what are you saying? Developers should just choose a major distro and target that, then get all the flak for issues their product has on every distro that isn't supported? Because that's basically what's already happening and developers hate it, which is why they often don't ship on Linux at all.
And many proprietary pieces of software license components from other proprietary pieces of software, so that even if they did want to open their code they'd have to strip pieces out anyway, which doesn't really help the cause of distros integrating it. And even if it did, the developers would then be reliant on the maintainers for their relationship with their customers. Have an issue with the product? Oh, it turns out that's because of this patch made by the maintainer of the package for that distro, who now has to be contacted and convinced to fix it, which they may decide not to for arbitrary reasons. Even open source developers have problems with this!
My experience when I worked on upstream open source software was that distribution package makers would routinely introduce subtle bugs into programs via the act of packaging. In fact on the project I worked on we stopped supporting users who didn't use our own upstream packages because the number of bugs introduced by downstream packagers was just so huge.
I can't see any real benefits to the Linux approach and never could. It's one of the reasons I ended up moving to macOS. There's hardly any malware there either, and yet app developers build packages themselves.
The package maintainer system doesn't add additional people to share the same work, it creates additional bits of work for different people to do. Upstream can release a fix, and it doesn't get propagated to people on distro X because the package maintainer for X is busy with work, or is a parent now, or just isn't interested in the package any more. And if one proactive maintainer patches an issue, it doesn't help users on all the other distros, or users who get it directly from upstream.
You're explaining how this works in response to concrete examples of where it hasn't worked. I understand how distro packaging works for some packages, but I've seen it fall down too many times for others, especially more niche things.
I imagine it's much easier to maintain a working release for one distro and let the package maintainers for other distros package it how they need than it is to provide support for the endless number of distros out there.
No, Linux distributions offer packages and operating systems that are the result of painstaking work in which all upstream code is reviewed, patched for any inconsistency, and often blocked from going into public archives until known bugs are fixed.
I cannot agree with this enough. I help package the container tools for openSUSE and SLE (I'm also an upstream maintainer of one of the tools, so I see both sides of the picture). But I personally am against ISVs making packages (especially "universal" ones) -- if they want to provide a container deployment method then provide the Dockerfile so people can build and curate it themselves.
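To make that concrete, here's a minimal sketch of the kind of Dockerfile an upstream project could publish instead of a prebuilt image (the base image, package names, and tool name are all hypothetical), so each distro or user can build and audit the result themselves:

```dockerfile
# Hypothetical example: upstream ships this file rather than a binary image.
# Distros/users can swap the base image or build flags to match their policies.
FROM opensuse/tumbleweed
RUN zypper --non-interactive install gcc make
COPY . /src
WORKDIR /src
RUN make && make install PREFIX=/usr/local
ENTRYPOINT ["/usr/local/bin/mytool"]
```

The point is that the build recipe stays transparent and reproducible, rather than users pulling an opaque vendor image.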
It's really frustrating when upstream turns around and says "actually, we provide packages -- not you". In fact, we've had cases where a certain project suggested that a reasonable compromise would be that we change the name of the package in our distribution to "reduce confusion to users". Wat.
However, I do understand their point somewhat. If you have users pinging your issue tracker and they're using distro-supplied packages, then you are getting a lot of noise. But the right way of reporting bugs is to complain to your distribution (we actually live and breathe the distro you're using!) and I feel like upstream projects should make this communication model more clear rather than complaining to distros about users submitting bugs in the wrong place.
EDIT: Thanks for the SUSE shoutout! OBS is pretty cool and the people developing it are pretty cool people too.
The conclusion I take from this is that distros need to be a lot more selective in what they package. If packagers can't reliably backport security fixes for the several years that a distro release is supported, they shouldn't create that expectation by putting the package in.
Most of the time those packages, or the build files for them, are contributed by the distributions themselves, or avid users of those distros. (Reference: I have multiple software projects I've created that are a part of every Linux distribution. I've never packaged any of them myself.) Even big enterprise companies usually work directly with the distros under NDA, and the distros produce the packages.
That works for FOSS, but not for smaller closed-source consumer apps. And it's not just building, it's also testing and debugging. It's just not tractable for a small company to test on dozens of different distros / versions for such a small percentage of users.
What about it? It doesn't solve the problem, it just allows you to conveniently package everything you can't rely on the distro having, which is basically everything.
Most people don't package their software for distros.
Smart distros keep things close to upstream so the burden is small.
This is a weird argument.
Now you want each developer to manage dependency updates for their application? I don't trust them to develop their own app let alone apply security patches properly.
Package maintainers of a distro can do absolutely anything to a package. With zero input from upstream developers. Some distros have more tradition for patching software than others. An upstream like systemd (or openssh) can hardly be blamed for what others do with their software.
Time and time again I'll keep saying this: This problem is only solved with package repositories that require review by a maintainer to publish. Linux distributions solved this ages ago.
That's why distros ask you to provide just the sources and we'll do the packaging work for you. The upstream developers shouldn't need to provide packages for every distro. (Of course you can help us downstream packagers by not having insane build requirements, using semantic versioning, not breaking stuff randomly, etc.)
History has shown that distro maintainers aren't perfect at patching security vulnerabilities either and that sandboxing is useful regardless. It also shows that users want working software and will go to the effort of inventing new package formats like Flatpak to work around distro maintainers. Maintainers now have a choice between complaining that everyone else is doing it wrong and eventually becoming irrelevant, or getting with the program and maybe even offering their expertise to accomplish what people want to do.
This is missing the point. Linux distributions do not lack package management options, they lack stable and sane APIs for developers to work against. No amount of static linking and binary bundles can fix that.