Last time I looked at this, they punted on interoperability with the system package manager. Are they likely to ever care about this, or is it just for people who aren't appalled at the thought of polluting individual machines with random libraries whose interdependencies are invisible in the dpkg or rpm database?
Yes they can, there is nothing preventing them from creating the package. But it shouldn't be the primary way of installing packages. If you really need the bleeding edge for development you can build it yourself.
You should be really careful about using libraries from distribution package managers because none of them can resist screwing with the source, which often makes them unusable. Even the vendors themselves say not to screw with the packages, and maintainers do it anyway.
Linux distros have the same issue: two rpm or .deb packages cannot have the same name, yet you still sometimes need two versions of a package/library installed side by side.
If it's a shared library and they don't get the SONAME versioning right for backwards-incompatible changes, it's even more cumbersome, and incurs more work for everyone consuming that library.
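To make the SONAME scheme concrete, here is a minimal sketch (hypothetical library name libfoo; assumes gcc and binutils are available):

```shell
# Hypothetical libfoo; an ABI-breaking release would bump the SONAME to libfoo.so.2.
echo 'int foo(void) { return 42; }' > foo.c
gcc -shared -fPIC -Wl,-soname,libfoo.so.1 -o libfoo.so.1.0.0 foo.c

# Packages install two symlinks: the runtime loader resolves via the SONAME,
# while the -dev package's unversioned link is what the linker uses at build time.
ln -sf libfoo.so.1.0.0 libfoo.so.1
ln -sf libfoo.so.1 libfoo.so

readelf -d libfoo.so.1.0.0 | grep SONAME
```

Binaries linked against it record libfoo.so.1, so a backwards-incompatible libfoo.so.2 can be installed alongside without breaking them; get the SONAME wrong and every consumer has to be rebuilt or pinned.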
Yea, I really meant merging in the functionality of the most popular packages, and keeping as much of the syntax as possible without compatibility issues. I agree with you about those risks.
The sad thing is that this isn't an either-or choice. Applications could provide a precise list of dependencies and ranges that they need. Distribution package managers could try to respect those as much as possible, but still have the authority to override a library or two due to security issues, in a centralized manner.
Everyone could be reasonably happy if both sides did some work to understand each other and support each other.
Another disadvantage: the specific package you want might not be packaged by your distro, and may not be compatible with the versions of dependencies that your distro does package.
Sure, the burden falls to the package maintainer, but historically popular and/or useful packages have found their way into the main GNU distribution. Just leaving the possibility open has some long-term consequences for the project.
What helps is that some companies disallow access to external feeds like nuget.org and instead host an internal feed with pre-approved packages and versions.
Luckily, the standard library is so extensive that you rarely find yourself reaching for an external package for a piece of basic functionality. And companies are often extremely averse to taking on third-party dependencies, to the point of being unreasonable: you have to go through the security department even for popular, actively maintained packages, which really doesn't help, as it leads to NIH proliferation that does more damage than good.
It is probably less prevalent in small businesses, however, which has its advantages: the productivity loss and maintenance effort likely outweigh whatever security benefits the above approach brings, if any (a good OSS package is usually more secure and does a better job than a particular in-house team, which is often unequipped with the necessary skills).
Not the GP, but I note it's difficult to, say, take some code which uses package xyz and substitute the source-compatible package abc without editing every file. E.g., try compiling a large project with a custom version of 'os' swapped in (to add tracing, or to simulate random I/O failures).
Some package management systems can decouple what packages are called inside the code from what they're called outside.
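Go's module system is one example: a `replace` directive in go.mod rewires a module path to a different source tree without touching a single import statement. A sketch with hypothetical module paths (note this works for third-party modules, not for standard-library packages like 'os', which is why the GP's case stays hard):

```go
// go.mod (sketch): every source file still imports example.com/xyz,
// but the build pulls the code from a local instrumented fork.
module example.com/myapp

go 1.21

require example.com/xyz v1.2.3

// Swap in the fork without editing any file that imports the package.
replace example.com/xyz => ../xyz-with-tracing
```

Cargo's dependency renaming and npm's package aliases offer similar indirection in their respective ecosystems.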