Well, why did you choose packages that change quickly and are poorly documented then? There are packages that, OTOH, have few or even zero dependencies and haven't changed in years; if you consider those stale, I guess nobody can help you.
Also: the utility file will never be updated to fix existing issues within the utility itself (unless you look up the package and diff it yourself). It's a trade-off.
OK, maybe I am missing something, but it seems to me you may want some mutability in a package manager. For example, updates that are simply backward-compatible bug fixes and security fixes should replace the older versions.
Suppose you have a networking library, which is used by several applications. All of a sudden you find a security hole in the networking library. And then you fix it. But the fix does not affect any of the published APIs or functionality and is thus completely backwards compatible for all legal uses.
When you fix the library you release a new version of it. Shouldn't that new version be automatically used for all applications that use the library? If you have an immutable packaging system, all applications will still use the old version of the library and they would have to have their packages explicitly modified to use the new version.
Now this may work for super high value and super secure systems, where someone has the time to individually test the new version of the library with each application that uses it and then individually update each application.
But for the usual desktop system this sounds like a recipe for having a bunch of outdated, hole-ridden software, where multiple badly maintained applications still carry known security holes from many years ago; or, in other words, this sounds very much like Windows.
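A minimal sketch of that distinction, assuming a PEP 440-style compatible-release constraint and the Python "packaging" library (the version numbers are made up for illustration): a resolver can let the backwards-compatible security fix flow in automatically while still refusing anything that could break the published API.

    # Decide whether a new release may transparently replace the installed one.
    from packaging.specifiers import SpecifierSet
    from packaging.version import Version

    installed = Version("1.4.2")             # version the app was built against
    constraint = SpecifierSet("~=1.4.2")     # compatible release: >=1.4.2, ==1.4.*

    for candidate in [Version("1.4.3"), Version("1.5.0"), Version("2.0.0")]:
        ok = constraint.contains(candidate)
        print(candidate, "substitute automatically" if ok else "needs an explicit update")
    # 1.4.3 substitute automatically   <- the security fix described above
    # 1.5.0 needs an explicit update
    # 2.0.0 needs an explicit update

Whether the packaging system trusts that promise (mutable, as described above) or records the exact version it was built with (immutable) is precisely the trade-off in question.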
The same way anybody working with any package system does: by pinning ancient versions and/or just never updating packages at all, at least for packages that aren't so widely used that they pretty much never break.
Because nobody got around to fixing it yet. Infrastructure can be built to keep these packages' dependencies up-to-date by default. If they break on dependency update, the maintainer should be alerted, and the package should no longer be recommended until it is fixed. Relying on individual maintainers to do everything correctly themselves has never worked.
Then you get packages that are 5 years out of date, which eventually have a security vulnerability, and now you have the task of upgrading and working through 5 years of (potentially) breaking changes and deprecations.
It's generally easier in the long run to keep your dependencies up to date. If a package has a new breaking change each week, that's a sign you probably shouldn't be using it for production code.
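For what it's worth, detecting that drift doesn't have to be manual. Here's a rough sketch, assuming the Python ecosystem, the "packaging" library, and PyPI's JSON API (the package names are just examples), of flagging pinned packages that have fallen behind their latest release:

    # Flag installed packages that have fallen behind the latest published release.
    import json
    import urllib.request
    from importlib import metadata
    from packaging.version import Version

    def latest_on_pypi(name: str) -> Version:
        with urllib.request.urlopen(f"https://pypi.org/pypi/{name}/json") as resp:
            return Version(json.load(resp)["info"]["version"])

    for name in ["requests", "urllib3"]:      # whatever you have pinned
        installed = Version(metadata.version(name))
        latest = latest_on_pypi(name)
        if installed < latest:
            print(f"{name}: pinned at {installed}, latest is {latest}")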
The problem with that is that many of these projects are production-ready enough and end up being used in pretty big projects because the alternative is starting from scratch or using an inferior solution. Because they're eternally under v1.0, APIs break and so the package never gets updated in production.
Assuming no bugs exist, it's not possible to get inconsistencies by installing packages in random order, because they declare their dependencies. But if no bugs existed, we would not need updates at all...
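To make that concrete, here's a tiny sketch (hypothetical package names, Python 3.9+ graphlib assumed) of why the order you ask for doesn't matter once dependencies are declared; the resolver derives a consistent install order itself:

    # Derive an install order purely from declared dependencies.
    from graphlib import TopologicalSorter

    declared_deps = {
        "app": {"netlib", "jsonlib"},
        "netlib": {"ssl-bindings"},
        "jsonlib": set(),
        "ssl-bindings": set(),
    }

    # Dependencies always come out before their dependents,
    # no matter how the user phrased the request.
    print(list(TopologicalSorter(declared_deps).static_order()))
    # e.g. ['ssl-bindings', 'jsonlib', 'netlib', 'app']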