> GNU doesn’t strike me as an organisation which places a lot of value on software quality and reliability.

This stems from a misunderstanding of what GNU is. Ideally, all projects under the GNU umbrella would work towards a unified operating system, but sadly that's not the reality. The GNU project is not much of a project in the common sense of the word, nor is it much of an organisation. This is one of the many reasons why https://gnu.tools exists: a subset of GNU projects with common goals and values, including collaborative project management.



It sounds silly for deep theoreticians and a somewhat politically active organization like GNU to have wanted to go all the way to a full OS, the kind of software where you really need to throw all the beauty out of the window and solve users' tangible, mundane problems, like how to read a Blu-ray from an obscure proprietary drive or how to run a game with a monopolistic corporation's binary drivers.

GNU can never succeed at it because it can never compromise, and maybe that's fine. It's a shame that it's considered a failure when they accomplished so much of the OS scaffolding.


I kind of get that. And if a change in GNU glibc caused a bug in GNU wget, I would hope that wget would release a fix relatively quickly, but I wouldn't necessarily have any special expectations just because glibc and wget are both part of GNU.

But I wish the projects in the GNU toolchain would at the very least collaborate. The combination of glibc + gcc + autotools + m4 makes up such a core part of any GNU system that you'd hope GNU cares to keep them mutually compatible. So when a glibc update breaks m4, making it impossible to compile the current version of the GNU toolchain with the current version of the GNU toolchain, that's extremely disappointing.


Naively, it does seem like a central library like glibc would want to run integration regression tests on every new version, to make sure it isn't breaking the utterly massive ocean of software that depends on it. The obvious first step in that direction would be to test against software from the same group, and/or the core libraries and tools known to be required for a lot of base systems to work (e.g. GNU coreutils, GCC, Clang, BusyBox). I suspect it does in fact mostly boil down to a lack of resources, although that just raises further questions, since I would expect organizations like Google or Red Hat to consider that kind of thing important enough to support.
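
To make that concrete, here is a minimal sketch of what such a smoke-test harness could look like, assuming a candidate glibc already installed under a prefix and downstream source trees unpacked locally. Everything here (paths, package versions, the choice of Python) is illustrative, not an actual glibc CI setup; the -Wl,--dynamic-linker override is the usual trick for running binaries against a glibc that isn't the system one.

    #!/usr/bin/env python3
    """Minimal sketch of a glibc smoke-test harness; everything is illustrative.

    Assumes a candidate glibc installed under GLIBC_PREFIX and unpacked
    source trees for each package under SRC_ROOT.
    """
    import os
    import subprocess
    from pathlib import Path

    GLIBC_PREFIX = Path("/opt/glibc-candidate")    # hypothetical install prefix
    SRC_ROOT = Path("/srv/smoke")                  # hypothetical source trees
    PACKAGES = ["m4-1.4.19", "coreutils-9.5", "wget-1.24.5"]  # illustrative set

    # Build and link against the candidate glibc rather than the system one;
    # the --dynamic-linker override makes the test binaries load the candidate
    # ld.so at run time, so `make check` actually exercises the new libc.
    ENV = dict(os.environ,
               CPPFLAGS=f"-I{GLIBC_PREFIX}/include",
               LDFLAGS=(f"-L{GLIBC_PREFIX}/lib "
                        f"-Wl,-rpath,{GLIBC_PREFIX}/lib "
                        f"-Wl,--dynamic-linker="
                        f"{GLIBC_PREFIX}/lib/ld-linux-x86-64.so.2"))

    def build_and_check(pkg: str) -> bool:
        """configure/make/make check one package; report the first failing step."""
        srcdir = SRC_ROOT / pkg
        for step in (["./configure"], ["make", "-j4"], ["make", "check"]):
            r = subprocess.run(step, cwd=srcdir, env=ENV,
                               capture_output=True, text=True)
            if r.returncode != 0:
                print(f"FAIL {pkg}: {' '.join(step)}")
                return False
        print(f"ok   {pkg}")
        return True

    if __name__ == "__main__":
        failed = [p for p in PACKAGES if not build_and_check(p)]
        raise SystemExit(1 if failed else 0)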

GNU Guix (the distribution) should make a logical and simple test bed for seeing exactly how a glibc or GCC pre-release would affect downstream packages.
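
As a sketch of what that could look like, Guix's package transformation options can splice a candidate glibc into downstream builds. This assumes `guix` is on PATH; the tarball path, the version, and the package list are all hypothetical, and the implicit-input caveat in the comments needs checking against the Guix version at hand.

    #!/usr/bin/env python3
    """Sketch: rebuild downstream Guix packages against a candidate glibc.

    Assumes `guix` is on PATH; the tarball path, the version, and the
    package list are all hypothetical.
    """
    import subprocess

    CANDIDATE = "/tmp/glibc-2.41.9000.tar.xz"   # hypothetical pre-release tarball
    DOWNSTREAM = ["m4", "coreutils", "wget"]    # illustrative sample

    for pkg in DOWNSTREAM:
        # --with-source=glibc=... asks Guix to build glibc from the given
        # tarball and rebuild `pkg` on top of it. Caveat (assumption): how far
        # transformations reach into implicit inputs such as glibc has varied
        # across Guix versions; check `guix build --help-transform` first.
        r = subprocess.run(
            ["guix", "build", pkg, f"--with-source=glibc={CANDIDATE}"],
            capture_output=True, text=True)
        print(("ok  " if r.returncode == 0 else "FAIL") + f" {pkg}")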
