
I’ve been on Debian since Potato, so I totally see what you’re saying. But...

> that's cutting maintainers out of the loop

Is this necessarily a bad thing? The market has seen the need to fill a hole, and it seems to be working.

I first started with Slackware, and dependency nightmares are what got me into Debian in the first place. Debian is nice because of its slow, stable base (which makes me happy for production), but I’ve recently moved to Arch and have been so happy: it brings back Slackware’s idea of getting as close to upstream as possible, and it handles dependencies! And to be honest, I’m loving it. As an added bonus, I keep being surprised by how many of the packages I’ve installed are Rust apps.

So, coming back to your comment:

> that's cutting maintainers out of the loop

With systems like Arch getting us closer and closer to upstream, are maintainers the unnecessary middlemen? Of course they’re not entirely redundant, but maybe a model like Arch’s will be more commonplace among distros in the future.




Debian doesn't really work too well when you need multiple or new versions of software, or need something custom.

It's either spend a week wrapping your head around all the different ways of getting something to spit out a conforming .deb, or fall back to something like RVM + RubyGems / pip.
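To illustrate the low end of that spectrum: a minimal binary .deb can be hand-rolled with dpkg-deb, skipping the whole debhelper pipeline. This is only a sketch; the `hello` package name and paths are made up for illustration.

```shell
# Stage a throwaway package tree; DEBIAN/control is the only required metadata.
mkdir -p pkg/DEBIAN pkg/usr/local/bin
cat > pkg/DEBIAN/control <<'EOF'
Package: hello
Version: 1.0
Architecture: all
Maintainer: Nobody <nobody@example.com>
Description: Throwaway demo package
EOF
printf '#!/bin/sh\necho hello\n' > pkg/usr/local/bin/hello
chmod +x pkg/usr/local/bin/hello
# On a Debian system this produces hello_1.0_all.deb:
# dpkg-deb --build pkg hello_1.0_all.deb
```

Of course, a .deb that would actually conform to archive standards needs far more (a source package, debian/rules, lintian-clean metadata), which is exactly the week of head-wrapping described above.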

The maintainer community is also reflexively defensive when they've broken something. The problem is never Debian breaking things into little unusable pieces, it's always upstream for not foreseeing how their software would be bastardized on Debian.


Debian's a bit funny. Maintainers tinker with packages more often than I'd like; they make changes to packages that already work perfectly well, and sometimes breakage occurs.

This is partly because of the autonomy that Debian package maintainers enjoy. I have slightly mixed feelings about that - but only slightly. I'd sooner have maintainer autonomy, and seriously-distributed decision making, than an overlord.


FWIW, Debian's package manager (and its concepts of package repositories and dependency resolution) hasn't changed much since I tried it in 2001 -- and probably even earlier than that. Slackware and Red Hat were my first forays into Linux, and the dependency hell you speak of was enough to drive me to FreeBSD and OpenBSD in the late 1990s.

If your marriage is ending over a toilet paper roll... it's not actually about the toilet paper roll.

It's the same deal with Debian and systemd. It's not the init system. It's the thing it represents. Having to either adopt systemd or run GNOME in an unsupported configuration seems like a clear-cut choice (which is why systemd is now the Debian default), but having a major upstream force you to significantly rearchitect the distribution seems like a pretty significant loss of control. This is at the same time as other upstreams have gradually ripped control out of the hands of Debian developers in other ways: Firefox, librsvg, and python-cryptography adopting Rust suddenly made it a lot harder to support a bunch of niche CPU architectures. Speaking of Rust, and also Go, they use static linking and have their own library package managers, which both makes it harder for Debian to package and simultaneously easier for users to install a binary directly provided by upstream. And languages that don't have static linking support can always use Docker.

Honestly, I wouldn't want to be a Debian developer right now. Red Hat has tried to reinvent itself in a world of containers, but since Debian is a volunteer organization and not a corporation, it's a lot harder to pivot (attempting to pivot would probably cause all their existing volunteers to quit, while failing to attract new ones). What sort of future does Debian even have, if they have no decision-making power over the core OS, and all the applications route around them?


I don't want Debian to be like the Android app store, where there are thousands of apps that work badly and drown me in ads.

I much prefer the F-Droid model of curated repositories that keep the crap out.

Also, I can't understand why people on the internet think that upstream developers are omniscient. They make lots of mistakes and errors. Distribution maintainers fix a lot of things and send the fixes to the authors.


Engineering tradeoffs are always tradeoffs. One is not strictly better than the other from the user's perspective.

Mandating shared dependencies means that Debian is often running software against a dependency version that the original author did not develop against or test against. Sometimes the Debian package is effectively a fork. This results in Debian-specific bugs which get reported upstream. Distribution-specific bugs are a crappy experience for upstream developers because it wastes their time, and it's a crappy experience for users to be told that their software cannot be supported upstream because it's a fork.

Maintaining a huge repository of forked software is also an enormous undertaking. It's common for Debian users to be running fairly old versions of software. This is also not ideal, particularly for desktop users who read upstream documentation and require support when entire features are missing from their antique Debian version.


I used to use Debian. But their bureaucratic, self-righteous organization has, in my personal opinion, created a drift between them and upstream packages.

Countless decisions by Debian maintainers (who are proficient at packaging, NOT at coding) who think of themselves as «smarter» than upstream open source maintainers have resulted in countless bouts of teeth grinding:

- the OpenSSL randomness «fix» that resulted in OpenSSH shipping with only 65,535 potential keys,

- the complexity of building clean source vs. binary packages (compared with RPM or Slackware packaging), plus the sprawling kombinat of Debian helper tools, which together make packaging a hell,

- LaTeX packages being a tad broken,

- ruby/python packages requiring a bit of contortion to work the way they were intended to natively (the overzealous package splitting meant that for python/ruby you had to install non-trivial packages just to use gem/pip),

- the multiplication of packages for «ease of use», which cluttered Debian with so many fixed dependencies that stable is often hard and slow to upgrade while testing can break, plus the hell of version pinning,

- the bureaucratic approach of splitting configuration across so many directories that it is mind-blowing, and, as usual, not always following upstream's simpler conventions,

- poor default config (like Apache's cgi-bin being enabled globally to help install third-party modules),

- and above all, the Debian community has «taken the melon» (gotten a big head, as the French idiom goes) and evolved to be a tad overconfident, imposing choices on users that are more than disputable (such as systemd, of course). And given that Ubuntu sucked away most of their maintainer community during the split, it is even more obvious that they have a «Microsoft» syndrome of we-know-better-than-you-what's-good-for-you (desktop choices...).

As a result, I have almost happily left Debian as my main OS, but I still hate that «securing/hardening» an OS after the default install has become the norm in mainstream free/open source distributions.

The more I see open source projects take a turn toward «sectarianism», the less convinced I am of the so-called intrinsic values of openness in free software.

Yet there are still enough valuable open source projects out there for my comfort.


Yeah, well, rightfully so. I used to love Debian and I've grown more and more disillusioned by it. Their main problem is twofold: 1) they mess with the packages they distribute (like, a lot) rather than trusting upstream, and 2) they don't update things that seriously need updating.

Example 1: Using the system Python on Debian is a notoriously terrible experience, because they split the venv module out of Python itself and distribute it separately. I've had so many issues because of that, and there's no good reason to do it. It's part of the stdlib; it doesn't need a separate package.

Example 2: Debian Jessie shipped with the ancient pip 1.5.6 (https://packages.debian.org/jessie/python-pip; upstream has been at 9.0.1 since 2016), and it never got updated, just stayed stuck there. Wtf? Such old versions of pip lack support for a bunch of things that python packages today use on setup, so they would install things in a weird way and then packages would be broken, and users wouldn't understand why. One of the many cases of not giving users access to a more recent version of a package, severely impacting the user's experience (and of course, they won't know to blame Debian or the pip version in this case).

Edit: Incidentally, that last bit is why I always, always do `pip install --upgrade pip wheel setuptools` whenever I create a new virtual environment to make sure I don't get issues installing packages.
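As a sketch of that habit (the venv path is just a placeholder; on Debian you may first need the separately packaged python3-venv, per the complaint above):

```shell
# Create a fresh virtualenv, then upgrade the bootstrap tooling before
# installing anything else, so modern packages build and install cleanly.
python3 -m venv /tmp/demo-venv
# The upgrade needs network access; ignore failure when offline.
/tmp/demo-venv/bin/pip install --upgrade pip wheel setuptools || true
/tmp/demo-venv/bin/pip --version
```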


The burden is entirely Debian's choice. They want to mess around backporting fixes. They could just package upstream's releases, but that would break their choice of stability model. The downsides of that choice of model is nobody's fault but their own.

Whereas I just left Debian-based distros because I'm hitting that exact problem with backported patches not arriving in any reasonable length of time in many packages. There seems to be a real manpower problem over at Debian in terms of maintainers.

I used Slackware for several years as my main desktop machine. I never ran into dependency issues. You have to understand that the Debian et al approach of having dozens of small packages for everything you want to install is not universal.

I have often read "I don't have the time for Slackware." In all honesty, I spent way less time on the OS with Slackware. The quality was much higher, there was never a need to wait months for a bug fix because someone forgot to link something in, and it was really easy to build my own packages.

I eventually left because slackbuilds.org was not being run very well and I couldn't get the packages I needed. But the dependency issue was not a concern at all.


I'd still be on Debian if it weren't for the old packages and having to run backports.

The whole maintainer model exists in Debian because Debian tries to do its own maintenance on packages, even if the original developers are unwilling to accept the fix back.

Chromium and Firefox are monster code bases, though; it's not surprising to me that distros are reluctant to change them in key ways.


Debian has a terrible track record. Just look at the OpenSSL/Valgrind disaster. As a former upstream developer myself (on the Wine project), all Linux distros found unique ways to mangle and break our software, but Debian and derived distros were by far the worst. We simply refused to do tech support for users who had installed Wine from their distribution; the level of brokenness was that high.

You may feel that developers are some kind of loose cannons who don't care about quality and Debian is some kind of gold standard. From the other side of the fence, we do care about the quality of our software and Debian is a disaster zone in which people without sufficient competence routinely patch packages and break them. I specifically ask people not to package my software these days to avoid being sucked back into that world.

As a sysadmin you shouldn't even be running Maven. It's a build tool. The moment you're running it you're being a developer, not a sysadmin. If there are bugs or deficiencies in the software you're trying to run go talk to upstream and get them fixed, don't blame the build tool for not being Debian enough.


Why? (Asking as someone who has run Arch on many servers for the last 5 years, and Debian servers for much longer.)

I can think of a few things I dislike, but all of the failures so far were self-inflicted, like forgetting to update packages that I compiled myself outside the supported repositories.

It's not completely hands-off when it comes to upgrades; you have to think for a while before hitting "yes" after seeing the list of packages to update. But neither is Debian in the long run (some of the servers I have to manage are 13 years old or so, and going through major dist-upgrades is never that pleasant either; it's a bit more hassle, because I don't trust the major version upgrade, so I have to run it on a backup VM first just to see whether any issues will crop up).

But having the latest versions of the programs is great, I don't have to second guess myself when writing new programs (will it be compatible?), can use the latest kernel APIs, etc.


Right, and therein lies the rub. Unless Debian wants to "boil the ocean" and burn a ludicrous amount of effort on repackaging every last npm and pip package for Debian, that position seems unsustainable long-term.

I'm of the opinion that Debian should create more distribution channels (that are available by default) where they are not the maintainers, and old releases are not forced to stay patched in order to remain installable.


No I don't. I use Debian derivatives because I can apt get all my stuff without thinking hard, because those package maintainers have done the hard work.

A bit, but I assume it’s mostly just the extra friction to get things in that acts as the main filter. Malware has made it into Debian before. See the xscreensaver time bomb.

The extra friction also makes Linux package managers useless. Just about 0% of the things people are installing with npm exist in distro repos. Distro repos are also extremely poor at keeping multiple versions around at once.

Surprisingly people don’t like having their react upgrade tied to the Debian major version update.


Hey pabs,

I don't mean to knock the Debian maintainers. I have chosen to rely on Debian for at least a couple of decades. I was only bitching because Debian is still my friend (I bitch at my girlfriend too).
