No snark at all. I'm sorry you read my comment that way. What I got from this release announcement was a feeling of coming full circle, back to an older time when configuring a system meant compiling with different compile-time options (possibly with compilation happening on another system) as opposed to using something more complex like compiling once and reading config options from an rc file. (I was reminded of this also by dwm, which is likewise configured at compile time rather than at runtime.)
It's hard to criticize their design decisions at the time. For instance, "m4" and "sh" were likely chosen because they exist on all systems, and "perl" and "python" were not yet ubiquitous.
I have never seen "configure" work particularly badly, and I've had to build some pretty hairy, dependency-ridden crap on Linux, Solaris and Mac OS X. It has held up remarkably well, and many projects are good about providing useful "configure" options. Usually fixing something requires an environment variable or configure option, and not a makefile hack.
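To make that concrete, here's a minimal sketch of the mechanism: configure-style scripts honor compiler settings from the environment, so users can override things without touching any makefile. (The variable names are the conventional "precious" ones; the defaults here are illustrative.)

```shell
# How configure-style scripts pick up CC/CFLAGS from the environment,
# letting the user override the compiler without a makefile hack.
CC=${CC:-cc}           # fall back to the system compiler if CC is unset
CFLAGS=${CFLAGS:--O2}  # illustrative default flags
echo "using compiler: $CC with flags: $CFLAGS"
```

In practice that's why `CC=clang CFLAGS="-O2 -g" ./configure --prefix=/opt/foo` is usually all the fixing a build needs.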
There is no question the generated code is practically indecipherable. But you have to treat it like a compiler; no one goes leafing through the ".o" that GCC generates on a regular basis, so why worry about what "autoconf" and "automake" produce? Like any widely-used tool (such as a compiler), you can put a fair amount of trust in the maintainers to not make anything too broken; and if you see a problem, you can file a bug report.
It is also somewhat reassuring that even if the generated result is wrong, it is at least theoretically possible to fix it; just try fixing something that goes wrong in Visual Studio.
I will allow that "m4" is showing its age, and these days it's a heck of a lot easier to Google some examples and adapt them than it is to figure out how to add a custom rule from scratch. The big challenge for the GNU Build System is to create a modern version that makes new assumptions (e.g. that "perl" always exists) and updates its methods accordingly.
It does seem kind of odd to criticize the release for having -Werror on by default and also for having a fallback if /dev/urandom is unavailable.
In the former case they are sacrificing portability for increased confidence of correctness, and in the latter they are sacrificing confidence of correctness for increased portability.
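The fallback side of that trade-off looks roughly like this (a sketch, not the release's actual code; the fallback source here is deliberately simplistic to show the weaker end of the trade):

```shell
# Sketch of a portability fallback: prefer /dev/urandom, degrade
# gracefully when it is unavailable (e.g. inside a bare chroot).
if [ -r /dev/urandom ]; then
  # Read 4 bytes and format them as an unsigned decimal integer.
  seed=$(od -An -N4 -tu4 /dev/urandom | tr -d ' ')
else
  # Much weaker entropy; this is the correctness-for-portability trade.
  seed=$(( $$ + $(date +%s) ))
fi
echo "seed=$seed"
```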
...yes? Since when is configure+make ever just configure+make? Even today I had to do a configure+make that involved another 20 min of debugging to figure out that I had to make a symlink to hack gcc's broken platform naming conventions. Like I get that some people enjoy that, but I personally don't. The reason I write software is to let myself and other people avoid having to jump through annoying, arcane hoops. Software should be accessible and user-friendly even to the most naive user. It is not a badge of honor to be comfortable with a highly finicky, complex system that requires extra time that could be spent doing other things that you'd prefer to do instead.
People have been complaining for years that Microsoft's stuff isn't configurable enough for power users/developers. Now they're following VSCode's lead and offering "infinite configurability" that's also wrong because it is "too hard!" This would almost be funny if it weren't so obnoxious.
Now, I will fully cop to outdated docs being annoying, particularly when most of the configuration isn't really obvious or self-documenting. But complaining about Microsoft, of all people, offering highly flexible text-based configuration is hugely ironic to me.
It's a matter of degree, not kind. It's much more of a hassle to compile a custom version of the program than to modify the default configuration, but the point is that in both cases there's a default that does something that the user doesn't want.
"How much easier does autoconf make things, really?"
Speaking as someone who was a system administrator responsible for compiling a large number of packages across a wide variety of platforms (let's see, SunOS, Solaris, AIX, Irix, Linux, and HP-UX, all at different versions at different times) through the '90s and early 2000s, autoconf made things much, much easier.
It doesn't help the developers any; each individual difference still has to be identified by the developer. (My biggest complaint isn't that it's written in m4, but that there's no comprehensive list of what I need to do with it. That information is embedded in the m4 tests, but it was never documented.)
It seems aimed at making the trivial use cases easy, the normal use cases hard, and the hard use cases impossible.
For example, "exe", "dll" and "lib" are types of projects, not of individual files: there's no apparent way to have multiple, heterogeneous end products (e.g. dynamic libraries + static libraries + test executables + something non-C/C++ like Doxygen) or to define variants, like compiling the same program as one big statically linked executable or as a thin executable plus a bundle of dynamic libraries.
Lack of documentation is another serious problem. For example, how do I pass exotic GCC flags and how can I ensure that qb uses GCC instead of MSVC?
"Zero configuration" in serious systems means self-configuration, e.g. automatically finding and inspecting available compilers, to provide added value over completely manual configuration; denying configuration, or at least not thinking of it because no toy example has needed it so far, isn't very productive for the end user.
Even though I know it'd be better, I can't rationalize the time investment needed to port the subset of the distribution that I use into a standalone config. Despite being very opinionated, I think spacemacs gets many things right by default, for example everything being evilified, linting and code completion working out of the box, the popup that shows all possible next key sequences, etc.
I do wish there was something more lightweight, but until I'm annoyed enough about that, I'll stick with the current state.
Hey! I remember that interaction (and sorry for my crass language, I was having a hard time by then).
Your explanation was certainly satisfactory. Now I understand a bit the motivation of people who want to compile different programs depending on what happens to be installed at a particular moment in their computers. I still think that it is "an exceptionally bad idea which could only have originated in California" ;)
You use the word "just" as if this were a simple thing to understand and know. Is this magic incantation communicated in a way that the people who usually compile this package can understand and use without being intimately familiar with the code?
I ask not rhetorically, just out of curiosity. It's not the kind of suggestion I've typically encountered with other open-source packages, and it feels far more difficult to discover than typical compile-time decisions.
I'd rather have people complaining about having to run configure a bunch of times to disable several features they don't have the libraries for than complaining that a feature doesn't work (because it ended up being disabled without them knowing).
Likewise, I'd rather distros figure out the hard way when a new release has a new feature and needs a new dependency rather than their users complain that a new feature is missing.
> Give people a screwdriver and they'll find a way of using it as a hammer.
I feel like feature flags aren't that far off though. They're fantastic for many uses of runtime configuration as mentioned in another comment.
There are multiple people in this thread complaining about "abuse" of feature flags, but no one has been able to voice why it's abuse instead of just use, beyond esoteric dogma.
FWIW, the 19000 lines of "configure" for Varnish also check whether stdlib.h exists. Perhaps it's still useful today to do so in order to avoid obscure compilation issues or to catch problems on misconfigured systems early on?
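For the curious, that check boils down to something like the following in the generated shell (a simplified re-creation for illustration, not Varnish's actual script):

```shell
# Simplified re-creation of configure's header check: try to compile
# a one-line program that includes <stdlib.h> and record the result.
cat > conftest.c <<'EOF'
#include <stdlib.h>
int main(void) { return 0; }
EOF
if ${CC:-cc} -c conftest.c -o conftest.o 2>/dev/null; then
  have_stdlib=yes
else
  have_stdlib=no
fi
rm -f conftest.c conftest.o
echo "checking for stdlib.h... $have_stdlib"
```

A system where this says "no" is badly broken, but failing here with a clear message beats a cryptic compile error twenty minutes into the build.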
As an old-timer with ~30 years of programming experience, I have similar sentiments as the author about complex projects today, yet I also often feel that too much knowledge, accumulated in sometimes cumbersome form, is being thrown away and reinvented badly. There has to be a compromise somewhere and it's no surprise that projects in an old language like C, running on evolved systems like Unix, de facto standardized on Autoconf to make it a little easier for developers. Do I want to use it myself? Certainly not, I have the luxury of being able to choose a modern language that abstracts most (not all!) platform-specific issues away at compiler installation time, at the cost of having much fewer deployment options for my code.