
> It's ridiculous to expect newcomers to a language and/or programming in general to slog through a 20 or 30-step environment setup process.

It's ridiculous for anyone, really. If you're not using anything outside the standard library, it's ludicrous to expect someone to slog through a manual setup process when they could double-click an IDE icon and be ready to program in under a minute.

Not to mention the fact that a lot of those setup guides are written for specific OSes, and the author always seems to leave that part out. So you get to step fifteen and hit a weird makefile error, only to find out that that particular library is for another OS with no alternative for your own. This forces you to restart at square one with potentially more steps added (e.g. virtual machine setup, OS setup, etc.).




> The author is lazy, and is channeling his energy into complaining rather than learning.

I don't know about that. The author is certainly confusing programming with setting up a development environment. I can understand why they would be frustrated with the latter, since it takes away from what they are actually trying to do.

When I started out, the latter was fairly simple: computers booted into BASIC. If you needed something more sophisticated, you could buy an IDE where everything would run right off the boot disk or after an automated installation. You can still find those tools, designed to serve everyone from home users to professional developers. That said, they seem to be much less common.

In some cases, like web development, that makes sense. The target is a standard rather than an operating system or hardware architecture. In other cases it is self-inflicted. When you choose to work with third-party libraries, there is a much higher probability that some work will need to be done to integrate them. Then there is the WTF category.

Take one of the easier cases: Java. Even though the development model is much closer to traditional languages like C++, the most common use case will involve installing an IDE and the language separately (possibly with some tweaking of the environment). That's not too bad, but it's likely more than people want to deal with when they just want to get their tools up and running. Contrast that with C++ on a commercial operating system: you install Visual Studio or Xcode and are ready to focus on programming for the chosen platform.

Of course, C++ isn't always like that, and it comes close to representing the other extreme. Setting up development tools for microcontrollers can be quite the task, particularly if you choose the "wrong" one. That's a good part of the reason why novices like Arduino and PlatformIO: they make going from nothing to a functional IDE fairly straightforward.

Now I'm not going to say that the author is right, but I will admit that they have a point. Presenting Unix-isms in a macOS installer is going to rub some people the wrong way. That should have been addressed in a better manner. On the other hand, they were also attempting to accomplish something that is non-trivial to start with.


>Regarding the installers, try this page that makes installing much easier

Ugh. So my choices are: use the installer, or build from source. How Linux of them.

You apparently don't get this, but I don't actually want an installer to make it "easy". I'd rather it be simple. Give me a damned zip file containing the program and its dependencies, and make the tools support relative paths (or environment variables). Is that too much to ask? Yes, apparently.
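
For what it's worth, the "relative paths" part isn't exotic. Here is a hypothetical sketch, not Free Pascal specific, of how an unzip-anywhere bundle can work on Linux: a GNU makefile bakes an $ORIGIN-relative rpath into the binary, so it finds its bundled libraries wherever the directory is unpacked ("myapp" and "libmydep" are made-up names).

    # Hypothetical sketch: relocatable "zip file" layout on Linux.
    # The unpacked directory contains myapp plus a lib/ subdirectory.
    CC ?= gcc

    # $$ keeps a literal $ORIGIN for the linker; the path is relative to the binary.
    LDFLAGS += -Wl,-rpath,'$$ORIGIN/lib'

    myapp: main.o
    	$(CC) $(LDFLAGS) -o $@ $^ -Llib -lmydep

    main.o: main.c
    	$(CC) $(CFLAGS) -c -o $@ $<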

> Regarding variable declaration, part of the Free Pascal design is that you must declare things first before you can use them. Declaration must be separated from implementation code.

Yes, I know. That's precisely what I don't like about it. It disincentivizes using variables to hold intermediate values for clarity purposes.


> Even grabbing libraries from github, I was unsure if I should grab just the headers and DLLs, or import the entire tree and mashup my build scripts with theirs.

Honestly this is something that apt (et al.) make so easy on Linux distros that when you move to Mac or Windows you have a hard time believing people still, in 2018, have to _deal with this shit_ (installing libraries and their headers by downloading each one separately, running an "installer", and trying to figure out where it put everything).

Like, why. It's one of the principal things keeping me away from developing on Windows for the last 10 or 15 years: the absolute lack of standards with regard to dev package management. Every time I decide I need to get something working on Windows it's an incredible pain in the ass compared to Ubuntu. Oh, now go to THIS website and download this installer and run it, now this one, now this one. And figure out for each one where it decided to put the headers. Now copy the DLLs around, etc. (Yes, there is system32 and Program Files, but installers don't adhere to this by any means, and discovery via a build system is not a given.) It's just barely worth it, and people only put up with it because of the huge user base. Honestly, for small projects I've found it easier to install MinGW under Linux and cross-compile instead of working on Windows, it's that much of a pain.
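
To make "discovery via a build system" concrete, this is roughly what it looks like on a distro. A minimal sketch, assuming libcurl's dev package was installed via apt (libcurl4-openssl-dev) and that pkg-config is present; "fetch" is just a made-up target name.

    # After `apt install libcurl4-openssl-dev`, pkg-config knows where the
    # headers and library live, so the Makefile never hard-codes any paths.
    CFLAGS += $(shell pkg-config --cflags libcurl)
    LDLIBS += $(shell pkg-config --libs libcurl)

    fetch: fetch.o
    	$(CC) $(LDFLAGS) -o $@ $^ $(LDLIBS)

    fetch.o: fetch.c
    	$(CC) $(CFLAGS) -c -o $@ $<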

At least Mac has Homebrew, and Windows has a couple of good solutions now I guess (Chocolatey, who came up with that name), but they are not officially supported and that is sad. I understand, they don't want to pay an army of people to package open source software full time, which is effectively what keeps Debian and Fedora going... but they should. I would say it's one of the principal reasons they end up doing something like providing an Ubuntu environment on Windows... it's not just that people wanted a Unix-like environment, it's that they wanted _package management_.


> It is hard to not see this as an army of advanced beginners unconsciously trying to avoid doing the work required to level up.

I don't think it's hard at all. "I don't want to have to manually manage dependencies" isn't "avoiding leveling up", it's just not doing pointless work that computers are good at doing. Similarly, there's negative value in requiring every/most project to script its own bespoke build system (CMake) when 99% of projects can fit a mold (and then there are efficiencies when 99% of those projects fit that mold--e.g., trivial cross compilation). None of this stuff is meaningfully related to "leveling up". Similarly, bombarding people with language primitives that are almost always footguns (e.g., inheritance) isn't really helping anyone "level up" except to know not to use those features.

In the C++ world, you have an army of people who think they're experts because they've navigated all of these problems to find something that sort of works, but in practice they're far less productive and often don't know what they're missing out on from the rest of the industry. I would rather have people who are productive but don't pretend they're experts.


> When I show a “getting started with makefiles” guide to my documentation friend they look at me like I’m insane.

I am still not going to add another tool that makes my project even more spaghetti than it already is.


> The ugly side of Make is its syntax and complexity; the full manual is a whopping 183 pages. Fortunately, you can ignore most of this

I cannot take this seriously.

I like the concept of make, but it doesn't make up for its own warts. Unfortunately, there is no single build tool I can blindly recommend to people without being extremely familiar with their project and how it builds: no solid, lightweight build tool that pleases more or less everyone without having 183 pages' worth of manual and a repugnant syntax.

CMake/Lua gives me hope, but we're not quite there yet. Make is pretty decent for small projects, though. I see it as the HTTP of build tools: it has serious issues, but when new tools come up they are built on top of make because it's ubiquitous.


> I think This is more of a platform gripe ("I hate it that this stuff is unfamiliar to me!") than an actually valid argument

The article actually went to great lengths to demonstrate how hard it is to do cross-platform development with .NET. It describes the problems of having IDEs that rewrite makefiles in incompatible ways.

> The same could be said for someone learning Linux for the first time

Makefiles usually work. You may need to install some library through your package manager or build it yourself, but, then, you are a programmer, not an icon-dragger.

> It boils down to the fact that learning new stuff is hard and inconvenient

While learning new stuff can be hard or inconvenient, it has nothing to do with the problem the article describes. The article describes the problem of not having a clean cross-platform workflow that programmers running Visual Studio and MonoDevelop can use to seamlessly collaborate on the same codebase, something that's really easy for just about anything else.


> who cares for some extra 5-10 seconds of the "configure" command

For me, it's closer to a minute. "configure" is good enough that it does the job, and it's hard to replace. "configure" is bad enough that I loathe it with emotions that words cannot describe. Its design is terrible. It's slow. It's opaque and hard to understand. It doesn't understand recursion (module code? pshaw!).

automake is similarly terrible. I looked at it 20 years ago and realized that you could do 110% of what automake does with a simple GNU Makefile. So... that's what I've done.
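
Roughly this shape: a simplified sketch of such a GNU Makefile (not the actual FreeRADIUS one; "myprog" is a placeholder, and -MMD assumes GCC or Clang), covering most of what automake usually gets used for.

    # Wildcard source discovery, automatic header dependencies, install target.
    PROG   := myprog
    SRCS   := $(wildcard src/*.c)
    OBJS   := $(SRCS:.c=.o)
    DEPS   := $(OBJS:.o=.d)
    CFLAGS += -Wall -O2 -MMD -MP
    PREFIX ?= /usr/local

    $(PROG): $(OBJS)
    	$(CC) $(LDFLAGS) -o $@ $^ $(LDLIBS)

    %.o: %.c
    	$(CC) $(CFLAGS) -c -o $@ $<

    install: $(PROG)
    	install -D -m 0755 $(PROG) $(DESTDIR)$(PREFIX)/bin/$(PROG)

    clean:
    	rm -f $(PROG) $(OBJS) $(DEPS)

    -include $(DEPS)

    .PHONY: install clean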

I used to use libtool and libltdl in FreeRADIUS. They gradually became more pain than they were worth.

libtool is slow and disgusting. Pass "/foo/bar/libbaz.a", and it sometimes magically turns that to "-L/foo/bar -lbaz". Pass "-lbaz", and it sometimes magically turns it into linking against "/foo/bar/libbaz.a".

No, libtool. I know what I'm doing. It shouldn't mangle my build rules!

Couple that with the sheer idiocy of a tool to build C programs which is written in shell script. Really? You couldn't have "configure" assemble "libtool.c" from templates? It would only be 10x faster.

And libltdl was just braindead. Depressingly braindead.

I took the effort a few years ago to replace them both. I picked up jlibtool and fixed it. I dumped libltdl for plain dlopen(). The build for 100K LoC and ~200 files takes about 1/4 the time, and most of that is running "configure". Subsequent partial builds are ~2s.

If I ever get enough time, I'll replace "configure", too. Many of its checks are simply unnecessary in 2016. Many of the rest can be templated with simple scripts and GNU makefile rules.
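
"Templated with simple scripts and GNU makefile rules" means something like the following hypothetical sketch (the epoll check is just an illustrative example, not an actual FreeRADIUS check):

    # One feature check done directly in make, no generated configure script.
    # Compiles a one-liner from stdin and records whether it succeeded.
    HAVE_EPOLL := $(shell printf '\#include <sys/epoll.h>\nint main(void){return 0;}\n' | $(CC) -x c - -o /dev/null 2>/dev/null && echo yes || echo no)

    ifeq ($(HAVE_EPOLL),yes)
    CFLAGS += -DHAVE_EPOLL
    endif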

Once that's done, I expect the build to be ~15s start to finish.

The whole debacle around configure / libtool / libltdl shows that terrible software practices aren't new. The whole NPM / left-pad issue is just "configure" writ large.


>I’m not totally sure why some libraries don’t provide a no-build-system version

Yeah, I've wondered that as well. Whenever I see that I have to set up a complicated build system just to run a simple hello-world example, I just go "yuck" and stop.


> The official Clojure CLI, for example, is just plain confusing, and that's most people's first impression to the entire ecosystem. The config files, while they use the wonderful `.edn` format, are also not intuitive.

I completely agree; it is unfortunate they don't spend more time officially recommending leiningen.org. A beginner attempting to use the built-in CLI is in for a poor first 3 months.


> Which is easier? Re-running make, or having to write and induce another library to read configuration files?

It's certainly easier for the programmer to just leave some of the work to the compiler, but it's at the cost of being a complete pain in the ass to packagers and users.

Do you even have to do much work yourself? .Xresources is already parsed and loaded for you by xrdb if you can't afford the cost of an extra 100 microseconds doing it yourself. Is the API to interact with that mind-blowingly horrid or something?

> All of this for values which change once every... how long?

When I'm configuring software to taste, several times a minute. And considering this is likely to be my first exposure to the software, it better not suck completely.


> [...] I can't think of a single situation where "understanding several compilers" would have helped me design/maintain/troubleshoot infrastructure I'm responsible for.

Oh, sure, you don't need to understand how ELF binaries work, until you try to do anything non-trivial with them (building a chroot image, anyone?). You also don't need to know how Ruby or Python handle modules, but I'll want to stay away from any system of yours where you happen to install some random, recently developed piece of software, because it will be a mess.

> But hey, looks like you're not in my target market, and that's ok!

Of course I'm not. What you proposed is a list for novice sysadmins, except it doesn't touch the essence of the craft, focusing instead on shiny bells and whistles of limited applicability that will be obsolete five years from now.


>For projects of any significant size, you're going to run into some constraint which requires you to use `make` differently than you have before.

That's precisely why official/standard build systems suck: they are extremely cumbersome to wrangle when you go off the beaten path.

So the irony is that standard build systems/package managers are what's good for hello worlds, while non-trivial programs require custom build steps.

Just let me write a build.bat/build.sh per platform/compiler/configuration that is explicit and precise about compiler flags, paths, output files, pre/post processing, etc., so nothing magical is happening under the hood.


> appreciation for maintaining a large project for the long term.

Nothing the author describes is applicable to a large project, or a long-term project, or both.

He chose one of the worst editors available and decided to incorporate it into the environment setup. Any long-term or large project will have multiple people working on it. A fraction will find the crappy editor the author decided to use useful, but most will want something better. Most people with development experience will want to set up their environment in the way that's comfortable for them; this setup asks you to jump through too many hoops, none of which bring any value.

Same goes for CMake -- I've worked on multiple large C projects. None used an off-the-shelf build tool like CMake. These tools are inadequate for large-scale projects. (But let's give the author credit here: he never claimed that his project was large or long-term.) Still, I have never found CMake to be useful, neither for small nor for big projects. Whenever I had to work on a project that used CMake, it was a major pain.

It is common to test stuff in containers during development. But it's also more common the lazier and less insightful the programmer is (or intends to be). In my experience, better programmers usually set up their environment in such a way that they don't have to deal with the container nonsense, as it gets in the way of debugging and of a bunch of other tools useful for interacting with and understanding the program being tested.

So... maybe containers for a smoke test. But, if you plan on going long term... that's just not going to cut it. You need a proper environment where you have comfortable access to your program.


>Wow that sounds like a really frustrating and demotivating experience. Seems totally ridiculous and beyond any realistic expectations

What, the 3-hour compile time? That's an OS plus utils; there are projects that take more... Ever tried compiling Qt+KDE?


> Also, the maintainers have a super weird aversion to everything that is "too simple". I've seen packages rejected because "the build process was too simple and there are no dependencies". The excuse was that "users can build it themselves". This is not for silly stuff random people wrote on weekend, this is for 20 year old tools used in production written in C.

wtf? What could be the possible motivation for this? Is their build farm really straining or something?

Can you point me to this example?

> I'm also using download links straight from language websites (Rust, Ruby, Python, DotNet, Haskell).

This sounds downright medieval to me. Why even use macOS for development if the software management tools are so bad that that's what you're doing?


> I have never spent more than a few hours trying to figure out how to do something in CMake.

A well-designed tool would not require a user to spend "a few hours" to learn how to do a specific thing.

People are far too in love with their annoying build systems.

Soon, if not already, tools to build CMakeLists.txt from something simpler will exist. Eventually that will expand to the point it itself requires a tool to generate config files so that it can generate config files for cmake so that cmake can finally fail while generating whatever your compiler wants, and then you have to debug that shit.

I remain to be convinced that build tools are even fundamentally useful.


> Second, once we have done compiling a few times, compiling a program from its latest sources can be easier than figuring out how to install an often older version with our distribution’s package manager.

This is nonsense.


> How well a tool lends itself to be used by the average person is an important property when choosing to use a technology over another.

The reason why most makefiles in open-source projects are horrible has nothing to do with the qualities of make, because typically those makefiles are not written by some human who knows make; they are generated by other tools, e.g. autoconf/automake.

Moreover, the generation method typically used is flawed, because it generates the entire makefiles, in an incomprehensible form. Maintaining those makefiles would have been much easier if they had been carefully written by a human and merely included a file generated by the autoconf tools, with that included file containing only definitions, and neither rules nor targets.

Unfortunately, there is an ancient tradition, established decades ago, of generating the makefiles in an overly complicated way, and nobody has the courage to change that in any project, probably because nobody understands anymore what would happen if changes were made.
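
Concretely, the arrangement described above would look something like this (a sketch with made-up names; config.mk stands for whatever file the generator writes):

    # The hand-written makefile owns all rules and targets and merely includes
    # a generated, definitions-only file. Example contents of config.mk
    # (written by configure or any probing script):
    #
    #   CC     := gcc
    #   CFLAGS := -O2 -DHAVE_FOO
    #   LDLIBS := -lfoo
    #
    include config.mk

    SRCS := $(wildcard *.c)
    OBJS := $(SRCS:.c=.o)

    myprog: $(OBJS)
    	$(CC) $(LDFLAGS) -o $@ $^ $(LDLIBS)

    %.o: %.c
    	$(CC) $(CFLAGS) -c -o $@ $<

    clean:
    	rm -f myprog $(OBJS)

    .PHONY: clean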

> Can you provide e.g. a project with your good makefiles, that I could use to build debug and release builds, with any of MSVC, GCC, and Clang ?

I never had any need to compile something for MSVC, GCC and Clang.

Nevertheless, many of my projects had to be cross-compiled for various embedded CPU targets.

The way I do that is that I have, for each build target, one makefile that contains only definitions, so I could have e.g. 3 makefiles: one for an MSVC/Windows target, one for a GCC/Linux target and one for a Clang/macOS target.

Each such makefile will have definitions for the names of all executables that may be needed for building a software project, e.g. compilers, assemblers, linkers, librarians, object-file converters, copy commands, move commands, rename commands and so on.

It will also have definitions for all the command-line option flags that must be provided to each executable in order to perform whatever tasks are needed for building a software project.

Writing such a makefile for a compilation target is a one-time effort. I might write it in an hour or more, while searching through the documentation of the various tools, but then I can use it unchanged, for many years, for all projects that I intend to build for that target.

I need to write such makefiles only infrequently, when I begin to use a new CPU, or a new operating system, or a new compiler.

These makefiles with definitions dependent on the compilation target are included with an include directive in the complete makefiles that build the software project.

I build each final file, e.g. executable file, dynamic library or static library in a separate build directory. When I want to build multiple files with a single command, then their build directories are subdirectories of a directory where there is a makefile which will invoke the makefiles in the leaf subdirectories and maybe move or copy the built files somewhere else, if the end result needs them to be in certain relative positions in a directory hierarchy.

The makefile in the build directory of some file has only a few lines. It includes the makefile with the definitions dependent on the compilation target and a makefile that is the same for all my projects, with general definitions, rules and targets.

Besides the include directives, there are only a few definitions: the type of file that must be built, the name of the built file (when absent, the current directory name is assumed, with an appropriate extension), a list of directories that must be searched to find the source files (if absent, the current directory is assumed) and an optional prefix for that list of directories.

Because make searches itself for source files and automatically generates their dependencies, I do not have to do anything when I add/delete/move/rename source files.

For debug and release I obviously have these 2 make targets in the included makefile with general make rules and targets, which are the same for all projects. The included files per compilation target have definitions for the debug and release command-line flags for all the tools, e.g. compilers, assemblers, linkers.

So if MSVC, GCC and Clang on certain operating systems were my targets, I would have no problem building these 3 targets either separately or simultaneously, without having to write a single word in the makefiles of the project. When creating the build directories for the project, I would copy into each build directory the appropriate template makefile, changing, if needed, only the name of the built file and the name of the directory or directories where the source files are located.
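
To sketch the shape of this setup (all names invented for illustration; the real general makefile also handles the automatic dependency generation mentioned above):

    # ---- target-gcc-linux.mk : per-target definitions only, no rules/targets ----
    CC             := gcc
    CFLAGS_DEBUG   := -O0 -g
    CFLAGS_RELEASE := -O2 -DNDEBUG

    # ---- rules.mk : general definitions, rules and targets, same for all projects ----
    SRCDIRS ?= .
    PROG    ?= $(notdir $(CURDIR))
    SRCS    := $(wildcard $(addsuffix /*.c,$(SRCDIRS)))
    OBJS    := $(notdir $(SRCS:.c=.o))
    VPATH   := $(SRCDIRS)

    debug release: $(PROG)
    debug:   CFLAGS += $(CFLAGS_DEBUG)
    release: CFLAGS += $(CFLAGS_RELEASE)

    $(PROG): $(OBJS)
    	$(CC) $(LDFLAGS) -o $@ $^ $(LDLIBS)

    %.o: %.c
    	$(CC) $(CFLAGS) -c -o $@ $<

    .PHONY: debug release

    # ---- Makefile in a leaf build directory : only a few lines ----
    # PROG    := mytool
    # SRCDIRS := ../../src/mytool
    # include ../../make/target-gcc-linux.mk
    # include ../../make/rules.mk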

> in my experience make always takes a few seconds for projects with >1k targets when changing a single file. Ninja is consitently instantaneous.

You are right that this is the only case when the tool used to build the project can make a difference.

When a large number of files must be compiled, there is no chance to see any difference due to the speed of the project build tool.

However, when only a single file is recompiled, then there can be a significant difference.

Ninja is very simplified in comparison with make, so I have no doubt that it is faster.

For your example with many thousands of files where you frequently recompile only a couple of files, one could use make + ninja instead of cmake + ninja.

The general rules from my makefiles could be modified to generate ninja input files instead of invoking the build commands.

In that case I would invoke make only after making changes like add/delete/rename source files, and then ninja for the actual recompilation.
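
A hypothetical sketch of how that could look: make owns the source list and writes build.ninja; you rerun "make build.ninja" after adding or removing sources and run plain "ninja" otherwise.

    SRCS := $(wildcard src/*.c)
    OBJS := $(SRCS:src/%.c=build/%.o)

    # Emit a build.ninja that compiles each source and links "app";
    # $$ keeps literal $ characters for ninja's own variables.
    build.ninja: Makefile
    	@mkdir -p build
    	@{ \
    	echo 'rule cc'; \
    	echo '  command = cc -MMD -MF $$out.d -c $$in -o $$out'; \
    	echo '  depfile = $$out.d'; \
    	echo '  deps = gcc'; \
    	echo 'rule link'; \
    	echo '  command = cc -o $$out $$in'; \
    	$(foreach s,$(SRCS),echo 'build $(s:src/%.c=build/%.o): cc $(s)';) \
    	echo 'build app: link $(OBJS)'; \
    	} > $@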

Nevertheless, I never had to do this until now, because the speed of make was always acceptable, which might be due to the fact that I have always used fast CPUs with generous quantities of installed memory and with fast SSDs.

Moreover, I usually think a lot before making changes and then make all of them at once, so it is very infrequent for me to need to recompile a single file many times in a row.

In conclusion, I agree with you that there is a use case for ninja, for very large projects where frequent recompilations of only a few files are needed.

Nevertheless, there are a lot of software developers who will never encounter this use case, so for them make is enough.

On the other hand, for cmake I am not aware of any useful application, because none of the examples of cmake projects that I have seen were simpler than if those projects had used make.

It is possible that I have seen only examples where cmake was not used well, but in any case, the best cmake can hope for is to be as easy to use as make.
