I would say it fares rather poorly when you compare it to something like ninja (http://martine.github.com/ninja/), which is made to be generated by tools like Gyp or CMake. I have used the CMake+ninja combination on large projects and it lives up to the hype.
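For anyone who hasn't tried the combination, it boils down to two commands (assuming you already have a CMakeLists.txt; the -S/-B flags need CMake 3.13+):

    cmake -G Ninja -S . -B build   # emit build.ninja instead of Makefiles
    ninja -C build                 # run the actual build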
I used to use tup for small projects but then I switched to CMake/ninja.
I can't say anything against tup other than it is not widely used. In terms of performance, I did not notice any difference between tup and ninja -- but these were small projects.
Yes, GN IMO is a much better experience than CMake. It's fully self-contained in a single binary that you can easily distribute with your code. I have a small Python script I use with it to regen files and then run Ninja to do the actual builds.
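The script isn't much more than a wrapper around these two commands (the out directory name is just my convention, not anything GN mandates):

    gn gen out/Default      # regenerate the .ninja files from BUILD.gn
    ninja -C out/Default    # do the actual build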
I've been using it for years and have no complaints.
I recently played a bit with ninja on a tiny C project -- as I was already using cmake, the transition was seamless, and even for such a tiny project the speedup was tangible (though absolutely not relevant in any way; everything was building fast enough :).
Also played a bit with tup, and it's quite nice too.
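For anyone who hasn't seen it, tup's rule syntax is pleasantly terse; a minimal Tupfile looks roughly like this (a toy example of my own, not from a real project):

    : foo.c |> gcc -c %f -o %o |> foo.o
    : foo.o |> gcc %f -o %o |> foo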
Ninja is a very specific tool that does only one thing, and even that only somewhat well, though it is fast. Even cmake has trouble employing it for certain generated-file cases.
If you measure by "number of users of the resulting binaries", Ninja is primarily used as a backend for Chrome's bespoke build system. But by number of projects, cmake is likely dominant.
(And yes, part of the reason Ninja succeeded is because it ties the hands of the person generating the files. Make gives people enough rope to hang themselves and they frequently do. From the manual: "To restate, Ninja is faster than other build systems because it is painfully simple. You must tell Ninja exactly what to do when you create your project’s .ninja files.")
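To make "painfully simple" concrete, here is a minimal hand-written build.ninja (my own illustration; generators emit the same structure, just much more of it):

    cflags = -Wall -O2
    rule cc
      command = gcc $cflags -c $in -o $out
    rule link
      command = gcc $in -o $out
    build foo.o: cc foo.c
    build foo: link foo.o

Note there are no implicit rules or wildcards: every output, input, and command is spelled out, which is exactly what makes it fast to load and easy to generate.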
Ninja is great! In particular, the author's point about iteration time rings really true: if things don't happen within a span of 2 seconds, attention is lost.
I've been developing a small cmake frontend which uses it by default and generally optimizes for build speed - https://github.com/jcelerier/cninja
CMake is pretty horrifically inefficient at compiling WebKit. The makefiles it generates fork to call CMake (to do things like printf in color) far more than they fork to call the compiler. The ninja generator is better, but it currently has various issues with long command lines on WebKit (trunk CMake fixes some of these, but then it doesn't create some directories in the output directory).
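To give a flavor of the forking, the recipes in the generated makefiles look roughly like this (reconstructed from memory, not a verbatim excerpt from WebKit's build):

    foo.c.o: foo.c
        $(CMAKE_COMMAND) -E cmake_echo_color --switch=$(COLOR) --green "Building C object foo.c.o"
        $(CC) $(C_FLAGS) -c foo.c -o foo.c.o

Every compiled object pays for that extra fork of cmake just to print a colored status line.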
Also, the CMakeLists syntax is pretty nasty IMO, but it feels like gyp, cmake and autotools are in some kind of syntax ugliness competition... (Maybe because nobody ever wants to work on a build system, so they'll just do this one little hack which inevitably grows tentacles).
Yes: CMake is a little weird, but it's less bad than all the other options for cross-platform projects, and CMake+Ninja is wonderful. (that said, I do hope something Lua-based like Premake catches on).
I (the author of Ninja) think tup is a fine choice for your project.
Ninja was designed to work within a specific pragmatic context: a very large project (Chrome) that had existing constraints on how the build works. (This design also makes Ninja suitable for use from CMake, which means you can use Ninja to build e.g. LLVM in a manner faster than the existing alternatives.)
The Ninja [1] build system has also been a godsend for my recent projects. I measure significant speedups with Ninja vs. make -j# on my multicore machines. CMake is also able to spit out Ninja build systems, so transitioning to Ninja is essentially free if you are already using CMake, which is quite common for scientific and numerical codebases.
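One small ergonomic difference: make only parallelizes when you pass -j explicitly, while ninja picks a job count from the number of CPUs on its own (roughly cores + 2, if I remember right):

    make -j8    # parallelism must be requested by hand
    ninja       # parallel by default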
Funny to hear the author of Ninja say that he's never used CMake...
And they shouldn't have been surprised at the number of Ninja users on Windows; Ninja is so much faster than the alternatives if you are using CMake on Windows.
Which cmake generator was used? The links seem to be dead, so it isn't clear.
I mention this because I'm using cmake for a current project, and backend choice seems to be somewhat important. I found a noticeable improvement when I switched from using a make-based backend (NMake on Windows, GNU Make on OS X and Linux) to the Ninja one (see https://github.com/ninja-build/ninja).
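Switching backends is just a matter of re-running cmake with a different -G in a clean build directory (generator names as CMake spells them):

    cmake -G "NMake Makefiles" ..   # what I had on Windows
    cmake -G "Unix Makefiles" ..    # what I had on OS X and Linux
    cmake -G Ninja ..               # what I switched to everywhere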
Based on my notes from the time, full build times were basically unaffected on a per-core basis - but no-op builds were markedly quicker. My notes don't say anything about builds where only one or two files were changed, but my recollection is that that case was noticeably improved as well.
For whatever it's worth, here are figures from my notepad from a few months ago. Apologies for the scrappiness; these were just brief notes taken at the time, all rather random, just to determine whether the improvement was ~0 or >0... no more than that :/
All times in seconds. All tests performed on my laptop - 3.1GHz dual-core i7.
OS X native/Linux 4-core VM (unsure which - probably Linux VM):

    GNU make full  = 46.81 (4 cpus)
    GNU make no-op =  0.93
    ninja full     = 49.36 (4 cpus, I suspect - by default Ninja uses all available cpus)
    ninja no-op    =  0.03
(Other notes: cmake works on Windows, and has a reasonably-sized ecosystem, but... OMG. The scripting language is fucking insane. Still worth a look though.)
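A taste of what I mean - everything in the CMake language is a string, so a list is just a semicolon-delimited string, with all the quoting surprises that implies:

    set(mylist a b c)           # mylist is literally the string "a;b;c"
    list(APPEND mylist "d;e")   # quoting doesn't help: mylist now has five elements, not four
    message(STATUS "${mylist}") # prints a;b;c;d;e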
Sigh. As far as I can tell, ninja is primarily used as a cmake backend - mostly because the people who wrote cmake never learned to write Makefiles, and as a result using ninja with cmake is much faster.