
Well, I'd be talking about things like "I'd like it to be an error if a make rule claims to make a certain dependency and it fails to do so.", like I said in the next sentence.

While we're at it, undefined variables should be errors, not turned into empty strings, and like I said, were I working with it routinely I'm sure I could come up with more. There's a lot more to make than just its shell invocations.

Furthermore, embedding shell's default error handling behavior into make isn't exactly a comforting thing. It's way quirkier than a lot of people understand, and unfortunately it all comes to the surface when people start using make. "It stops at a non-zero exit code!" is, unfortunately, far, far from the simple thing it sounds like.
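To make that concrete, here is a small sketch (assuming bash) of two of the exemptions baked into `set -e`, which is why "it stops at a non-zero exit code" is not the simple rule it sounds like:

```shell
# Two of the exemptions baked into "set -e" (bash assumed).

# A failure in condition position (if/while) never aborts the script:
out1=$(bash -ec 'if false; then :; fi; echo survived')

# Neither does a failure on the left-hand side of && or ||:
out2=$(bash -ec 'false && true; echo survived')

echo "$out1"   # survived
echo "$out2"   # survived
```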

And encountering the "errors? pshaw, whatever" attitude in multiple languages is precisely why I know it's such a bad idea. Were it just Perl or something, I wouldn't be able to tell if it's a bad idea or if Perl is just a bad implementation, but after the decades I've been using these languages, I've come to the conclusion it's just a bad idea everywhere I encounter it.




I don't understand your complaints.

> it's incumbent upon him to clearly state what failures I will avoid

He does exactly that though? Here's a list of some of them:

Rule: "Use a strict Bash mode"

Failure(s) avoided: "your build may keep executing even if there was a failure in one of the targets."

Rule: .ONESHELL

Failure(s) avoided: assignments failing to take effect on subsequent lines ("it lets you do things like loops, variable assignments and so on in bash")

Rule: .DELETE_ON_ERROR

Failure(s) avoided: "ensures the next time you run Make, it’ll properly re-run the failed rule, and guards against broken files"

Rule: MAKEFLAGS += --warn-undefined-variables

Failure(s) avoided: avoids silent misbehavior when a variable doesn't exist ("if you are referring to Make variables that don’t exist, that’s probably wrong and it’s good to get a warning")
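Put together, the prologue those rules amount to looks something like the following sketch (`.SHELLFLAGS` is the usual GNU make mechanism for passing strict-mode flags to every recipe shell; the exact flag set is one common choice, not the only one):

```make
SHELL := bash
.SHELLFLAGS := -eu -o pipefail -c
.ONESHELL:
.DELETE_ON_ERROR:
MAKEFLAGS += --warn-undefined-variables
```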


My question was aimed at the OP’s generic statement. As for Make, it originally wasn’t clear to me where the issue was supposed to lie. Lines in rule bodies are handed off to the shell, and that whitespace in rule dependencies needs escaping didn’t seem surprising, since it’s a list (though it’s probably a bug that whitespace in target names must be escaped, since a target is just one token that ends in a colon). But I see now that the expansion of list-valued automatic variables is probably a real Make-endemic issue.

It has idiosyncrasies because it's not a general purpose language. Even the things that she mentions are happening for good reasons - like the fact that set -x would break the expected behavior of || and &&.

Actually what language does crash when a function returns false? I mean some throw exceptions but isn't "false" a valid thing to return?

I find the same thing with makefiles - people don't understand what they're doing and expect them to work in a certain way because they haven't ever thought about build systems very deeply. Recursive assignment in Make catches almost everyone out, e.g.:

  FLAGS=-b
  COMPILE=compile $(FLAGS)
  $(info compile command=$(COMPILE))
  FLAGS=-a

  myfile:
  	echo $(COMPILE) $? -o $@

outputs:

  t43562@rhodes:~ make -f t.mk
  compile command=compile -b
  echo compile -a -o myfile
  compile -a -o myfile

Despite this, making all assignments immediate to match other programming languages would take a VERY useful tool away. The more you understand these tools the more you know where to bother using them and how much effort to put into it.
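For contrast, a sketch of the same idea with immediate (`:=`) assignment, generated and run from the shell (the file name is arbitrary, and this assumes GNU make is on the PATH):

```shell
# Same example but with := (immediate) assignment: COMPILE snapshots
# FLAGS at assignment time, so the later FLAGS=-a no longer leaks
# into the recipe.
printf 'FLAGS := -b\nCOMPILE := compile $(FLAGS)\nFLAGS := -a\nall:\n\t@echo $(COMPILE)\n' > /tmp/imm.mk
out=$(make -s -f /tmp/imm.mk)
echo "$out"   # compile -b
```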


Who can't read a Makefile? Who can't at least read the output of make -n? It's terrifying to me that you're suggesting that people can't and don't.

It's not even a security thing. I've had poorly-written Makefiles that would have blown things away thanks to an unset variable on a certain platform, for example.


> Here's an example of a minimal makefile:

Your example does not contradict what I wrote. You manually specified the tool to be run ($CC) and all of the arguments to that tool.

It's true that there is a level of indirection through the $CC variable, but you're still operating at the level of specifying a tool's command-line.

> There's no reason this shouldn't be possible with make; it just hasn't been implemented so.

Make is 44 years old. If it were an easy extension to the existing paradigm, there would have been ample time to implement such an extension.

> Do bazel/buck/please actually do this? As far as I know tup is the only tool that actually verifies inputs/outputs of rules, and it needs FUSE to do so.

Bazel certainly has some amount of sandboxing, though I don't think it's quite as complete as what is available internally at Google with Blaze. I haven't used Buck or Please, so I can't speak for them.

> True, it's a bit of a footgun, but by no means difficult.

Well footguns aren't great. :) As just one example, any header that is conditionally included (behind an #ifdef) means the cache should be invalidated when CFLAGS change, but Make has no idea of this.


Once you step off the beaten path, you find that errors from things like:

  false | true

get silently swallowed by bash (this is configurable with `set -o pipefail`, but the default ignores such errors). Also, the point about not noticing that a rule didn't create its target is a good one. (That behavior should be configurable; I don't think it is.)
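A sketch of that default and the bash option that changes it:

```shell
# By default a pipeline's exit status is that of its LAST command:
false | true
pipe_status=$?            # 0 -- the failure on the left is swallowed

# bash's pipefail option propagates the failure instead:
bash -c 'set -o pipefail; false | true'
pipefail_status=$?        # 1
```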

Anyway, with -j, make is as async as pretty much anything else out there.


> The completely static nature of a makefile also means it is ill-suited to rapidly evolving codebases where new files come and go almost by the minute as we refactor.

Make is so static that there was a Lisp interpreter written in make. Oh wait...

There are several mechanisms in make to have your rules dynamically adapt to your codebase, e.g. $(shell ...), $(wildcard ...), and pattern rules (and that's not all of them).

You just need to put in a little effort to actually read the documentation, not just stop after finishing one of the plethora of tutorials that end after showing variable usage.

> We can do much better than Make.

You mean, "we can do much better than my `make' knowledge". Of course we can.


There's no such thing as an undefined variable in Make. The variable expands to text. Variables which are not defined by definition expand to no text.
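A sketch of that expansion behavior (makefile generated from the shell; the variable name is made up, and GNU make on the PATH is assumed):

```shell
# An undefined variable simply expands to no text:
printf 'all:\n\t@echo "[$(NO_SUCH_VAR)]"\n' > /tmp/undef.mk
out=$(make -s -f /tmp/undef.mk)
echo "$out"   # [] -- empty expansion, no error by default
# --warn-undefined-variables at least reports the reference on stderr.
```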

To be fair, if you have hand-created many Makefiles, you'd already be quite familiar with that error.

My bugbear with make (and even the better remake) is finding out why things are not happening. "No rule to make target" is such an uninformative error message. Could it not identify rules which would make the target were they not failing?

> The syntax for make files and target deps are also very complex "Do I want to use % or $ or @?"

How is this any more complex than any programming language? Take javascript. Do I want to use =, ==, or ===?

> I cannot test if my Makefile will run on a developer's mac if I don't have a mac myself and even then it's a crap shoot (do they have homebrew? is their PATH correct? etc).

Containers suffer from these kinds of problems as well. For example, if your iptables rules are not set up just so, you get no network access from inside your container.


Yuck. What's the point of a Makefile if every single rule is phony?

> Make leans heavily on the shell, and in the shell spaces matter. Hence, the default behavior in Make of using tabs forces you to mix tabs and spaces, and that leads to readability issues.

I have written a great many makefiles, simple and complex. I can’t recall a single time I’ve needed to mix tabs and spaces in one (though I have had to mix them multiple times in both YAML and HTML).

(As for anything like accidental mixing, for my part I have a sanely-configured text editor and so don’t need to worry about anything silly like tabs being turned into spaces. Tabs are superior to spaces anyway. But I do use spaces for Rust and Python where that is customary, I’m not completely antisocial.)

> .ONESHELL ensures each Make recipe is ran as one single shell session, rather than one new shell per line. This both - in my opinion - is more intuitive, and it lets you do things like loops, variable assignments and so on in bash.

.ONESHELL also means that your makefile will behave differently from how anyone that’s familiar with makefiles will expect it to. But I guess this does explain why you went enabling strict mode, since you’ve basically turned off the near-equivalent default functionality from Make.

Note also that you can do loops and such already—you just need to use line continuations (put backslashes at the end of each line, which Make will consume).
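A sketch of that line-continuation form (makefile written from the shell; assumes GNU make is available):

```shell
# A shell loop in a recipe without .ONESHELL: the trailing backslashes
# let the three lines reach the shell as one command. The $$f becomes
# $f after make's own expansion.
printf 'list:\n\t@for f in a b c; do \\\n\techo $$f; \\\n\tdone\n' > /tmp/loop.mk
out=$(make -s -f /tmp/loop.mk)
echo "$out"
```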

Yeah, the default behaviour is idiosyncratic and will lead to surprises for the unwary (though they’ll normally observe it immediately, when the cd is ineffective on the next line, or when the if/for causes a syntax error), but I think Make has generally become niche enough that I’d prefer to pander to people who know Make than to normal people. :-)

> .DELETE_ON_ERROR

Two-edged sword: it also means you can’t inspect what went wrong by looking at the file. You’re also making the very dubious assumption that merely deleting this one file will fix everything. A few times when I’ve known something to be fallible but want to be able to inspect what it created, I’ve put in something like a `… || { touch --date=@0 $@; exit 1; }` suffix so it still fails, but first zeroes its mtime so that subsequent runs will see that it’s out of date, though the file still exists.

I’m not saying it’s wrong or a bad idea, just that it’s worth considering the implications fully rather than blindly applying it.


It's the exact same argument that comes up whenever gcc improves its optimization algorithms by exploiting undefined behavior, making some code no longer work. In both cases, the original code was fundamentally broken from the start, and the change in tooling only revealed the brokenness rather than causing it.

I would completely see such makefiles as being broken.


I guess I should've been more clear. My point is other people /using/ your Makefile, not editing it. You know what you're running. You don't know what everyone else is running. I've singled out .ONESHELL because it will be silently ignored by non-GNU make, and the makefile will then behave just differently enough to cause extremely hard-to-find bugs.

Example:

  .ONESHELL:

  sometarget: someinput
  	BASEDIR=other_expression
  	SHELLVAR=complicated_expression
  	rm -rf $$BASEDIR/$$SHELLVAR
(obviously extreme to illustrate the point, but you get the idea.)

Or the fact that $FOO is interpreted as $(F)OO without the slightest warning. And of course if you're in a recipe line, you probably meant $$FOO.
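A sketch of that parse (generated makefile; the variable name is arbitrary, GNU make assumed):

```shell
# Make reads $FOO as the one-letter variable $(F) followed by
# the literal text OO:
printf 'FOO = world\nall:\n\t@echo [$FOO]\n' > /tmp/foo.mk
out=$(make -s -f /tmp/foo.mk)
echo "$out"   # [OO] -- $(F) is undefined and expands to nothing
```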

Make certainly has some obscure variables, but of all the basic knowledge of Make you need to learn, $@ is near the top of the list (it's "target". an @ sign looks kind of like a bullseye. If you want to see it as visiting a dependency graph, it's the dependency you're currently "at").


Because it seems easy. And easy things done "wrong" require re-invention.

There should be a way to do $FOO. Well, there's a way to do it in a Makefile, but figuring that out is harder than it should be, and doesn't make sense when you finally do figure it out.

"Well that way is stupid. Here's how I'd do it:"


I always see others complain about Makefile syntax, especially the whitespace, but that is, to me, one of the smallest problems with make. For myself, the top problems are:

- No tracking of changes to build rules. The build rule of a target should be one of its dependencies.

- No protection from dirty builds. Your sources root is also your output root, and make does not help you keep it clean.

- No options for extending the build to use alternate up-to-date checks (e.g. not mtimes), caches, execution mechanisms, etc.

These all have workarounds, but they require the developer to manually include the workarounds in every build rule.
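For instance, the crudest workaround for the first point is to list the makefile itself as a prerequisite of everything it builds (a sketch; it over-rebuilds on any edit to the makefile, not only on edits to the relevant rule):

```make
# rebuild whenever the makefile changes, not only when the sources do
foo.o: foo.c Makefile
	$(CC) $(CFLAGS) -c -o $@ $<
```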


> some of my Make variables are referenced with $(VAR), and some with $$(VAR), depending on whether I want them to grab the CURRENT version of the variable or the calculated version.

Hah, my latest Makefile work has been a set of functions which generate Make-syntax output, which then gets $(eval)ed. I hear you on the debugging nightmare that this can be: does a given variable get resolved when the function is first $(call)ed, when the block gets $(eval)ed, or when the recipe is invoked? But IMHO it's not too bad to do printf-style debugging. Replace $(eval $(call ...)) with $(error $(call ...)), then work backwards from there.
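For readers following along, a minimal sketch of that pattern (all names made up; driven from the shell, GNU make assumed):

```shell
# A function that generates rule text, which then gets $(eval)ed.
# Note the $$< / $$@: they must survive the first expansion in
# $(call) so that eval sees $< / $@ as automatic variables.
mkdir -p /tmp/evaldemo && cd /tmp/evaldemo
printf 'define copy-rule\n$(1): $(1).in\n\tcp $$< $$@\nendef\n$(eval $(call copy-rule,foo))\n' > gen.mk
echo data > foo.in
make -s -f gen.mk foo
cat foo   # data
```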

It also helps to be very disciplined about immediate assignment (`var := stmt`) and to always use recipe-local variables, rather than global variables.

I do feel like all of this aspect would be cleaner in Python or Lua... but the problem is, the _rest_ of the build, which more people interact with on a daily basis, gets more complex when that happens. Because there are always the ancillary targets and recipes where normal Makefile syntax works just fine.

Thanks for the NDK reference, I'm interested in seeing other "ugly" Makefile support infrastructure for comparison :)

