Once upon a time I thought that makefiles were a cool idea. Okay, this was the early 80s, rocks were still young, and I didn’t have a version of make on any of the platforms I was using, so I wrote one. My own version of make wasn’t very good, but it was simple, did what I needed at the time, and I gave it away for free (you can probably still find it on the net. One of the reasons that Richard Stallman doesn’t like me much is that it’s close to my total contribution to Free Software. Trust me, it’s not worth the hunt).
Fast-forward a few decades and I’m wrasslin’ with makefiles large enough to have detectable gravitational pull, with dizzying levels of nested includes, wrapper programs bolting together metric buttloads of definitions from auxiliary files that were first cut in clay tablets by Hammurabi’s scribes, macro systems from hell that wrap back on themselves through higher dimensions to form legal XML, and default rules that actually reach back in time and break builds that have already succeeded (which sure explains a lot, doesn’t it?).
And yet, with these hundreds of thousands of lines of intricate and fragile declarations accreted over uncountable hardscrabble engineer-years, with the multi-hour turnaround time, the only friend at my back is an ECHO statement that lets me go back and stick tracing statements where I think the problem might have been. I don’t need ECHO, I need a time machine.
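For the uninitiated, that tracing technique looks something like the sketch below. In GNU make the `$(warning ...)` function is the closest thing to a print statement you get (NMAKE’s equivalent is the `!MESSAGE` directive); the variable and file names here are purely illustrative:

```make
# GNU make: sprinkle $(warning ...) wherever you suspect the rot.
# $(warning) prints the makefile name and line number, which helps a little.
OBJDIR := build

$(warning OBJDIR is '$(OBJDIR)', goals are '$(MAKECMDGOALS)')

all: $(OBJDIR)/foo.o

$(OBJDIR)/foo.o: foo.c
	$(warning building $@ from $^ because of $?)
	$(CC) -c -o $@ $<
```

The `$?` automatic variable (prerequisites newer than the target) is about the only built-in clue make offers as to *why* it decided to rebuild something; everything else you reconstruct by hand, after the fact.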
What I’d actually like to have is a fapping debugger, but I suspect it’s easier to build a gizmo to tear apart and reconstruct the elementary fabric of the universe than it is to interrogate the infernal interiors of NMAKE after things have gone sour. (Yes, NMAKE. Don’t get all superior on me: I’ve used GNU make as well, and while GNU make is better, this is still like expressing a preference for a particular brand of cyanide in your coffee).
A person from outside the culture of modern software engineering would respond with something like “Pull the other one,” or more likely, “Stop whining,” and that person would, sadly, be right on the money in both cases. We have no one to blame but ourselves. With all the fancy languages we employ, all the type-safety and exception safety and interface meta-languages and theorem proving, why are we jack-legging software together with rubber bands and duct tape?
Make was written in 1977, and it hasn’t fundamentally improved in decades. Instead of improvements, we got features: Powerful macro expander syntax, looping constructs, electric and non-electric variable definitions, some lame attempts at parallelism, but all of these extras just added complexity and didn’t address the everyday problem of figuring out why a build is failing two hours in.
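To make the “features instead of improvements” point concrete, here’s a hedged sketch of what those extras look like in GNU make — the two flavors of variable definition and a looping construct, with illustrative module names:

```make
# Recursively-expanded ("electric"): re-evaluated at every use,
# so this silently picks up whatever OPT means *later*.
CFLAGS = $(OPT) -Wall

# Simply-expanded: evaluated once, right here, never again.
OPT := -O2

MODULES := parser lexer codegen
# A looping construct: stamp out one source-list variable per module.
$(foreach m,$(MODULES),$(eval $(m)_SRCS := $(wildcard $(m)/*.c)))
```

Every one of these is powerful, every one adds another layer of hidden state, and not one of them tells you why the build just died two hours in.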
If I were to tell someone “I’m going to design a programming language that’s going to be used by millions of programmers every day: It’s going to have tons of hidden state, no obvious control flow, obscure and terse syntax, and programs written in it are going to run for upwards of six hours before bombing with an error message like ‘File not found’ — oh, and I’m not going to write a debugger, and all the state will be hidden and completely lost when things go wrong” — I’d be strung up in the stairwell alongside the guy who invented trigraphs.
Don’t get me started on autoconf. (Someone else wrote a nice flame; there have been others). Tools like this just paper over what’s really wrong: We have too much crap and we have to build it all the time.
Make is only part of the problem. Modern compilers are still rooted in the smelly primordial ooze of the paper tape era of computing. Well, maybe magnetic tape.
Imagine I’m building a house; to achieve this, I will be nailing some boards together. Given that I am a relatively savvy and modern software engineer, what I do is:
1. Grow a tree
2. Cut it down and drag it out of the woods behind my ox
3. Extract a board, using a pit and a great big bloody saw
4. Dry the board out in a kiln
5. Cut and plane the board to proper dimensions
6. Repeat 1-5 with another tree, resulting in another board
(I’ve omitted steps involving mining iron ore, making coke, refining the ore, smelting same, making steel, and pounding out a nail)
7. Nail the stupid boards together. Oh, you wanted glue? I don’t do glue; there are good reasons for that.
… and in about forty thousand years that house is finally assembled (which is about par for how late a lot of software projects are). This is pretty much the life-cycle of a compiler: Suck in several megabytes of header files and/or precompiled headers, process a miserable handful of ten or fifteen functions and methods, spew out some object code and a fuck-ton of debugging info, then do the whole thing over again with the next set of sources. After all that’s done, you feed it to a linker. (Don’t get me started on linkers; I did a lot of work on linkers in the 80s and 90s, talk about a thankless job…).
Of course, modern build systems get rid of some of the duplication of effort here, since they will precompile headers for you and do some dependency analysis. But I dare you to change one common structure, or touch one common header file containing, say, a list of error codes. It’s time to recompile the world; see you in a few hours.
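The dependency graph that makes this inevitable is trivial to sketch. Assume, hypothetically, a shared `errors.h` that every translation unit pulls in:

```make
# Every translation unit includes errors.h, so every object depends
# on it. Touch errors.h and *all* of these rebuild, no exceptions.
OBJS := net.o disk.o ui.o audio.o    # in real life: thousands of these

$(OBJS): %.o: %.c errors.h common.h
	$(CC) $(CFLAGS) -c -o $@ $<

game.exe: $(OBJS)
	$(CC) -o $@ $^
```

The dependency analysis isn’t wrong — those objects genuinely do depend on that header. The problem is that textual inclusion makes the dependency far coarser than the change: add one enum value and you’ve “changed” every file in the system.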
C and C++ need a module system so badly that we should pretty much stop adding features (yes, Mr. Freaky Template Thingy I’ll Never Use in a Responsible Real-World Project, I’m looking at you) and do nothing else to these languages until this is fixed. Architecturally. We need to ban #include (and no, precompiled headers are not the answer) and get a type definition and importing system that actually fucking works and that scales to tens of millions of lines of code. Once we have that, I’ll hunt down every single use of #include and #if/else/endif and club them to death.
Something absolutely magical happens when you have turnaround time that is less than about five seconds. It almost doesn’t matter what language you are programming in. If you’ve got a system that gives you five seconds from source change to running code, it’s possible you’ll forget to eat and starve at the keyboard, even if you’re hacking away in assembly.
Build times sneak up on you. Pretty much every project I’ve worked on from scratch has gone from that magical “seconds” window to “minutes” (tolerable), to ten minutes (get coffee), and somehow reaches 45 minutes to an hour (go to lunch, surf the web, do email, write documentation, attend meetings, play video games). Around the two or three hour mark, you’re talking about doing SCA re-enactments in the hallways using parts from build servers as props.
Frankly I don’t see this problem being solved any time soon, at least for the kind of dead-bits “EXE” development that happens in embedded work and high-performance cores of video games or operating systems. While it may be possible for hardware to get to the point where we can JIT and message-pass ourselves to Nirvana and forget about cache line awareness and punt global optimization, weren’t we saying that ten years ago, too?
Makefiles and text-based includes were, at their core, 70s-era hacks of convenience that went only so far. Speed of turnaround is a language feature, and you don’t have to be a dope-addled Smalltalk hacker to appreciate the beauty of dropping into the debugger, changing the structure of a structure, and continuing blithely along as if nothing extraordinary had happened. We’ll never be there with C (at least, a language that supports that probably doesn’t look very much like C), but it’s interesting to contemplate.