
> In a sense, programming is all about what your program should do in the first place. The “how” question is just the “what”, moved down the chain of abstractions until it ends up where a computer can understand it, and at that point, the three words “multichannel audio support” have become those 9,000 lines that describe in perfect detail what's going on.

This closing paragraph is excellent.

While I do wonder whether the author's progress along these lines wouldn't have been accelerated if more of this evolution had taken place before the first line of code, the overall message here is very true and very well put.

When I made this realization for myself, it was a major turning point in the quality, speed, maintainability, and usability of the code I produce, especially when I can successfully define the final code structure to directly reflect this progressive refinement from intent to implementation.




> we continually add features in the name of "flexibility" until we have an unwieldy mess requiring HTML, CSS, Javascript, a framework, and various middleware pieces to make a site. We make a new library to abstract away the complexity, but the cycle repeats and that library is too complex, so a new library is created on top of it to simplify.

Well put.

That cycle of layering (abstraction, addition, complexity, and then another layer of abstraction) has been going on since we invented the programmable computer. Machine language got us started, but it was, to put it mildly, a bit on the hard side, so we came up with assembly language. That was fine on simple chips with small instruction sets, but then features were added and platforms multiplied, so we abstracted away the complexity with C. Then we started to solve problems for end users, so we added text- and graphics-mode libraries and the whole thing grew again. Then graphical OSes came along, and suddenly object orientation and event handling made a lot of sense, so we evolved various languages to do that, including C++. All built on the layer below, and the layer below that, and so on; underneath it all we're directing electrons with little gates made from transistors.

The point is, everyone at any stage of the evolution of computer programming has wished for what you want: a simpler, more expressive language with less 'busy work' (for want of a better term.) Someone creates it, we're happy for a bit and we make amazing things with our new toys, then the pain starts again as the more visionary people among us (like your good self) start to think about what problems they could solve if only they didn't have so much dreadful housekeeping to do and all this stuff to wrangle.

Think about what your world - the web - is built on!


>I've been writing software almost non-stop for close to 25 years, and I still don't understand most of the code written by others.

I think you nailed it. In my experience, the increase in complexity isn't necessarily inherent to the amount of code so much as the amount of abstraction used as code size increases.

I've worked on some large codebases that only used simple, well-understood and/or well-documented abstractions, and they didn't feel nearly as complex as other codebases whose abstractions were more complicated than they were worth.


> Imagine that the requirements are that your program does A, B, C, D and E, in that order.

I find that every year I write software, the less my software looks like "do step A, then B, then ...". It's always becoming more functional and declarative. I'm not sure there's any function in my entire program that has to do 5 high-level calls in order like that.
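To make that concrete, here's a contrived Python sketch (the domain and all the names are invented) of the same computation written first as ordered steps and then declaratively:

```python
# Imperative: an explicit sequence of steps -- "do A, then B, then C".
def total_with_tax_imperative(orders):
    valid = []
    for order in orders:            # step A: filter out invalid orders
        if order["amount"] > 0:
            valid.append(order)
    taxed = []
    for order in valid:             # step B: apply a 20% tax
        taxed.append(order["amount"] * 1.2)
    return sum(taxed)               # step C: aggregate

# Declarative: describe the result; no function runs "5 calls in order".
def total_with_tax(orders):
    return sum(o["amount"] * 1.2 for o in orders if o["amount"] > 0)
```

Both return the same value; the second just states what the total is rather than how to build it.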

Without hearing what A/B/C/D/E actually are, this sounds almost straw-man-ish, or perhaps architecture-astronaut-ish.


> I think the promise here is the ability to code in a more conceptual way with less fiddling with the finicky details.

This is basically how product managers code. Or former engineers turned engineering managers. Or even team leads. Hell, maybe like an architect?

You come up with a rough sketch, design the system, think through a couple edge cases, tell the computer what you need, and the computer figures out the details for you. Similar to being a high level engineer that designs/defines/codes the broad strokes of something and then lets the lower level minions handle details.

We made a similar leap when compilers were invented.


> - Programming simply became more complex. It used to take one line of code to do something. Between the OS and languages requiring more (initialization, boilerplate code, etc.)

It's important to acknowledge that we have done this to ourselves. The amount of job-justifying, unnecessary complexity found in today's programming environments makes me wonder how the field hasn't toppled over itself yet.


> it is inevitable that things like that will be written. And why not?

We can't just blame the programming language for programmers not understanding how computers work.

I think as computers have gotten faster, and languages higher level, we've stopped talking about computers as mechanical devices. And this is a really important perspective to have.

Can you answer these questions about your program?

- How big is your binary / JS bundle? What parts take up most of the space?

- When your program runs, what does the computer spend most of its time executing? What parts of your program are the slowest, and why?

- How big is the memory footprint? Which parts of your program use the most RAM?

- For binary programs (like Rust / Go / C), which patterns are easier or harder for the optimizer?

- If there are two ways to design your code, how do you discover which approach will run faster?

This stuff shouldn't be considered advanced concepts. An architect understands the building they've designed. A chef knows what their food tastes like. When you program, you're making a thing. You should understand what you made and how it will be executed on the computer.
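None of this requires exotic tooling, either. As one sketch of how you might start answering those questions (in Python here; the actual numbers will vary by machine and version), the standard library alone gets you surprisingly far:

```python
import sys
import timeit
import tracemalloc

# How big is a data structure? (shallow size of the list object, in bytes)
nums = list(range(10_000))
print("list object size:", sys.getsizeof(nums))

# Which parts of the program allocate the most memory?
tracemalloc.start()
data = [str(i) * 10 for i in range(1_000)]
current, peak = tracemalloc.get_traced_memory()
tracemalloc.stop()
print("peak allocation since start():", peak, "bytes")

# If there are two ways to design the code, measure -- don't guess.
t_concat = timeit.timeit("s = ''\nfor i in range(100): s += str(i)", number=1_000)
t_join = timeit.timeit("s = ''.join(str(i) for i in range(100))", number=1_000)
print("concat:", t_concat, "join:", t_join)
```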


>The challenge wasn't overwhelming complexity, as it is today. The challenge was cramming your ideas into machines so slow, so limited that most ideas didn't fit.

I like this line right here. It does seem like we've piled abstraction on abstraction these days. Sure, this makes things easier, but I think things have gotten so complex that it's much harder to have a complete mental model of what your code is actually doing than it was on the simpler machines of the past.


> If you wanted to make a program, what you did was start by writing some code from scratch. You used about 8 concepts—strings, integers, floats, variables, conditions, loops, functions, modules2—to make your thing, like baking bread from raw ingredients.

Somehow a key part, I/O, is missing from this description. For your system to work, you have to fit into the APIs/constraints of the system within which yours runs.

> Are large language models gonna cause programmers to lose their jobs? Not anymore than StackOverflow did, in my view. However, it’s going to change them…somewhat.

Disagree.

That will hold true until someone invents the 8 concepts with which one can create a whole app from its conceptual description alone. That person would not lose their job; the 9 devs who would otherwise have had to implement that app without this approach will.

I foresee a future where 1 team with the right tools replaces today's 10 people. So some will lose their jobs, and some will secure theirs further.

> But like an IDE, or a framework, or a test harness, utility here requires skill on the part of the operator—and not just ChatGPT jockeying skill: programming skill. Existing subject matter expertise.

Agree. You can't assess the quality of an LLM's response if you don't at least speak the language it replies in (or understand the web of concepts hiding behind the words).

---

Also, I did not like this passage. In my experience the book is quite useful and does not deserve to be mentioned in such a context.

> First of all, that culture has its infiltrators. We got self-styled prophets writing whole books about their personal philosophies and slapping general-purpose-sounding names on them like “Clean Code.” It reminds me of the dudes who ran around the Middle East 2,000 years ago claiming they personally could introduce you to G-d.

It reads like a personal grudge against the author or their work.

---

Other than that it's good solid writing. I enjoyed it.


>the first thing I want to know is what it actually does, step by step, at low level

I feel like we might be touching on some core differences between the top-down guys and the bottom-up guys. When I read low level code, what I'm trying to do is figure out what this code accomplishes, distinct from "what it's doing". Once I figure it out and can sum up its purpose in a short slogan, I mentally paper over that section with the slogan. Essentially I am reconstructing the higher level narrative from the low level code.

And this is precisely why I advocate for more abstractions with names that describe their behavior; if the structure and the naming of the code provide me with these purposeful slogans for units of work, that's a massive win in terms of the effort needed to comprehend the code. I wonder whether the way the bottom-up guys understand code is substantially different. Does your mental model of code resolve to "purposeful slogans" as stand-ins for low-level code, or does your mental model mostly hold on to the low-level detail even when reasoning about the high level?
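A contrived Python example of what I mean (all names invented): the first version forces the reader to reconstruct the narrative from the mechanics; the second hands it to them as named slogans.

```python
# Bottom-up reading required: the reader must work out what this accomplishes.
def summarize(readings):
    total, count = 0.0, 0
    for r in readings:
        if r is not None and r >= 0:
            total += r
            count += 1
    return total / count if count else 0.0

# The same logic with the "purposeful slogans" extracted as names:
# average of the valid readings.
def is_valid(reading):
    return reading is not None and reading >= 0

def average(values):
    values = list(values)
    return sum(values) / len(values) if values else 0.0

def summarize_named(readings):
    return average(r for r in readings if is_valid(r))
```

Once you've read `average` and `is_valid` once, you can mentally paper over them everywhere they appear.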


> As I prepared my presentation, I found myself falling into my usual pattern when trying to really understand a piece of code—in order to grok it I have to essentially rewrite it. I’ll start by renaming a few things so they make more sense to me and then I’ll move things around to suit my ideas about how to organize code. Pretty soon I’ll have gotten deep into the abstractions (or lack thereof) of the code and will start making bigger changes to the structure of the code. Once I’ve completely rewritten the thing I usually understand it pretty well and can even go back to the original and understand it too. I have always felt kind of bad about this approach to code reading but it's the only thing that's ever worked for me.

This has also been my experience over the last thirty-odd years, whenever I have to take over maintenance of someone else's code.


> I used to be excited by programming language features instead of what problem I was actually trying to solve with programming. I'd spend hours condensing 10 lines of perfectly working code into 1 line of the most concise text possible (...) they should be impressed by what the program does for them, not what language features you used to implement it.

Interestingly, what made me go through a similar evolution was the very language in which I was trying to do all those things, namely Scala. After a few years of trying to be "smart", I realized that the problem was usually bigger than the language.

So perhaps it wasn't Go, nor Scala, that helped us in our realization, but life and experience?


> It's amazing how much structure you can actually give yourself with modern tools.

Like a compiler? >:)


> In my opinion mainstream software will be abstracted away to the point where you are working on defining and refining requirements. And will likely look very different.

In some sense, the same thing could have been said 50 years ago, and the prediction came true. Compare what a typical programmer writes on a typical day, then and now, to the instructions that the machine executes as a result, then and now. I'm with Dijkstra [0] (HN thread [1]), in thinking that we'll always need something more formal (simpler) than natural language to express computation, even if it's only to ask things in a way SkyNet will respond to nicely.

[0] http://www.cs.utexas.edu/users/EWD/transcriptions/EWD06xx/EW...

[1] https://news.ycombinator.com/item?id=24529900


> I mention all this to point out how intrigued I am by what looks like the emphasis that was placed back then on clarity from the human’s perspective, be it programmer, user or anyone else involved in the software’s development and use.

I wasn't born yet in the 1970s, but I think the fact that "code is written for people to read" is something that people in earlier decades knew intimately, and that we have "forgotten" in recent years.

It hinges on a very simple fact: we know that machine code is for machines to execute, and machine code has existed for as long as the machines themselves. And yet in the subsequent decades, people spent so much effort designing and inventing programming languages, and so much effort writing interpreters and compilers -- it's got to be for a good reason, right?

Once you think of it that way, the reason is obvious. Code written in programming languages is for people to read (and write); otherwise we'd just work on the binary executable.

I guess these days in the stacks of abstraction, we don't even know what's running on the bare metal any more, and for novices it might feel like "abstractions all the way down", and the point that the abstractions were originally for human consumption might have been lost or forgotten.


> You can go all the way from adding some lightweight formal specifications a posteriori with e.g. a fancy type system, to developing all your code formally by starting with some axioms.

This statement particularly interested me. If I wanted to learn more about this, what resources do you recommend?

If you ask how deep I want to go, let's say we have a scale of abstraction that's from 1 to 10, where 1 is a two paragraph executive summary and 10 is an entire Ph.D's worth of training. I'd probably be looking at a 2.5 to a 5.


> I have found the whole Elm experience quite mindblowing; if it compiles, then it just seems to work.

My mind-blowing experience was facilitated by referential transparency: a page of code became half a page after I extracted some functions and restructured the code, and then that half page collapsed into several lines of boring code when I realized the functionality could be re-expressed in terms of standard functions.

It is amazing to be able to think about an entire branch of code in isolation and be able to understand it in its entirety.
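That collapse looks something like this in any language with first-class functions -- a made-up Python example rather than Elm, but the shape is the same:

```python
# Before: a hand-rolled nest of loops doing everything by hand.
def longest_word_length_loop(text):
    best = 0
    for word in text.split():
        length = 0
        for _ in word:
            length += 1
        if length > best:
            best = length
    return best

# After: the same functionality re-expressed in terms of standard
# functions -- safe to do because every piece is a pure expression.
def longest_word_length(text):
    return max(map(len, text.split()), default=0)
```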


> Most programmers are complexity junkies getting paid to feed their habit.

Maybe more so today, but historically I've come to appreciate programmers as megalomaniacal control freaks (myself included), an impulse that programming lets them exercise within the harmless confines of an imaginary world hosted by microprocessors and the attached storage.

It has been evolving, though. Computers became networked, and now software controls much of the real world; it's far less harmless than before. We also used to strive to minimize complexity, though I'm not sure how much of that was intentional and how much was just a consequence of limited space while RAM was so expensive.


>It took a bit, but eventually I came to understand that he is so good at reading code that he just wants it all laid out in front of him as densely as possible. He can effectively run it in his head, as long as he can see it.

In some ways, I also benefit from this. If I am working on the same code every day, I typically have it all loaded in my head anyway. If I am coming to a codebase for the first time, or coming back to it after working on something else for a long time, it is very helpful to have as much of the code as possible fit in one screen.

That said, there are tradeoffs for sure. Whitespace is not much help for me, but clearly named variables (e.g. `area` instead of `a`) are essential. Then again, some short variables are very common and clear to me in specific patterns; for example, in `for i, board in enumerate(boards)` I have no problem realizing immediately that `i` is short for `index` wherever I see it in the for-loop body.

Three specific pieces of advice here:

1. Use an autoformatter to convert code to your preferred style when you need to read it. Don't enforce that on the team without consensus, though.

2. Everyone who can afford it should buy larger monitors. 4k 32" will fit a lot of lines of code, even if it's not quite as dense.

3. Don't stress about style consistency too hard, but do attempt to match the existing style in any given project.


>> I'd much rather be able to write 10,000 lines of code that can do what your million lines of code does.

Sure, who wouldn't. Unfortunately this is my hypothetical and I get to control what I mean by it. The million lines of code in my hypothetical is good quality, maintainable with reasonable density.

>> Better programming languages, libraries, and other abstractions are what we need.

In the entire history of languages, we've only managed about a 10X improvement via these mechanisms (and that is probably being charitable). Several important things are still written in C, which would mostly be recognizable to a programmer from 40 years ago. There are still problems to solve, but I feel we are on the asymptotic section of the curve in this regard.
