>Between Hackers, Masters of Doom, and other rad 90s hacker-coder media, software development really seemed like a much more awesome career than it turned out to be.
I think it comes down to the fact that our industry has become rigid and beholden to the university education system. John Carmack and John Romero were both college dropouts. Their stories would probably be impossible today. What we have now is a world of people coloring between the lines and going straight from one set of rules to another. There truly is no more punk rock left in tech.
> There are simply a lot of uninspired developers who are just in it for the money now.
Personally, I'm happy that the industry is maturing to the point where people can enter it as a career rather than a ~~passion~~.
This mentality that software devs should have GitHub repos loaded up with side projects and live and breathe code all the time is super toxic for the industry.
By the way there are plenty of talented coders who only code at work.
And there are likely plenty of hacks who code all the time but never learned how to code well, work with a team, scale a project, or any of a host of other skills that are necessary for building commercial products.
> I sincerely feel bad for people who have to stay in it.
I'm in my 50s, and I've been getting paid to build software for over 27 years now, and started programming on a near daily basis on a TRS-80 in the 1970s.
Please don't feel sorry for me. I still very much enjoy building software.
I won't necessarily disagree with the particular challenges you cite.
The total complexity is enormous now, but the tools and abstractions, variable in quality and carrying complexities of their own, are tremendously powerful and, though it often doesn't feel like it, effective.
Beyond the specifics, I acknowledge that 'the industry' has evolved in many ways and directions, not all of which are positive.
I face non-technical challenges today that, had I known about them 30 years ago, would have probably caused me to switch careers.
So, I acknowledge what you're saying, and to it I will only add this: each of us exerts a great deal of control over how we perceive the world around us. It's possible that the difference in the way you and I 'grade' the software industry is that I am, for no knowable or particular reason, more fundamentally optimistic about things than you. It's also possible that, primarily for reasons beyond our control, I've ended up working in more positive organizations.
To other people reading this far: two specific anecdotes about how the software industry has changed over three decades provide very little concrete indication of how your personal experiences are likely to go.
> programming as a job has become very laborious and boring, processes driven and needlessly complex
It's been mostly my experience, yes. I think it's the inevitable outcome of droves of people getting into software companies because of the salaries, and pushing mostly unnecessary processes and red tape onto devs. These days I barely have freedom to touch a staging DB, and I don't even work in a large org.
>>the fact that he's coding since he was a kid and spent hours upon hours in front of a screen and keyboard writing code.
Stop being impressed by people who do lots of work. For every John Carmack putting in long hours every day and winning, there are thousands who worked equally hard, if not harder, and failed. Some people just get lucky; almost everybody else doesn't.
Don't be impressed by survivorship bias. If you had fun playing games and hanging out with your friends, it was time well spent.
If you want to work hard and build software, you have your whole life ahead of you and you can do that as well. But as a middle-aged programmer, if I were starting out, I would give my health, relationships, and retirement higher priority than lines of code as a measure of success.
> What does make me lose sleep some nights is what would happen if software development simply went away. It sounds ridiculous, but it’s something that I think about from time to time. Industrial workers in the 20th century probably never imagined being replaced by robots, but it’s happened on a large scale.
The technology to render your programming skills obsolete is right around the corner.
> Is this indicative of a cultural shift in the industry?
Yes I think the industry is changing.
In my anecdotal experience, the quality of programmers has taken a nosedive. I think part of the reason is that the industry has been lucrative for long enough that parents have had time to coach their children into it.
In the early-to-mid 2000s (when I cut my teeth), and especially in the 80s/90s (from what I hear), you came across more "hacker" types who were doing this work for the love of it. Yeah, you still had the "Initech"-type companies that would outsource, etc., but cutting-edge programming work was a lot easier to find.
These days I'm seeing more and more people that treat programming as a "job", not a "passion".
Obviously I think this is a good thing for people, in general.
But I also think that, while the potential for software is at its highest, the actual relative quality of software is at its absolute lowest. This is due in large part to the bad quality of software engineering these days. Again, very anecdotal.
> I want to work with more people who LOVE software and find the development of machines and the code that runs on them as fascinating as I do. Unfortunately, its less and less these days.
While I like software, I'm beginning to see it as a rather bad career. You make more money early in your career, but the advantage fades fast; other jobs catch up quickly and then surpass you.
I guess there's always the exceptional company, etc., but outside the US it can be taxing to watch that de-growth.
> There is a world out there about compilers, run-times, drivers, emulators, VMs and OSes that I can't seem to grasp, it is just too complicated for me.
If you've already decided that you can't do something because it's just too complicated, there's a strong chance you'll be proven right. In my experience, everything we software people deal with, down to networking, hardware drivers, task scheduling, you name it, was created and explained by humans. Nature only constrains us with time, space, and quantum mechanics. If another human was smart enough to create this stuff ex nihilo, then with some work you can certainly be smart enough to understand it.
> My Computer Science degree and the newer degrees do not focus on these, they have become 4 year long coding boot camps that focus on getting people job ready with AWS, React, Ruby or whatever trendy. This is true for the majority of the not so prestigious universities out there.
No one is going to give you a checklist for how to be creative or pursue your own goals. You have to do it. If you're capable of asking this on HN, I think you're capable of setting a few goals. Just write them down and put the deadline a long ways away.
> I see a pattern and I feel the older generation is way more capable and knowledgeable when it comes to Computer Science in general. I am pretty sure a lot of people are or were in the same boat as me. I don't really have the time(Have wife & kids) to take a compiler course or set out to build my own OS, what would be the easiest way to learn these lower level things?
I'm in my thirties. When I was in school, there were a lot of kids who were just punching the requirements and doing what they were told to get by. Any time they got stuck, they'd have to ask the professor or TA in office hours. Others of us learned that we could solve a lot of problems ourselves if we cut them down to their essences and wrote small programs to see what was actually going on. That heuristic was very helpful.
Once I really did get stuck after doing this, and went to office hours. My professor (an old school Unix graybeard) was so thankful that I had put in the leg work of examining the problem with "toy programs" as he called them that he gave me an exceptionally detailed explanation for what was going on, and how I could show this to myself. He made the comment that a lack of curiosity was a problem among my peers, but that it can be grown and cultivated like other skills.
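That "toy program" habit is easy to demonstrate. As a made-up illustration (Python purely for brevity; the thread names no particular language), a few lines can answer questions you might otherwise guess at, such as whether the built-in sort is stable, or whether decimal fractions add exactly in binary floating point:

```python
# Toy program 1: is the built-in sort stable?
# Sort pairs by their first element only, then check that items
# with equal keys kept their original relative order.
pairs = [(1, "a"), (0, "b"), (1, "c"), (0, "d")]
pairs.sort(key=lambda p: p[0])
print(pairs)  # (0,'b') before (0,'d'), (1,'a') before (1,'c') -> stable

# Toy program 2: does 0.1 + 0.2 equal 0.3 in binary floating point?
print(0.1 + 0.2 == 0.3)  # False
print(repr(0.1 + 0.2))   # 0.30000000000000004
```

Ten seconds of running something like this beats an hour of speculating, and the answer sticks with you because you showed it to yourself.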
One other thing the older generations did better than we do is look at the history of our field. If you don't know who Alan Kay, Doug Engelbart, Niklaus Wirth, Alan Perlis, Ivan Sutherland, Leslie Lamport, Margaret Hamilton or Ted Nelson are, go find some information about them (most have talks or demos on YouTube). Then find out the rest of the missing pantheon of computing. Our industry goes through fads in a sort of fashion culture, and in my experience that leads people to feel burned out or stuck exactly as you described. There are some enduring truths which our field does a poor job of passing on to posterity. It turns out more people had interesting ideas beyond Dijkstra and Turing.
> 1. Reading the Dragon Book (I find this would be useful only once you have built a bad compiler)
> 2. Paying people older than 40-45 to teach me some C/C++ and some tricks of the trade (expensive, but I gained a huge amount of knowledge in a short time)
My advice: pick a small, interesting problem and tackle it for some set amount of time each day, maybe after the kids are in bed or your wife is enjoying a bubble bath or whatever. It doesn't have to be long - maybe a half hour to an hour. Just do it consistently.
I've found that GitHub, YouTube and HN itself have loads of interesting project ideas and tutorials. I find those much better than just reading a book. In those cases, the book becomes supplementary material, and is usually most useful at exploring theoretical issues, edge cases, history and other more detailed topics.
If you are going to read, always do the exercises. Also, keep in mind that many of the recommended books, like the Dragon Book or SICP, are typically very thorough treatments but may not make for a great read. (There are exceptions: I find Tanenbaum's OS books highly engaging, interesting and helpful. He is both a practitioner and a theoretician, so I find his perspective valuable.)
Not only is my understanding better with experience (rather than just reading), but I find that I feel accomplished, which is a virtuous cycle. And I also find that I learn enough to be able to discover other interesting things for myself.
> Two, the software talent market is bifurcated. There is basically commodity development of crud apps, and technically complex novel development.
I’ve been making this point for years. I think it’s telling that other nearby disciplines are bifurcated: electrical engineers vs electricians, architects vs builders, etc. The virtues of a good electrician are that they’re reliable, have good customer service, and work efficiently. A good building company can put a building up on time and on budget. But great architecture doesn’t share those values. The best architecture is bold and creative, and still has people talking years later.
I think this split already exists in programming. We just don’t admit it to ourselves. It’s different work inventing React vs using it to make a website. Inventing React, or Rust, or Redis requires considered boldness. It takes time to nurture the right insights and get it right. In contrast, the virtues of a good web consultancy team look more like the virtues of an electrician: good customer service, clear estimates, work delivered on time and on budget.
But we’re so enamoured with the idea of ourselves as “elite hackers” that clock in / clock out programmers can’t give up the mantle. I get it. But it’s stupid. There’s no point evaluating candidates on their ability to reimplement a btree from scratch if their job is to style buttons. You’re evaluating people for a different job. Test their software estimation skills. Their social skills. Ask about workflow and productivity.
Essays like this one ask “where did all the hackers go?” They’re still around; they’re just harder to find now. They went into weird crypto projects. Llama.cpp. Obsessive database projects like ScyllaDB. They’re inventing Rust, and finding security vulnerabilities in io_uring for that sweet Google bug bounty money. They’re in the demoscene or making programmatic art for Burning Man.
Do you need real hackers at your company? It depends. Are you making an apartment building (on time, on budget) or the Sydney Opera House - which is iconic now, but was almost never finished because of delays and creative disagreements? They aren’t the same kind of work.
> They keep encountering highly flawed websites or web apps and are assuming that regular web devs can do something about it.
This is the reality that has been grinding me down of late. Nor is it limited to web stuff; I see it in embedded and other market areas too.
I was excited over the last few decades to see the field of programmers swell. I have always loved the craft of good software development. I was so excited to see so many others also get into it and naively believed that many more people participating would lead to a larger pool of ideas to draw from and everyone would get on board in some sort of utopian democratic industry wide kumbaya. During the rise of open source, I even felt it was really happening.
> Right now coders are both masons(brick layers) & architects
Not sure how common this is today, but I do remember the time when software development was done by people writing hundreds of pages of specifications of various level of detail (including class diagrams, method names etc) and then delegating the menial work to very junior folks. In many cases I witnessed true contempt for the mere job of transcribing programs.
I'm not talking of some dark age of punchcards; but clearly learning the boring details of the Java SDK and figuring out how to setup your Eclipse project (let alone using CVS/SVN) was deemed to be below the pay level of many.
The quality of the software produced in those environments was abysmal. I'm not against specs in general, but against the mistake of overspecifying to the extent that you think you need a quasi-mechanical transcriber (an undergrad student or a very junior dev), while at the same time not really specifying everything, and thus le
>However, some people find computers frustrating, stupid or boring. Such a person probably wouldn't enjoy a job working with them all the time - and it's difficult to build the skills that let you advance your career if you're averse to or bored by practising them.
It's funny really because I thought I was perfect for software development as a teenager. Already knew how to program, spent a huge amount of my free time on a computer, good at maths and sciences in general, etc.
Once I actually got into software development I found the whole thing rather irksome. The company I worked for was a great place to work full of really lovely people, free lunches, paid internships (yay NZ), etc. But software development? Nope. Awful industry. Computers are shit.
As soon as I got out of software, I immediately found my interest in programming as a hobby come back. Unsurprisingly people don't really like doing something as a hobby when they're paid to do it for 40 hours per week already.
> the actual work of being a developer/programmer in a professional setting is a really shitty job and sad life
The thing is, it doesn't have to be and, until relatively recently, honestly wasn't. It's a shitty job and a sad life because we have open offices, ticket-tracking systems and daily standups. There was a time when programming was exciting and rewarding.
They are still around, I'm in a community with a few of them. They are just quietly working on small relatively unknown hobby projects in their spare time, while simply putting in their hours to get their paycheck.
The big companies have so much capital that they can easily capture a market and grab mind share; the small hacker types building genuinely good, useful software have zero chance of getting anywhere against that.
Software companies are controlled by MBA types, and as such the hacker types get very little input into the development and direction of software. Having to build software for the sole purpose of making the shareholders more money is demoralizing. Hacker types want to make cool useful software that benefits users, which unfortunately isn't very profitable.
Modern programming culture is cult-like in its fad chasing and full of "Best Practices" grifters. There is a ton of useless cruft and friction in software development, however if you're not chasing the latest fads and not building software according to the latest religious writings, you'll get excommunicated.
They're still around, they've just been pushed out in the name of shareholder value and "Best Practices".
> I think lots of devs feel weird about coding for 20 or less hours a week because it’s the part we associate with work when in reality we’re delivering real business value in those stupid meetings, it just manifests outside our view.
Blink twice if you’re saying this under duress, because otherwise, this just sounds like post hoc justification from someone who gave up and dedicated themselves to promulgating the soul-sucking management games that made producing real value untenable.
Spending less than 20 hours a week actually writing software is a horrendous use of a developer’s time. It’s a marvel when anything intelligent gets written at all in such an environment.
> Software is one of the best careers in the world.
In theory it is. In practice...well, let's ask game developers for example. Or other groups who end up with a real-world job they end up hating.
> Maybe they are getting lost along the way earlier in the pipeline
Quite likely.
> because tech is synonymous with the kind of people who like to talk about things like DeepCreamPy. Maybe they walk into that freshman CS class and see 30 other dudes that fit this profile and give them creepy unwanted attention, and they drop it and switch to biology.
Yeah, that's not happening. First, because for example in my country there's no such thing as "switching to biology", so you can't lose them this way. And yet, there's something like 7% of women in my country's informatics-related university programmes. Furthermore, those 7% are 7% of applicants, so there's no "maybe they [walked] into that freshman CS class" because those 7% haven't even seen that freshman class yet. So while you're quite likely correct in that "they are getting lost along the way earlier in the pipeline", it's definitely WAY earlier than you postulate in at least some quite large populations in the world.
> Taking a step back, I personally question why being able to code should be considered important?
The ability to code gives you a particular mindset that makes it much easier to work with computer technologies. It's helpful, but on the other hand you could argue that all the other technologies we use in daily life have been sufficiently abstracted away that you can get away with not knowing how they work.
The real reason coding is a big topic in society is that, for the past two or three decades, programming has been an easy way to move up one or two socioeconomic classes. Most well-paying fields are nowhere near as easy in terms of workload or required education. Of course, if everyone became a programmer the payoff wouldn't be as high anymore, but there's lots of money to be made right now by pointing at college dropouts turned tech billionaires, or fat FAANG paychecks, and selling people coding bootcamps.
(Or, in less cynical terms: the demand for software developers still outpaces supply; this manifests as a background social pressure to create more software developers.)
> during the 90s the idea seemed to be that that the next step after the outsourcing was actually no programmers at all
Lest someone think this is an exaggeration, no, this was literally the conversation at the highest levels of some companies. A relative told me to reconsider going after a comp sci degree in 1995 because within ten years the computers would be programming themselves. There were presentations at their workplace about it.
My first "real" job was programming customizations hacked on top of a 4gl tool - injecting JavaScript into pages through an improperly sanitized <label> widget! After the whole application was rewritten in the 4gl tool, it never lived up to its promise.
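For anyone who hasn't run into this class of bug: "improperly sanitized" means user text is interpolated into markup without escaping, so any HTML in it gets interpreted by the browser. A minimal sketch in Python (purely illustrative; the original widget was part of a proprietary 4GL, so these function names are hypothetical):

```python
import html

def render_label_unsafe(user_text: str) -> str:
    # Vulnerable: user text is dropped straight into the markup,
    # so a <script> payload survives intact and will execute.
    return f"<label>{user_text}</label>"

def render_label_safe(user_text: str) -> str:
    # Escaping turns <, >, & (and quotes) into entities, so the
    # browser renders the payload as inert text instead of running it.
    return f"<label>{html.escape(user_text)}</label>"

payload = "<script>alert(1)</script>"
print(render_label_unsafe(payload))  # script tag survives unescaped
print(render_label_safe(payload))    # &lt;script&gt;... rendered as text
```

Same idea whatever the stack: escape at the point where untrusted text meets markup, not somewhere upstream where you can forget a path.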