
You might be surprised how many people don't understand the bit-level basics these days. They're not really the focus anymore, and they probably shouldn't be. The point of advancing technology is to push the mundane, low-level difficulties away to make bigger concepts/abstractions easier to piece together and mentally bear.

From what I've seen with most recent grads, the education is shifting more and more towards algorithms, with experience mostly involving the use of existing libraries/frameworks rather than the lower-level implementations that we "old-timers" were forced to write ourselves, thanks to the lack of freely usable code. I think GitHub, StackOverflow, and Google have changed the mental model of software development significantly. I don't think that's a bad thing at all, since it should free up some beans, especially for someone new to the field.

Not knowing this will bite you eventually, but it’s fairly trivial to work out.




I think there is one interesting angle to this problem.

I am someone who grew up with the technology as the levels of abstraction were being added. I am now benefiting from all those accumulated decades of knowledge.

As the IT / development world was changing, I had the enormous privilege and comfort of learning things at the pace they were happening: being able to assimilate changes over long decades, witness the problems and logic behind all those new solutions, and understand how we came to have JavaScript, the browser mess we are in, and so many other curious features of today's digital world.

I understand pretty much all of the layers of computing, from how CPUs achieve some of the things they do, to bus protocols, instructions, physical memory, low-level and high-level OS internals, virtual memory, how userspace communicates with the OS, programming language runtimes and linking, shared libraries, IPC, networking, virtualization, etc.

The issue, as with any automation, is that new players on the scene (younger devs, devops, etc.) simply have no chance to learn the same things and go through the same path.

For them, spending a decade working with a low-level programming language before jumping into a high-level one is simply not an option.

We, the people who really understand the technology the world runs on, are a slowly dying breed. We are still here as tech leads, managers, directors, business owners. But there will come a point when we retire, and there will be only a precious few people who had the perseverance to really understand all those things by diving into obscure, historical manuals.


Honestly, maybe a majority of modern engineers not being familiar with anything but the pointy tip of the stack is a good thing.

I come very much from the old world - I learned to code on graph paper, as that was how you saved your work, and being able to wield a scope and iron was kinda mandatory for being able to meaningfully use a computer.

As tech grew up, so did I - and while it's useful to be able to understand what is happening from clicking a button in a GUI down to electrons tunnelling through a bandgap, particularly when it comes to debugging the truly arcane, I actually find that the level of abstraction I carry around in my head sometimes gets in the way.

I look at newer techies bouncing around with their lofty pyramids of containerised, cloud-based, abstracted infrastructures, and I almost envy the superficiality of their engagement - I can't help but look at it and see immense complexity, because I see behind the simple and intuitive userland, and that makes me run for vim and some nice bare metal where I know what my hardware is doing.

Maybe I’m just getting old.


I agree and disagree here. I'm certainly not old enough to weigh in on the wisdom/experience aspect yet, but from what I have seen in my time in this industry, investing your time in whatever is current is a surefire way to get left behind. The technology that sticks around over the years is the stuff other people would rather take shortcuts to use, or build wrappers upon wrappers upon wrappers around.

With every abstraction away from the core computing libraries, you take a bigger risk and enter into a bigger gamble with your time.

I certainly know I am keeping miles away from Node and WebAssembly (although the latter is interesting), simply because I trained under a guy who focused his efforts not on acquiring fancy knowledge, but on using simple techniques to achieve complex results.

Soon, the very simple things I learned to appreciate ended up being things I now use every single day to do stuff far more complicated than a person of my intellect should be able to accomplish.

Anyway, this got kind of sidetracked along the way, but I have to work with colleagues who have cemented years of knowledge in archaic tech that, while current at the time, ended up fading away over the years. They still have knowledge and skills, but they let their experience shackle their thinking, and oftentimes come up with extremely complicated solutions to very simple problems.

I think this is also the point in their lives where they got comfortable enough to think they don't need additional training, whereas I was very much taught that there is no limit to the knowledge that can be acquired.


The problem is, when everybody thinks like this, it stays like this. With rising age, people tend to move into people-management jobs if they can, because somehow the air gets thinner: there are not that many advanced software jobs, although there certainly are a lot, and even more places that do appreciate advanced abilities. On the other hand, a lot of skills/experiences are transferable. E.g. some weird DB connection keep-alive bug might be analogous to debug in C++, Node.js, and Rust. Developers without enough experience might give up, change DBMS, refactor the whole DB layer (hoping it's MVC-ish), or even abandon the project if it's not an important one.

Also, most new tech builds on the learnings of old tech and discards outdated patterns that were possible with existing stacks, though advanced users might have stopped using those anyway. For example, Java has no multiple (class) inheritance, in contrast to the much older C++.
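
To make that Java example concrete, here is a minimal, hypothetical sketch (the class and interface names are invented for illustration): a Java class can extend only one class but implement any number of interfaces, which is the pattern that took over the role C++ fills with multiple class inheritance.

    // Hypothetical sketch: one superclass, any number of interfaces.
    interface Logger {
        default void log(String msg) { System.out.println("[log] " + msg); }
    }

    interface KeepAlive {
        void ping();
    }

    class Connection {
        void open() { System.out.println("connection opened"); }
    }

    // extends exactly one class, implements as many interfaces as needed
    public class DbConnection extends Connection implements Logger, KeepAlive {
        @Override
        public void ping() { log("keep-alive ping sent"); }

        public static void main(String[] args) {
            DbConnection c = new DbConnection();
            c.open();
            c.ping();
        }
    }

In C++ the shared behaviour could come from several concrete base classes at once; Java pushes it into interfaces instead (with default methods since Java 8), sidestepping most of the diamond-inheritance ambiguities the older model allowed.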


There really is nothing new under the sun. Below the furiously churning surface of the programming world are calm waters barely disturbed in the past 20+ years. And it's not all that far below, either. I've come to resist learning new techs not because it is hard for me at my age, but because it is too easy, and it is easy to get sucked into spending all my time learning the latest churny surface gloss on old ideas without ever exploiting a particular tech to do something useful because I'm moving on to the next churny surface gloss on the same ideas. Gotta actually build something at some point.

I agree with the core point about experience remaining relevant, but I think you underestimate the changes that got us here. For example, we've learnt a lot about JITs in the last 20 years. The fact that we can write interpreted code that achieves near-compiled speeds in certain situations is amazing.

The speed of interpreted JavaScript has improved by two orders of magnitude on the same hardware in the last ten years, and you don't see technological advance? I can't think of another field that has advanced so quickly.

I don't want to diss other fields, but an awful lot of the most valuable improvements in infrastructure in the last ten years have been about standardization and bringing technology that existed years ago to the masses.

On top of that, the inconvenience of the Web platform is real but also overhyped. Imagine you're writing a network app on another platform. It's extremely unlikely you have access to an integrated network analyzer as good as Chrome's. How many other systems allow you to completely modify the look of your application while it's running, just to see how it looks, by playing with the developer tools? Having a REPL that allows you to interact with the running system has been standard on the Web platform forever.

I'm not even sure what you're looking for. What would constitute new technology if 100x speed-ups don't? Almost all of software is implied in the concept of the Turing machine, so complaining that you can't achieve anything you couldn't have in the past with large, expensive, proprietary systems and specialized knowledge seems unfair. It's been true since Babbage at least.


There is a lot of hard-won knowledge aging out of the profession lately, I have been observing. It's tempting to blame web development and its proverbial low bar to entry, but I think the real problem is computer UX getting too good, such that you can accomplish almost anything you would desire with a computer without having to delve into the nasty implementation details.

I am just barely starting to get grey in my beard (largely from dealing with frighteningly incompetent consulting firms, rather than age...) but I can remember the eldritch incantations dealing with autoexec.bat and DOS memory modes, or the clusterfuck of trying to get printers or new bits of hardware to work, or the panic of trying to fix BSODs when I'd trashed the system installing something dodgy from LimeWire or the shovelware bin at WalMart. The next generation coming through has been shielded from these horrors, and mostly matured in an environment where computers work reliably; and when they do fail, it is usually opaque, inscrutable, and largely hidden from their eyes. Aside from a crash-course in the scientific method for diagnosing and debugging issues, the old dodgy software world exposed one rather harshly to many of the underlying realities of the system, and our current software environments are still mostly built on those foundations, with a few dozen layers of lipstick applied to the pig.

It certainly doesn't help that most instruction in software engineering hews either to the abstract and theoretical or to the novel, with only passing consideration of the practical realities and the history of the art. Ultimately, we write code that runs on silicon transistors, not ideal Turing machines, and in a great many fields we are retreading extensively explored ground, a hamster wheel of innovation. Every generation seems to need to have a go at yet another build system or object database, or rediscover the model-view-controller pattern. We delight in making endless new and exciting and broken wheels, in shameful ignorance of the hard-won lessons of the past.


Unfortunately we are shovelling so much in at the top of the stack these days that, while the foundations change much more slowly, there is just so much more to learn. And let's face it, low-level or systems programming is just not as attractive and "shiny" as the latest web tech fad or creating an AI/crypto web startup, so I can see why the younger generations go for that instead of the foundational stuff, even if it's disappointing.

Some of us old-timers are very happy digging around in the basement levels while the web stack grows ever upward.


Exactly this. I have ~35 years of experience, but I'm always trying to keep up with advances in algorithms and data structures. Frameworks and languages are mostly just rearranging and renaming the deck chairs -- there's hardly anything _actually_ new. Taking a positive spin on it, if reintroducing good old concepts as new makes them more widely adopted, maybe that's a good thing. Most serious people will eventually recognize the precedence and make the link.

Actually, it is harder to learn certain types of programming now than ever before, due to the proliferation of incidental complexity and tooling. Take frontend web development, for example. Back in the 90s, you could get away with knowing basic HTML and a bit of JavaScript, because CSS didn't even exist yet. Frameworks were certainly not a thing.

Other forms of systems programming that involve concurrency, etc., are also newer.

So is cloud. The list goes on...

Every generation faces issues, and programming IS hard. It is unfair and condescending to write this off as a millennial concern.


You're only seeing a part of the development task. Yes, the knowledge requirements for doing a task X (let's say a given CRUD app) are declining. But:

- How do you guarantee that all those pieces you just glued together will work with high availability? That they'll feel easy to use and consistent if exposed to the end user, or easy to maintain (for future developers) and operate? Quite a lot of work is often needed to have all of that.

- User expectations are getting higher. You could get away with a certain quality and usability of software in 2005 that you couldn't today. Had a desktop application in 2000? In 2006 it was a desktop application and a website; now it is a web app, an iPhone app, an Android app...

- What you are describing is putting together a solution. That might make sense in many end-user scenarios, web apps, etc. For things like automotive and other such domains, low-level programming is still needed - though abstraction is slowly making its way in there as well.


The kind of knowledge that gets obsolete is replaced by stuff that's very similar.

Doesn't matter if it is a new language, framework or whatnot. There have been very few advances in our field; stuff changes incrementally. So you have seen everything before, it just has a different twist. You can pick this crap up in a fraction of the time, as the concepts are already internalized.

... Unless, that is, if you used to call yourself "X developer", for whatever value of X. Then you are unidimensional, and screwed.


Agreed. However, perceptions change as technology improves, and overall the arc of software tools has been towards higher-level abstractions unless absolutely required to go deeper.

The issue is that we're not that new anymore, people have been doing software engineering for at least half a century.

This is true. Every new technology seems like an evolutionary improvement on one or a few aspects of programming, like separation of concerns, expressiveness, correctness, etc. None of these things should surprise anyone unless they've been stuck in one of the 1990s 3.5GLs and never learned anything else.

I've been doing a lot of tech blogging lately, playing around with various F# tools and toy projects, seeing what resonates with the community.

There was a progressive complexity that happened back in the late 70s and early 80s such that people alive today who still code and learned back then have taken the ride from machine language to multi-gigabyte stacks.

We just kept adding stuff and having to make sure we could be functional in all of it. Not an expert, but functional. It was standard practice on my commercial programming teams to decide what everybody wanted to learn on a new project before starting. (And these were high-paying projects. We always left with happy customers).

People were jack-of-all-trades. Most everybody was. You had to be.

What am I seeing resonate, at least as far as I can tell? The inability to understand what the hell is going on and work with it. If you've gotten a C++ compiler compiling a hellacious codebase in DOS, a Rails configuration ain't nothing.

I see what are supposedly senior programmers walk a bit off the happy path on a framework and they're lost. Not only are they lost, they are insecure, afraid, embarrassed. There's nothing wrong with these people. There's something wrong with the way we're training and staffing them.

Fifteen years ago I was still coding commercially, having a blast. While I was talking to a recruiter one day about various projects, she said, "You know, you're one of the last true general consultants."

There may be ten thousand of us. Beats me. But her general appraisal was correct. There is a drastic and complete change between the way coders used to relate to technology and the way they do today. It's not tech. It's mindset.


I'm more worried about people constantly believing that every new framework is a major advancement for programming and that it's not just something that could be learned in an afternoon (e.g. React). Or about people following the latest hyped trend without learning anything and without producing much other than more hype.

AI, ML, and VR are all really interesting, but as we all know they are not completely new and will not likely account for the majority of future jobs.

Fundamentals are what matter; most of these "new things" are just something you can learn with relatively limited effort if needed. Classic programming skills, analytical skills, or things like the ability to reason about concurrency issues never go obsolete.


A big part of it is that, imho, technology has gotten more powerful but still not easy to use. Like, I'm imagining that in 20 years the API for Amazon's drone fleet will be here and we'll still be going, "Wait, the address parameter is for an IP address? And the library isn't thread-safe but also doesn't connection pool, so I have to maintain one dronecloud client per thread? And what's this cryptic 'rotor invert' error?"

It feels like half my job is googling how to do common things in popular libraries and finding completely counter-intuitive pain points.


And learning the basics just doesn't seem to cut it anymore. It seems like most enterprise technology builds on older versions of older versions, making it more complex. For instance, AI and ML seem like two subject areas that would be impossible for anyone to work with professionally if they didn't have years and years of experience with programming/math/statistics.