
>That's left me wondering: will there be a point at which I struggle, after having spent decades writing code that powers the web?

I'd say it's not that tech will become too difficult to use, but the constant and frequent changes will start to annoy you to a point where you just don't invest as much time into learning how the new stuff works. YMMV, but I am less than half your dad's age and I'm already starting to feel it.




> And why is it that most older people answering on these threads are so passionate about learning and about new technologies and the latest and greatest javascript frameworks. Do they really enjoy having such ephemeral knowledge and basically competing with anyone that's finished a bootcamp or not even that?

Nobody enjoys having their hard-earned knowledge become obsolete (hence the "X11/bash/sysvinit/etc were good enough for me so let's never improve them" crowd).

But the fact is computer technology does change fairly quickly. You have to keep learning to keep up.

If you're really worried, there are definitely some technologies that persist longer than others. If you really hate re-learning stuff, then I'd stay far away from the web and JavaScript. Stick with things like Java, Go, C#, C++ & Rust. Those aren't going away any time soon. Ruby, Docker, React, etc... I give 5 years max. Then you'll have to learn something new.


> I've been around since the beginnings of the WWW

Ditto.

> I haven't gotten stupider; the software has become more difficult to use.

I can't speak for you, but I'm becoming less interested in the new shiny in a lot of areas beyond UI widgets. There's a reason why we olds have a reputation for falling behind, and it's not because engineers and inventors explicitly make things that only young people can learn.


> Meanwhile, I'll likely be seen as a dinosaur in tech by that age and will be lucky to find work at all

My dad is in his 60s and is still doing cutting-edge work on Kubernetes, Golang, eBPF, etc. at a big tech company in the Bay Area. Judging by his experience, it honestly isn't that hard to keep yourself up to date with technology.

If you can't get yourself interested in upskilling or learning the next new paradigm you're in the wrong field.


> ...know how computers work...

That's what worries me. There was a time when I could write low level C and even some assembly. Similarly there was a time when I could explain the ins and outs of IP packets and routing. However, none of that is what the market wanted and I don't remember any of that anymore.

Heck, with most of my work involving running around trying to hold understaffed systems together and juggling the unreasonable demands of management, I don't even have the programming/web development skills that I used to.


> Is it going to get harder soon?

I'm 34, so it might become different when I'm in my forties, but so far it's not getting any harder. Your summary below is spot on.

> I understand how computing systems work generally, and the laws of physics impose some hard limitations that sort of massage any solution to look similar enough to any other that it isn't that hard to figure out what is going on. Sure, I'll lack some of the deep knowledge that someone who has focused solely on that set of technologies may have, but honestly most of the programmers I meet don't have that to begin with.


> The pace of change is faster than the average person can keep up with and still be competent.

I'm only 35, but until now I feel the opposite is true.

I started learning C, HTML, CSS, and JavaScript as a teenager more than 20 years ago. The knowledge and skills I learned then are still as relevant today as they were then.

Sure, there's flexbox now instead of tables, but the basic idea of HTML vs CSS vs JavaScript is still the same, and I can still make a basic website for a project in a few hours. Vertically centering something on the screen is still harder than it should be.

C now lets me declare variables anywhere, and there's a nicer syntax for initialising structs, but pointers work the same as they did 20 years ago. Knowing C is still as useful today as it was then.

My main focus for some years has been Mac desktop development. While a lot of new things were introduced in the last decade, they have mostly been incremental changes, and even if you skipped a few years of progress, catching up isn't that hard.

It helps to be aware of some general trends. I read about async/await years ago when it was introduced in C# (I think?), and now it's coming to Swift. Sure, it's a slightly different way of thinking about concurrency, but one or two new concepts every 5 years should be doable.
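
Since the pattern looks more or less the same in every language that has it, here's a rough Python sketch of the async/await idea rather than C# or Swift, purely to illustrate what the concurrency shift means; the function names and delays are made up:

    import asyncio

    # Hypothetical example: two slow operations run concurrently instead of
    # one after the other; asyncio.sleep() stands in for a network call.
    async def fetch(name: str, delay: float) -> str:
        await asyncio.sleep(delay)
        return f"{name} done"

    async def main() -> None:
        # gather() awaits both coroutines at once, so the total time is
        # roughly max(1.0, 1.5) seconds rather than their sum.
        results = await asyncio.gather(fetch("profile", 1.0),
                                       fetch("settings", 1.5))
        print(results)

    asyncio.run(main())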

Also, you don't need to learn everything. For example, the whole Reactive/Flux family of technologies, with event sources and subscribers, sounded really interesting, but I was never really able to get into them (you can probably tell by the fact that this paragraph makes no sense), and I'm doing just fine.


> At 44 I have the opposite problem. The more I learn, the more I realize I don't know.

I'm 47 and in the same place. I have only recently come to grips with the fact that I am a good programmer and I do know what I'm talking about, but there's also so much that I still don't know. I keep learning, but you've got to pick your focus, because you can't possibly learn everything.


> Is the opportunity for this type of magic still available?

Totally! The landscape was/is definitely different since there have been so many abstraction layers placed on top of the hardware, but even web programming is a fun first experience that creates that “woah I made that happen??” feeling.

Personally my first real exposure to programming was playing with Atmel microcontrollers after watching some youtube tutorials and writing video game cheats after seeing a friend use one and wondering how the heck it worked. There was a lot of stuff to figure out, but each time you conquer some new detail of the overall system you’re immediately hungry for more.

It’s actually reassuring to hear folks older than myself ask these questions, because I find myself looking at the increasingly locked-down computing tech of today and wondering how new folks are going to get into this wonderful hobby if there’s no ability to mess around with it.


> It's possible we achieved such a technological peak that future generations won't be able to understand the basics

I don't think it's this, and I don't think programming is so hard (it's the business logic that is hard when you have to specify it exactingly, to reference the thread.) I think that the manufacturers of the various computers we use make it unbelievably difficult and scary to touch anything, and cast quite a bit of suspicion on you for even wanting to change anything.


> Any coding you've done more than five years ago isn't relevant anymore. I can't remember what I did five years ago. It is probably outdated, replaced or not in use any more.

I tend to disagree. Not only is a lot of five-year-old technology still relevant today, but someone who showed great skills 5-10 years ago still has great potential today.


> there is no guarantee that whatever tech you learn will stick around.

Ok, but there are plenty of aspects of using tech that transcend the lifetime of the actual device you are using. Coding, of course, is timeless, but even the basic idea of experimenting and discovery within a UI is something that many older adults lack from not having had tech when they were younger.


>computer science & IT people tend to be nerds/geeks who like to tinker in their free time and as a percentage of the population they stay relatively the same from year to year

Why do you say that?

I think you may be thinking too much in the short term. I was actually wondering what programmers in their mid-20s today are going to be doing when they're, say, 70. How many programmers in their 70s do we have today? Very few, because computing was in its infancy 50 years ago. With every generation, more and more people are being exposed to computing/programming at an early age.

You also have to remember that there are tinkerers outside the computing field too. Think about the weekend woodworkers, car guys, and model aircraft enthusiasts; could many of them be tinkering with software if they had been exposed to it and educated in it at the right age? I say yes. On the macro scale, we are only at the dawn of computing and the internet, and there are going to be MANY more people working on it in the future.


>I am a professional computer programmer. If I can't figure out how to reset a password on Skype, then what are the chances that a less technical person can do it?

The days of 'being good with computers' are gone. UIs change so rapidly that core skills are useless in comparison to daily use. You could be a Linux guru with decades of database dev skills, but any 14-year-old YouTuber will probably navigate a website better. I'm a lawyer. I know lots about tax law, more than most accountants. That doesn't help me navigate the tax office's website. My accountant is the expert there. (Actually, even she has someone who does the web stuff.)


>> "But at some point in time, it will become a roadblock to not know how to write a basic script to automate some repetitive/remedial task."

Isn't that the opposite of what's happening? 20 years ago it was necessary to know stuff like that to make full use of a computer. Now we have programs like IFTTT.com or the more powerful Automator on OS X, which allow us to do those things using a GUI without having to understand how the scripts work or are written. Why can't we continue building tools so that people don't have to waste time learning the nitty gritty? Computing has been getting easier and easier, so much so that 1-year-olds and 90-year-olds can use computing devices. Why does everyone seem to think that in the future we will all be coding, when that has been becoming less necessary as time has passed?
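
For reference, the kind of "basic script" the quoted comment has in mind is usually only a handful of lines even when you do write it by hand. A rough Python sketch (the folder path is just an example) that files everything in a downloads folder into subfolders by extension:

    # Sort the files in a folder into subfolders named after their extensions.
    # The target folder below is a made-up example; adjust before running.
    from pathlib import Path

    downloads = Path.home() / "Downloads"

    for item in list(downloads.iterdir()):
        if item.is_file() and item.suffix:
            target_dir = downloads / item.suffix.lstrip(".").lower()
            target_dir.mkdir(exist_ok=True)
            item.rename(target_dir / item.name)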


> I started programming on my own at 7... Mainly because my IBM PC booted into a BASIC window if

Same here!

Getting started in programming seems so much more daunting these days.


>To code, for example, you need to hook a bluetooth keyboard and mouse.

I have a hunch that coding as we know it is going to look very different in ~5-10 years.


> Computers and programming are too accessible, and I think are the first thing to become so accessible.

But are they? Computers have become insanely complicated.


> A sense of mastery and adventure permeated everything I did. Over the decades those feelings slowly faded, never to be recaptured. Now I understand nothing about anything. :-)

Are you me? ;) I feel like this all the time now. I also started in embedded dev around '86.

> Nonetheless, this dinosaur would gladly trade today's "modern" development practices for those good ol' days(tm).

I wouldn't want to give up git and various testing frameworks. Also modern IDEs like VSCode are pretty nice and I'd be hesitant to give those up (VSCode being able to ssh into a remote embedded system and edit & debug code there is really helpful, for example).


> I hope that we're not entering a world where having a bunch of computer equipment automatically makes you suspect.

I already get funny looks in cafés when I'm in a shell on a server or writing code.

You'd think these days people would be more used to it, but where I live isn't exactly the tech capital of the world.

