
You're right, I guess I'm comparing against both those axes at the same time, since I'm relating to my own experience as a kid 30 years ago vs as someone working in tech today. Indeed, that's not really a fair comparison.

The point I was trying to make was about the distinction between programming with a limited set of building blocks vs programming with a large and open-ended selection of libraries. They don't require the same skill set, and one seems more fun to me than the other.




Direct access to hardware is not the difference. It is easier to get things done today. You can argue that it was simpler in 1985, but I don't think that simplicity is the right way to measure things. You can present an easy-to-use programming environment to people, accessible to newcomers, usable by experts, and it doesn't have to be simple but it has to be possible to use it in simple ways.

Just personal observations here. Someone with a computer in 2019 can download Unity, which is free, and watch a tutorial on YouTube, which is also free. With near zero understanding of what is going on, and no prior experience, they can have some kind of rudimentary platformer working within hours. This is then a good starting point to learn programming (you can dive into C#) or you can continue to jumble together copy/pasted bits of code that you see online (kind of like how I remember doing with BASIC and library books).

Sure, there are a bunch of layers of abstraction, and those abstractions will break down all the time. You don't need to learn those abstraction layers; you can stay at the top layer and still get good work done, maybe working around a few problems that you don't understand from time to time by futzing around with stuff until it works. Or you can dive in and try to understand what's underneath an abstraction. That's the whole point of having abstractions in the first place. They exist to hide the complexity, to let you get work done without understanding the entire system. 72 layers of abstraction is a bit of an exaggeration. I'd say if BASIC has three layers (interpreter/machine code/hardware), Unity only has six (editor/scripts/engine/OS/machine code/hardware), but who knows how you count them.

In my experience, watching people learn how to create things, it has never been easier to learn programming and start building things. The main difference is that in 1985, programming was considered an essential computer skill, and in 2019, you're expected not to program. BASIC was amazing because it was what you saw on the screen when you turned your computer on, nothing more.


True, but on the other hand those technologies have evolved more or less drastically as well. E.g. the Java or C++ skills from 1995 won't get you that far today, both in terms of the language itself and the framework/library ecosystem.

One difference I recall is that you had to read manuals first. Now you just code, and when you get stuck: Google or Stack Overflow. Back then, with no good search engines, you couldn't search books like that, so your chance of finding an answer by searching was poor. You wanted to have read the book first, so it was in your head.

Also: way fewer libraries. You might write ALL of an application. You might call the OS just to read/write files. Today it's much more about gluing together libraries, which is nowhere near as much fun.


I never implied that.

What I said was that, over the years, tools have reduced the intellectual work required, because more and more of the skill-based, intelligent work is automated inside IDEs and frameworks.

You can do a lot of network programming today without actually understanding anything about networks. You could not have said the same 15-20 years back.
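
To make that concrete: with a library like libcurl, an HTTP fetch is a handful of calls, with no sockets, DNS, or TCP anywhere in sight. A minimal sketch in C (error handling omitted; the URL is just a placeholder):

    #include <curl/curl.h>

    int main(void) {
        CURL *h = curl_easy_init();   /* one opaque handle, no socket setup */
        if (!h) return 1;
        curl_easy_setopt(h, CURLOPT_URL, "http://example.com/");
        curl_easy_perform(h);         /* default behavior: response body goes to stdout */
        curl_easy_cleanup(h);
        return 0;
    }

Doing the same by hand means resolving the host, opening a TCP socket, and speaking enough raw HTTP to get a response back.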


Sorry, I don't agree. I have not been in the industry for 30 years, but I have been programming since a very young age on hardware similar to your Commodore 64. No, I cannot translate the skills I learned writing BASIC for that 8-bit architecture directly to the modern web app project I am working on. I can, however, relate better to memory-constrained environments such as Arduinos and MSP430s. I can say that getting started with a very limited piece of hardware and a very limited language piqued my interest to keep exploring what I could do.

Later I got into Linux and FreeBSD. While I have never worked with any BSD professionally (other than OS X, but that doesn't count), I do think that knowing how FreeBSD works makes me a better Linux user/admin/developer. Certain ideas (kqueue, core system vs packages, etc.) are good concepts to keep in mind when developing for a platform that does not have them.

I also learned Turbo Pascal and used Delphi for a spell. While not directly relevant, they did have quite an influence on today's languages and development environments. NetBeans, which I did use professionally for a bit, was a far cry from Delphi, but knowing both made NetBeans easier to use.

But all that aside, the point of having years of experience is not about enhancing your current knowledge. It's about enhancing the process of acquiring knowledge, organizing it, and using it. Someone who knows JavaScript is going to get coded under the table by someone who knows JavaScript, C, Haskell, Erlang, Lisp, Python, etc. In fact, I would wager that someone who knew C, Haskell, Erlang, Lisp, and Python but not JavaScript would in the long run beat out a JavaScript expert, simply because the penalty for learning a new paradigm is much smaller than perceived, while being able to think in a multi-paradigm fashion is a huge benefit.

Finally, the mechanic thing: I am comparing a veteran mechanic with, say, 20 years of experience, including current experience with the latest cars, vs. a mechanic with one year of experience with just modern cars. I listen to Car Talk, the NPR program, and they had a few very interesting stories on there. For example, a woman called in and said that whenever she turned on the fans in her car it smelled like gasoline, and it very often happened after she got her car worked on at the dealership. The suggested reason was that when the mechanic worked on it, he put dirty parts on the cowl of the car where the air intake is, and some gasoline and oil dripped into it. This is something experience teaches you, and it has nothing to do with modern chips.


The core difference was that the job required much more vertical reasoning. Crafting things from the ground up was the norm. Starting from a blank code file and implementing the core data structures and algorithms for the domain was often the case. Limited resources demanded much more attention to efficiency and tight constraints. Much weaker tooling required more in-depth knowledge rather than trial-and-error development. There was also no web and no Google, so finding things out meant either books or newsgroups.

These days the demand is often more horizontal: stringing together shallowly understood frameworks, libraries, and googled code, and getting it all to work by running and debugging.

The scope of things you can build solo these days is many orders of magnitude larger than it was back then.

Still, the type of brainwork required back in the day was most definitely more satisfying, maybe because you had more control and ownership of all that went into the product.


To me the big difference is the appearance of libraries and frameworks. Back then your code depended on or built on very little pre-existing code. You wrote to devices such as the screen, tapes, disks, and printers almost or even literally directly. If you needed a data structure more complex than a discrete value or an array, you had to include the underlying code yourself every time. Every program was a creation ex nihilo, because even if you re-used code you had to copy or type it in again yourself. Environments like Delphi and VB, or later versions of Turbo Pascal, which came with library code, were a revolution.

In reality, I'd wager there's a _lot lot_ more kids programming today than 30 years ago. How many people had a computing device at home back then?

This anecdote makes the point that a very specific kind of thinking was better served by the tools of the late 70s, and maybe it was - I've never been much inclined towards it, so I dunno.

And what we consider 'programming' is blurrier than it used to be. Is HTML programming? What if you use a fancy editor like Dreamweaver? Only when it becomes dynamic? Does CSS count? A lot of youngsters, when they aren't getting off my lawn, have done web stuff, and the fact that it's a gradual shift from being a user to being a programmer probably makes it even easier.

I find the whole premise of this blog post flawed. (Great Scott! Someone disagrees with someone else on the Internet!)


I'm "only" 39, but it's been 30 years since I've written my first line of code. And I have to say that while things have changed a lot, the basics of how it is to program and how you write code, are still the same. I find myself applying techniques I've learned back then, and even though I haven't written assembly since the 80s, the basic low-level understanding of how it all works still serves me to this day. The main difference is maybe how accessible good info is now. Back then each programming book was a treasure that was passed around.

I often wonder how much of that will stay the same 30 years from now, and I hope I'll still be hacking then. My bet is that we'll still have some versions of the programming languages we know today, maybe even running on Linux. But it will be much more exciting if that turns out not to be true :)


Nobody's arguing that it isn't easier today for the professional, but that doesn't really lower the barrier to entry for the barely interested. Never mind that we simply have orders of magnitude more people with computers now, so even with a drastically lower percentage of people actually starting some kind of dev environment, we'd still come out ahead in absolute numbers.

Yes, it's easier to create the good stuff (well, at least web stuff and mobile apps…), but I doubt that it's as easy to create the silly crap that you're inordinately proud of. It may be nothing more than a simple question/answer loop, but even with that you've created something, and you understand some basics of how a program operates…

"Why Johnny can't read" was about literacy, not everyone going out to be a journalist or Hemingway. I view Brin's piece in the same light.

But yeah, we really shouldn't get too lost in nostalgia. Things aren't that bad, and projects like the Raspberry Pi show that this subject is being tackled today. For me, the lesson to learn from BASIC environments isn't that we should force people to relive them nowadays, but that the programs that came out of the very first days with such a computer are interesting for educational purposes.


The ability to reach an audience is a problematic one; consider e.g. Google's Play Store or Apple's App Store, where you see literally thousands of new products appearing daily. Technical excellence is secondary and often neglected (see e.g. WhatsApp's security issues!), which results in poor-quality software and poor progress.

Put simply, the barrier to entry is much lower, which in turn dilutes the average programming talent. I'd be interested to see what the quality of software will be when the average 65-year-old granny is reading the API documentation of some "web app" because it's just so easy and it just works. :)

I think the difference between the "average programmer" of 20 years ago and now is the fact that 20 years ago you had nothing fancy. A text-based editor, a command line, and a compiler. You were perhaps able to set a graphics mode with some obscure commands and get some pixels on the screen. And that was just so fascinating. You really had to be a one-in-a-hundred kind of person to be proficient with that stuff. If you put an average programmer of 2012 in such a situation, I really bet a huge percentage would just drop the hobby and do something else. Though, on the other hand, things were so simple back then. What does it take these days to get access to the screen pixels as an array? Libraries, frameworks, documentation, APIs, and whatnot. Back then you just moved a value into a CPU register and called a certain interrupt, and voilà: 320x200x256 graphics mode with a framebuffer starting at a well-known address. Then it's just a matter of writing to memory addresses, very simple. Couldn't be any simpler really!
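
For anyone who never saw it, here's roughly what that looked like, sketched in Borland-era real-mode DOS C (assuming a compiler with dos.h, conio.h, and far pointers):

    #include <dos.h>
    #include <conio.h>

    int main(void) {
        union REGS r;
        unsigned char far *vga = (unsigned char far *)MK_FP(0xA000, 0);
        unsigned i;

        r.x.ax = 0x0013;        /* AH=00h: set video mode, AL=13h: 320x200x256 */
        int86(0x10, &r, &r);    /* BIOS video interrupt */

        for (i = 0; i < 320u * 200u; i++)
            vga[i] = i & 0xFF;  /* write pixels straight into video memory */

        getch();                /* wait for a keypress */
        r.x.ax = 0x0003;        /* back to 80x25 text mode */
        int86(0x10, &r, &r);
        return 0;
    }

That's the entire "graphics stack": one interrupt and a pointer to a well-known address.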


Programmer education, tools, language standards and best practices are all vastly different than 50 years ago. That's like pointing at a Ford Edsel and then claiming that modern humans can't make good cars.

My take is simply that programming has gotten way bigger. People seem to think that in the “old days”, programmers ate raw potatoes and programmed in assembly because it put hairs on their chest. The reality is, computers were way simpler objects back then. Assembler on the Intel 4004 has about 46 instructions. The list fits on your phone screen without much scrolling. Modern JavaScript bundlers pull in more code than entire operating systems back then. C only has about 30 keywords. If you know assembler, you can probably learn K&R C in a weekend if you go hard. Modern C++? Forget it. Apparently the spec just for C++ initialisers is 300 pages long.

Today the same amount of knowledge makes you barely passable in a single niche domain. Consider web development. To get really good at modern web development, you need to know modern JavaScript, CSS, and HTML. You need to understand how browsers work and all the quirks of how HTTP requests work and how pages load - including what dozens of HTTP headers do and how they interact. You need to understand the browser rendering process, performance tools, accessibility, and debugging tools. And learn dozens of JavaScript libraries: React, Express, webpack, database wrappers, and so on. It’s an accomplishment to learn all of that. But if you do, you still only know web programming. That knowledge doesn’t really translate to operating systems work, mobile development, databases, AI, embedded, etc.

Most professional programmers only have the inclination and capacity to learn one ecosystem, and even then usually with big holes in their knowledge. True polyglots are rare because the mountain you need to climb to get there is higher. But we also depend on polyglots to guide us toward useful tools. Language / ecosystem choice still matters - for performance, velocity, security, and compatibility. But how can you really evaluate that stuff unless you’ve spent time debugging Go programs, or tried to squeeze every last drop of performance out of a big legacy Java monolith?

We’re left talking imperfectly from our own experiences, living in whichever niche of programming we’ve carved out for ourselves. The days of everyone being an all-terrain programmer are over.


I'm not sure that's true at all. A lot of us older developers look at the tools we used in the 1970s, 1980s and 1990s, and we can't help but notice how much better they are, even today, than many of the most-hyped tools these days.

I feel sorry for many of the younger developers today who only know of JavaScript, PHP, NoSQL and web development. They don't know what they're missing out on, nor do they truly know how inferior their tools are.

And when it comes to getting serious work done, we still use C, C++ and Fortran today. Yes, they've advanced in many ways over time, but I think they just go to show how much better many technologies were in years past. Even modern tools just can't compete with them.


I recently gave a software engineering talk at a high school's career day, and a student asked how the industry has changed, which caused me to think about this. In a lot of ways I think it's a wash. We could do less with the tools back then, but we were expected to do less. In the 80's we had Borland C++ or VC++ to write our text-based screens and Btrieve-based databases on the same machine. The tools have expanded considerably, but now we're expected to spin up databases, code against them, handle multi-user and multithreaded (or async) code, build GUIs or mobile web-based UIs, and use different languages for each part of the puzzle.

In the 1980s a spell-checker was a very complicated project, and now it's simply a hashmap lookup. But it's not because we invented the hashmap since then - the idea was well known in the 80s - it just wasn't possible with the RAM limitations at the time.
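
A sketch of that point in C (hypothetical word-list file name, minimal error handling): load the dictionary into a hash set once, and checking a word is a single lookup. Holding the whole dictionary in RAM is the part the 80s couldn't afford, not the idea:

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    #define NBUCKETS 65536

    struct node { char *word; struct node *next; };
    static struct node *table[NBUCKETS];   /* hash set with chained buckets */

    static unsigned hash(const char *s) {  /* djb2 string hash */
        unsigned h = 5381;
        while (*s) h = h * 33 + (unsigned char)*s++;
        return h % NBUCKETS;
    }

    static void insert(const char *w) {
        struct node *n = malloc(sizeof *n);
        n->word = strdup(w);
        n->next = table[hash(w)];
        table[hash(w)] = n;
    }

    static int contains(const char *w) {
        struct node *n;
        for (n = table[hash(w)]; n; n = n->next)
            if (strcmp(n->word, w) == 0) return 1;
        return 0;
    }

    int main(void) {
        char buf[128];
        FILE *f = fopen("words.txt", "r"); /* hypothetical: one word per line */
        if (!f) return 1;
        while (fgets(buf, sizeof buf, f)) {
            buf[strcspn(buf, "\r\n")] = '\0';
            insert(buf);
        }
        fclose(f);
        while (fgets(buf, sizeof buf, stdin)) {  /* words to check, from stdin */
            buf[strcspn(buf, "\r\n")] = '\0';
            printf("%s: %s\n", buf, contains(buf) ? "ok" : "misspelled?");
        }
        return 0;
    }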

Likewise, moving from ASM to C to Smalltalk is a night-and-day improvement... that we made in the 1970s. The difference again is what hardware we get to run it on.

Video game construction kits existed in the 80s, and 80s GUIs looked pretty much the same as they do now if you ignore resolution.

Drag-and-drop application development was huge in the 90s, and the CRUD applications of the time weren't significantly different from those of today, other than that they didn't run in a browser. HTTPS wasn't hard back then, and you could write a Perl endpoint in half an hour, easy.

When it comes to things like playing video, it's easy now but I don't even really consider it programming. You're just installing software that does it for you.

Libraries and OSS and StackOverflow and various services have made a real difference. I'm not arguing that we haven't made progress. I'm just saying that I can see how some people feel that this year's incremental advance or retreat is not nearly as exciting if they've seen how the last 20 worked out.


> I don't think those are comparable to complexities in IT that exist today.

I think the constraints and the lack of “here you go” resources that many starting programmers dealt with out of necessity in the 70s and 80s, even for toy applications, are better preparation for the attitude needed to deal with the complexities in IT today than today's learning conditions are.

Which isn't to idealize it: it was also a hard onramp that drove off lots of people who would have done well at many real-world problems where they weren't soloing without support, and who might, with experience, have still developed into great solo practitioners too. But the people I've encountered who came through that 70s/80s start tend to be, IME, on average more willing to slog through and learn hard stuff in new areas than people who came up through later, easier onramps.

Though that may also be in significant part survivorship bias, as the 70s/80s crew will usually have had to stick around the field longer, and it may be that people with that flexibility are more likely to stay in technology past the half-life of whatever was current when they got in.


Programming today is easier in many ways: information is readily available for free (I recall saving up a lot of money as a kid to buy specific programming books at the book store after exhausting my library’s offerings). Compilers and tooling are free. Salaries are much higher, and development is a respected career that isn’t just “IT”. Online programming communities are more abundant and welcoming than the impenetrable IRC cliques of years past. We have a lot that makes programming today more comfortable and accessible than it was in the past.

However, everything feels vastly more complicated. My friends and I would put together little toy websites with PHP or Rails in a span of weeks and everyone thought they were awesome. Now I see young people spending months to get the basics up and running in their React front ends just to be able to think independently of hand-holding tutorials for the most basic operations.

Even business software felt simpler. The scope was smaller and you didn’t have to set up complicated cloud services architectures to accomplish everything.

I won’t say the old ways were better, because the modern tools do have their place. However, it’s easy to look back with rose-tinted glasses on the vastly simpler business requirements and lower expectations that allowed us to get away with really simple things.

I enjoy working with teams on complex projects using modern tools and frameworks, but I admit I do have a lot of nostalgia for the days past when a single programmer could understand and handle entire systems by themselves because the scope and requirements were just so much simpler.


Your comment is basically akin to complaining about how rough and slow it would be for a person with a machete to trailblaze a path through a dense forest, as opposed to how long it takes a person to travel the beautiful paved road that was built on top of that trailblazed path. Somebody had to thrash through all the shit for that superhighway you are traveling on, my friend.

I was around back then. We didn’t have a magical browser box that you could type a couple of key terms into to get a thousand articles, code samples, and philosophical discussions from hundreds of people smarter than you who had already solved your problem a dozen different ways, letting you choose one and improve it. You didn’t have hundreds of languages, libraries, and frameworks from which you could selectively pick the right tool. Back then you had your problem, and you experimented and invented until you solved it, generally with the one or two tools that were available at the time.

And with the benefit of nearly 40 years in tech…I can attest to the OP’s opinion that the quality of the average tech worker has nosedived since then.

