I'm reminded of a TED talk I went to that proclaimed all code is art and everyone should code. I'm just not convinced. When you are coding I get it, coding is awesome and it can be an art, but I don't always see coding to be an integral part of every day life for most people in the same way that reading and writing is.
Most people do use a computer every day, however, so there is an argument to be made that people should learn to code at some point to unlock the potential of their machines. But the problem you run into here is that coding is not the same today as it was 10 years ago, and much different than it was 20 years ago. I'm still reading and writing and typing pretty much the same way that I was after learning it for the first time (with minor improvements in speed and legibility). In 10 years though, I'm pretty sure I'll be coding in something else entirely different and in a totally different style. And I also have many devices that I don't bother to code on or interact with in an advanced way, such as my smartphone.
We should teach kids to code at an early age, agreed, but we also shouldn't expect them to stick with it for the rest of their lives like they would reading and writing.
Let's replace "code" with "write." Please don't learn how to write. Only writers do this, and we really don't need more of them anyway. There was a time when this wouldn't have sounded crazy.
Learning to code teaches problem solving, new ways of breaking down complex scenarios, and a means to actually build something. It's true that not everyone needs to be a software engineer, but not everyone needs to be a mathematician either, and we don't use that as a basis to tell people not to learn math. Much like math, coding can be abstracted to a form of thinking in a way that plumbing cannot.
I have a friend right now that is using CodeHS to teach 10 year olds, and they're absorbing it like sponges. And I have met so many people that, in retrospect, have wished they had learned to code at a younger age. Maybe if people had told them to learn how to code, they would have.
I don't like to say that not everyone can code. I like to say that not everyone will enjoy coding. Personally, I find it more liberating to confront my personal challenges in learning something in terms of how much fun I'm having and/or how much work it'll take to get good at it.
For many years I tried coding and told myself that I couldn't do it when it got too hard. But the honest answer is that I simply didn't want to put in the work.
Later in life that changed, and I started seriously working at learning to code around age 35 because I was having fun. I decided to go for it professionally at age 38, and got my first coding gig at age 40.
I'd agree that there is a huge disconnect between a bunch of us saying 'yes coding is a great skill - let's put it in the classroom' and the reality of the current state of education. There is a lot of sentiment about how coding is the new literacy... it's not. Literacy is literacy. Math is math. You need to be quite good at both to then be able to program well.
I took a bunch of time and focused on helping coders instead of coding myself. I came to coding naturally and never thought of it as much more than intellectual self-stimulation. It was a lot of fun, and there is some value to it -- but it tends to get a lot more attention than it should simply because it feels so good to do it well. Successful startup founders say that coding is no more than 2-3% of the total effort of providing value to people. I've found nothing to prove that false; and I've seen a lot of companies and startups.
Recently I'm back to focusing more on coding. I find two things most interesting:
1. As I get older, I struggle with attention span and short-term memory more, but I have greater ability to see deep and widespread cross-cutting patterns. It's probably an even trade.
2. I'm not so sure that smart people should be coding. The more I think about it, the more convinced I become that good coding is managing cognitive complexity. You're always trying to make it work, then make it easy to understand and maintain. When I think back on all of the multi-billion-dollar project/program disasters I've seen, none of it was because the problems were hard. They were all a combination of poor customer/user participation and smart folks taking a problem of n complexity and making it into a problem of n^n complexity. Usually the two were related: tech was constantly used as sort of a band-aid to fix people problems. It never worked, but it kept a lot of coders employed for a long time.
Hopefully this wasn't cynical. I love coding and I love making useful things for people. I have a deep passion for helping developers lead happier and more productive lives. But I also feel an obligation to be honest about what happens. It looks a lot different at 50 than it did at 25.
Think about it. Seriously, would you want somebody telling you that you wasted three years of dev time and could scrap what you have and roll something useful to production in a month? I've done that several times in my career, at various ages, and nobody ever liked hearing it. As I got older they liked hearing it even less. In this business, inexperience, raw intelligence, and enthusiasm are the things we reward. They come mostly with younger folks.
Tech development is amazing and incredible because we create our own realities. But part of that awesomeness is the fact that left alone, we create realities that look like ourselves. It is the nature of the work.
Disagree with this. As a 26-year-old software developer, it's been fascinating to see just how many of my peers in non-software industries are now interested in learning to code because it's useful for their jobs. Coding allows you to talk to machines, and machines are everywhere in modern society.
Here in the UK, basic coding is now taught in schools (from primary school!). And I think this makes sense. Not everybody's going to be an expert. But the basics are incredibly useful, just as with math, science, history, etc.
Note: most of the other things you list as things that people don't care about knowing are also taught to a basic level in schools.
This topic recycles on HN pretty frequently. I don't believe it. I have many friends that aren't coders and it helps me remember how big the skill gap is.
Unless you've done substantial training, code seems just as opaque as medicine, or law, or most fields of engineering. Any job that requires substantial training and trust not to royally screw everything up pays well.
Perhaps 30% of the population has the creativity and analytical skills to make a decent programmer. And maybe 10% of those find a job largely comparable to doing math homework enjoyable.
Programming isn't as old as most fields of engineering, but it's older than a generation of workers. If coding was going to be commoditised, it would have happened decades ago.
I think the difference is we all come in contact with writing every day. So literacy is required if you want to get around in the world and if you want to communicate with others. Almost no non-programmers encounter code in their day to day lives. To me it doesn't make sense for the majority of people to learn to code, because without practice their skills are just going to atrophy. Why add a ton of new people who can only write shitty code? We already have enough of those, I don't see the reason to add more. Programming well is hard and if you don't spend a lot of time with it you'll never get good.
You are missing the point of his post, which I believe is valid: that programming in and of itself is nothing but a skill to carry out a task. He isn't dissuading beginners from learning to code or suggesting that people shouldn't. Just that you shouldn't learn to code just for the sake of knowing how to code.
I get it. There are worse hobbies you can have and it can solve problems and spread information, etc., etc. The point is that programming should be viewed as a means to an end. Too much focus is put on coding as an end goal and writers on TechCrunch whom I suspect have never coded anything in their lives talk about how learning to code is the hot new thing and everybody should be doing it. I think it is far more useful to learn a variety of subjects, figure out what it is you want to do, and if building software is the way to achieve it, then go learn to code. But learning to code well is an enormous investment of time, so to learn to code you are choosing not to do other things that could be far more valuable to you.
A lot of people start with learning to code and then look everywhere for a problem to solve with their new skill set. That is the reverse of how problems should be solved. The technology is irrelevant. The skills are irrelevant. The problem is what should dictate the work.
Do I have to think everyone needs to learn to code in order to want to teach _more_ people to code or to want to advance the state of the art in coding pedagogy?
I don't think EVERYONE needs these skills. Most people will never need them. But:
- I think getting underprivileged kids excited about coding/making games/robots/etc may be an excellent way to provide more kids with a good future.
- I _really_ wish more legislators had backgrounds in software. I think concepts like technical debt, KISS, code reuse, etc as well as the general method of translating business requirements to code are more applicable to that domain than generally appreciated.
I've been coding for almost 20 years, and plan to do it until I am no longer able to, for all the reasons that the author mentioned.
I feel very blessed to be a coder. I think of coding as a form of alchemy. I sit at this keyboard, move my fingers around, and can create things of great value out of thin air. That's pretty amazing if you ask me.
This is a strawman. Coding has never been a prerequisite to creation. People were creative forever, and then computers came along and created new avenues for creativity. The vast majority of which is done with applications, and applications have improved by leaps and bounds every decade.
Only a very tiny portion of creative work requires actual programming, like demo scene programmers. Even video games and Pixar movies have far more artists than programmers. It's true that more and more programmers are required to extend and maintain the software, but the growth of the end-user base has far outpaced the programmers. The majority of programmers are implementing business logic of which they have minimal high-level creative input.
The things we can do with computers without code has only increased. Pontificating on how it would be better if we didn't have to write code is like someone who once used a circular saw pondering why artisanal wood carving can't also be done with simple, straightforward, easy-to-use tools instead of difficult-to-use hand tools. I mean it's crazy how often we hear this refrain about coding as opposed to other professions. No one asks why we can't have a gadget to perform an appendectomy at home instead of paying a surgeon thousands of dollars, but for some reason people think that coding is somehow unnecessary magic which could be done away with by a bit of clever rethinking.

But code is not some Rube Goldberg device designed to obfuscate and impress; code is a medium, like a blank canvas, or a sheet of typing paper, the only difference is it can control physical things. It is not one concrete thing with a specific purpose which can be optimized like a word processor, or a stove, or a faucet or a car. Less code doesn't mean a simpler world, it just means less of what code can do.
While I agree that in principle you can learn to code at any age, I think this article misses the entire point of every "learn to code" article. Whether it's kids, women, oldies etc etc.
Coding is a calling. You can learn anything if you have the calling for it. If you learn because you 'should' (ie, it's encouraged by others and or/money) you will end up being average at it, at best.
Coding is not for everyone, it's a pipe dream. It's not because all modern tech is based on it that you should learn it -- same way as Lewis Hamilton doesn't need to be a car mechanic to be a good driver.
I started programming at 12, got my first paid gig at 16, and I haven't stopped being a 'programmer' (as they used to be called) since, and I'm now 49. I can't imagine NOT doing any 'coding'. In that time I saw countless average/bad programmers (and there are more and more of them) who /probably/ would have been better off trying to follow a calling of their own...
As far as having a successful career as a programmer, I think there is one major trap I've seen many, many programmers fall into. And that is 'mind sclerosis' -- ie a refusal to evolve with the times/tools. People who are actually quite/very good at one set of tools sometimes refuse to evolve and put down these favorite toys in favor of what is perceived (often, they are) as inferior replacements. Problem is, 5 years down the line, they are no longer employable. I've seen that too many times to count.
The NEXT trap is failing to realize that your time is limited. You can't learn everything and you have to be very, VERY selective about the set of tech you want to invest your time in. Don't bandwagon on stuff -- ask yourself whether it's likely to be a sellable skill -- if not, don't waste time on it.
I'd say coding is really a lot like learning a martial art. You get exercise (brain/body), you can use it to overcome difficulties (formatting complex documents/leg-sweeping a belligerent drunk), and you can develop it into a career if you focus. But really it's just a set of tools that have a wide ranging use. Of course, you also have to practice so you don't get rusty.
I think everyone who has a mind for problem-solving should learn to code. Heck, even people who aren't especially right-brained can use code to be creative and artistic. And sometimes it's just fun, damnit.
I've been hearing this argument a lot lately. Are you suggesting that if I wanted to learn to code it would take me decades to be able to build anything of value? To me learning to code is just like learning any other skill. Sure, you won't be good at it the first few months you start, but that is what practice is for. I don't see why in a year or two you wouldn't be able to build something of value. If it really took 10 years to be able to build anything useful, I doubt there would be that many coders. I can't really think of any skill that takes 10 years to become good at.
Highly disagree with the author. Not everyone who learns to code needs to go on to become a software developer or engineer. I actually really like the idea of coding being another skill like sewing, cooking or arts and crafts. Something that some people do professionally, but most people have a basic understanding of for when they need it.
As a developer, I agree with this. Huang may play a marketing game here, but he’s right. “Coding” isn’t something humanity needs to do, because coding in textbooks and coding in reality are two different things. Our IRL coding is a bunch of self-inflicted and cheered complexity and an artificial barrier that secures the jobs. AI has a very good potential to distill coding back to the essence and then some more. Not the current AI, but 2024 isn’t the last year of humanity either. So coding isn’t dying, but coding as we know it should die anyway. Should have, long ago, but it’s so compelling to just stay in a comfort zone of being a software developer. He claims everyone will be a coder, and that I find realistic. Of course people incapable of even the easiest levels of “STEM” will exist, but the bar will lower dramatically.
I do believe that everyone should learn to code, but not be a professional software dev. They are vastly different things. Coding can be just automating a task to reduce the human factor or making it more frictionless, with throwaway code of 100 lines.
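A hedged sketch of the kind of throwaway automation I mean (the directory and naming scheme here are invented for illustration): a dozen lines that rename a folder full of inconsistently named files so nobody has to do it by hand.

```python
import os
import tempfile

def normalize_names(directory):
    """Lowercase every filename in `directory` and replace spaces with underscores."""
    for name in os.listdir(directory):
        new_name = name.lower().replace(" ", "_")
        if new_name != name:
            os.rename(os.path.join(directory, name),
                      os.path.join(directory, new_name))

# Quick demo against a throwaway temporary directory
with tempfile.TemporaryDirectory() as d:
    open(os.path.join(d, "My Report.TXT"), "w").close()
    normalize_names(d)
    print(os.listdir(d))  # ['my_report.txt']
```

Nothing here is production software: no tests, no error handling, no reuse. It just removes a bit of human friction once, which is exactly the point.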
I disagree. I think with the right levels of exposure at the early stages of education “coding” can become a proficiency just as common as reading, writing, and arithmetic. Of course, just like those skills, people will pursue programming to varying depths, some will determine to study advanced topics while others will remain “coders at the 6th grade level”.
Not everyone will go into CS and develop new data structures, algorithms, or programming paradigms—but most people can and should be able to code solutions to well understood problems—just like mathematics this doesn’t mean everyone will know how to apply it in novel business situations, but just as most people can perform easy calculations they should be able to write straightforward programs.
It’s all a matter of the appropriate cultural and educational scaffolding. Programming is still in its infancy and hasn’t permeated through the education system deeply enough yet to enshrine it as a core proficiency, but we can and should head in this direction.
I tend to agree with this widespread statement of "coding is the new literacy".
I don't think that everybody should make programming their life's work, just like not everyone is earning their daily bread by writing. But given that quite about everything around us is programmable, someone who has no concept at all of how programming works could be left behind.