>I often think about technical literacy. Programmers are in this bubble where we think "oh, I'll just use grep and regex" and we can solve all sorts of problems. But 99% of people don't even know what those things are, much less have the foundation necessary to use these tools effectively. This sucks, because technical literacy does not seem to be growing at a rate that matches our societal dependence on these systems.
Even working in IT this is an issue. For something I used to do, we needed to take a bunch of input and turn it into a basic CSV before it could be used: for example, feeding a bunch of server names into some kind of script. People kept coming to me for help putting this together, because I was decent with a text editor. Eventually I decided to make a little website that let people dump in all their garbage data; it would clean it up and reformat it in whatever shape they needed. I figured one or two people on the team would get some use out of it. How wrong I was. It was being used hundreds of times per month. A couple of years later I was attending a demo for some other team, and someone in the demo used it to get his input in order before showing whatever he was trying to demo. Some of these seemingly basic things can be huge for enabling people to do their jobs effectively.
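(As a rough illustration of how little code such a tool needs, here's a minimal sketch in Python. It's my guess at the core idea, not the actual site: split whatever mess gets pasted in on a regex, then emit one server name per CSV row. All names and details are invented.)

    import csv
    import re
    import sys

    # Take pasted "garbage" input, i.e. names separated by any mix of
    # commas, semicolons, tabs, spaces, or blank lines, and emit clean CSV.
    raw = sys.stdin.read()
    names = [t for t in re.split(r"[,;\s]+", raw) if t]

    writer = csv.writer(sys.stdout)
    writer.writerows([n] for n in names)  # one server name per row

Run as something like `python cleanup.py < pasted.txt > servers.csv`. The real site presumably handled more output formats, but the point stands: a trivial amount of regex goes a long way for people who don't have it.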
I always thought I shouldn't have gone into IT but something else... anything else... and simply used computer skills as a superpower. The productivity someone with basic computer skills can have, compared to those who don't have them, is crazy.
> They -are- essentially tech illiterate once they step outside of google sheets, etc, or the web browser in general.
25 years ago there were people who were essentially tech illiterate as soon as they were put in front of a computer, not just when they stepped outside of Google Sheets or the web browser in general.
You might think that having grown up with computers, in a world where technology is easily available and part of pop culture, means nothing. But it has been years since I last had to explain to a grown-up that some computers are faster than others, that something available on paper can easily be made available on a computer (scanning, photography, etc.), what a server is, or that a website can become unavailable if too many people access it.
It used to be that writing for a non-technical audience entailed unthinkable effort for seemingly trivial things. Not just explaining technical terms. We had lists of words we had to be especially careful with because they also had non-technical meanings, and if someone wasn't aware of that, they could get the wrong idea. That list included words like "server", "mouse" and "window", and I've personally seen grown-up people standing behind imposing desks who got really confused upon hearing these words without the proper context.
Eight-year-olds who just learned how to read and write can pick up a word processor with far less effort than eight-year-olds (and non-technical fifty-year-olds) could twenty years ago. They understand, for example, how scrolling works. Teaching people how to use a word processor used to entail explaining things such as "you can keep typing once you're at the end of the page". You needed to say that explicitly because the interface didn't seem to offer any way to get another sheet of paper once you filled one up. People would write until they got near the bottom of the page, then proceed to stare quizzically at the screen wondering how they could get another one, because that's exactly what they did in real life.
I'm not saying that people who grew up with computers don't need to be told what TCP is and what a load balancer does. But they do have an intuitive understanding of things that a non-technical audience from twenty years ago didn't have. Including super basic things that you take for granted now, but it used to be that you couldn't -- that things from the Internet don't show up on your computer instantly, that how quickly they do depends on the quality of a connection, that computers need to be given explicit instructions each time (i.e. they'll always ask if you want to save your work before quitting -- they'll never figure out that you always say yes).
>>There is also some contribution from lack of technical literacy but honestly even some of my 30s IT coworkers lack that
My favorite explanation of this was an analogy to cars. Early drivers had to be mechanics, able not only to drive, but maintain and service their cars. Modern drivers just need to know how to drive, and often have no idea how a car works...
I think that maps well to computers. Early geeks had to know how to build, upgrade, program, network, etc. Current users just have to know how to _use_ an application and often have little idea of the 'behind the scenes' stuff...
> I have a very similar anecdote. I had a long conversation with a friend who is a high school science teacher. She told me that computer literacy has plummeted in the last ten years.
I suspect it is the same progression as any other new technology that undergoes mainstreaming. Take automobiles for example. In the early days if you owned a car you either made yourself something of an expert (and if you were an early buyer you were probably kind of an enthusiast already) or you hired one. Today outside of enthusiast circles they are just an appliance: you get in, turn it on, and go do whatever it is you need to do.
>I am a professional computer programmer. If I can't figure out how to reset a password on Skype, then what are the chances that a less technical person can do it?
The days of 'being good with computers' are gone. UIs change so rapidly that core skills are useless in comparison to daily use. You could be a linux guru with decades of database dev skills, but any 14yo youtuber will probably navigate a website better. I'm a lawyer. I know lots about tax law, more than most accountants. That doesn't help me navigate the tax office's website. My accountant is the expert there. (Actually, even she has someone who does the web stuff.)
>There's a ton of Stupid Computer Shit we all know, mostly from messing around on our own time
This, I think, is the core issue. Like most of us here, I started playing with computers as a young child, from a Timex Sinclair 1000 as a pre-teen to a Commodore 128 as a young teen, then an Amiga and beyond. It wasn't work, it was fun, a hobby. And more importantly, by the time I hit college, I had literally thousands of hours of practice just being immersed in how computers work.
I started tutoring other students in CS for some beer money, and it was a real shock to see how much otherwise very intelligent people struggled with what I considered utterly trivial questions. But the questions weren't trivial. They were only simple if you already had a complex, detailed, well-worn model of how computers work running in your head. Without that, even simple computer tasks may as well be written in cuneiform for all the good they do the genuinely new student.
I'm not even sure it is possible to take a young adult who is truly computer illiterate and have them succeed in a technical major. At least not in a standard 4 years. There is simply too much foundational knowledge you need to have before you can even begin the real work of learning what to do.
> I'm curious - how do you reconcile thinking they're very intelligent with thinking that they're utterly unable to grasp the basic concepts of computing?
Understanding computer systems requires both cultural and institutional knowledge. Not all intelligent people possess this knowledge. My grandpa is a great example. He’s a retired English professor with deep knowledge of literature and the structure of the English language. He’s still incredibly sharp, but he can’t use a computer to save his damn life.
> as an industry we'd rather tell ourselves that it's all so impossibly hard and beyond the grasp of the common person rather than admit that we just suck at explaining things and educating people outside our bubbles.
Using computers is very hard. I think you are underestimating the amount of time you and others have spent learning “basic” computer skills (email, word processing, web browsing, etc.) that many haven’t had the time or opportunity to build. I started learning how to use a computer when I was a little kid, maybe 5. Over the course of my childhood I built up those skills to the point where they became second nature and felt simple, even natural. These things take time and purpose, and many very smart people never get the opportunity or the reason to build these skills.
> How do you think people learnt to code before the internet? I borrowed large textbooks on Basic/VB, C++, and Turbo Pascal from friends and libraries. And this wasn’t a long time ago. I did this as recently as 2005.
I'm not denying that, I'm just saying progress was way slower, especially for newcomers, before the internet. Imagine trying to solve problems without Google or Stack Overflow when you're a 15-year-old kid trying to learn programming. You're stuck on some shitty installation of Linux or some missing package and have no idea why the compiler gives this error message. How do you even get Linux? I have no idea how they did stuff back then, but it was for sure harder just to get something up and running.
So easy to give up.
Programming in that era was something maybe 1 in 50 children tried. It was nowhere near as accessible as it is today.
The flip side of that is that since it's now easier for everyone to learn new skills and solve problems, the requirements and expectations from workers pretty much went up proportionally.
> Hell, History degrees at Oxford now have the option of learning how to use databases to store and query information with SQL.
We should for real start teaching the basics of query languages in high school. Just enough to demystify the subject for when "tech-averse" folks pragmatically need it in their profession.
I've suffered emotionally watching people from non-tech areas toiling with what, to us, are rocks and sticks. Folks who would undoubtedly benefit majorly from learning a tool don't do it because they have never had any exposure to the principles behind it.
We can't fix people's interest in tech being low, but we can introduce them to simple, helpful concepts early on so they are more accepting of proper tools for complex jobs.
Did this sound too exclusivist or arrogantly tech-centric? I didn't mean it to. I'm really interested in why things like version control aren't used across all industries, and I suspect it has to do with fear of command lines and inspection tools. A sketch of the kind of demo I have in mind follows.
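(For what it's worth, the five-minute demo could need nothing beyond Python's standard library: an in-memory SQLite table and a single query. Everything here, table, names, and numbers, is invented for illustration.)

    import sqlite3

    # A toy table: the kind of data people otherwise eyeball in a spreadsheet.
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE grades (student TEXT, subject TEXT, score INTEGER)")
    con.executemany(
        "INSERT INTO grades VALUES (?, ?, ?)",
        [("ana", "history", 91), ("ben", "history", 74), ("ana", "math", 88)],
    )

    # "Which students average above 80?" expressed as one declarative
    # sentence rather than a pile of spreadsheet formulas.
    query = """
        SELECT student, AVG(score) AS avg_score
        FROM grades
        GROUP BY student
        HAVING avg_score > 80
    """
    for student, avg_score in con.execute(query):
        print(student, round(avg_score, 1))

One query like that, shown early, would do more to demystify databases than a whole lecture on relational theory.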
> This gives the impression that not teaching kids to code is somehow equivalent to not teaching them to read. That is, of course, ridiculous. Coding is not the new literacy.
Except coding is like literacy. Not everybody needs to be a professional writer, but there are few jobs today where reading and writing aren't a necessity, and even fewer where they aren't a useful skill. Outside of work, reading and writing can be sources of pleasure and applied in many different hobbies and disciplines.
Coding is the same way. If you know how to do it, you may see ways to make your life easier by programming that someone who doesn't know how to code could never even imagine. Sure, lots of people do fine without the skill, but when reading wasn't taught to everybody, those people got by without that skill too.
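(A concrete, if invented, example of what "making your life easier" looks like: sorting a bloated downloads folder into per-year subfolders is an afternoon by hand and a dozen lines of Python. The path is an assumption; adjust to taste.)

    import shutil
    from datetime import date
    from pathlib import Path

    # File every item in ~/Downloads into a subfolder named after the
    # year it was last modified. Tedious by hand, trivial as a script.
    downloads = Path.home() / "Downloads"
    for f in downloads.iterdir():
        if f.is_file():
            year = str(date.fromtimestamp(f.stat().st_mtime).year)
            (downloads / year).mkdir(exist_ok=True)
            shutil.move(str(f), str(downloads / year / f.name))

Someone who can't code doesn't even frame this as an automatable problem; they just block out the afternoon.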
There was a time when using a computer and programming one were much more tightly coupled than they are today, but with computers more powerful than ever and users less savvy than ever, their magic is lost on many.
My (already born) children will learn to code, not because I want them to be developers, but because I want whatever job they have to be simpler and easier and more fun.
>> "But at some point in time, it will become a roadblock to not know how to write a basic script to automate some repetitive/remedial task."
Isn't that the opposite of what's happening? 20 years ago it was necessary to know stuff like that to make full use of a computer. Now we have tools like IFTTT.com or the more powerful Automator on OS X, which let us do those things through a GUI without having to understand how the scripts work or are written. Why can't we continue building tools so that people don't have to waste time learning the nitty-gritty? Computing has been getting easier and easier, so much so that 1-year-olds and 90-year-olds can use computing devices. Why does everyone seem to think that in the future we will all be coding, when that has been becoming less necessary as time has passed?
> everyone should know how to go about general problem-solving, but not how to code
General problem-solving ability seems like a synonym for fluid intelligence, which is not very malleable. Learning to code, on the other hand, is possible with effort. I learned to program in the 4th grade, with videos and books I bought myself, without having internet access. (I could use dial-up if I really needed it, but it was expensive and slow, so I used it very sparingly; I don't remember how much, but around 5 hours per month seems an upper bound.) I had no support whatsoever from anyone (except that my dad paid for the books and videos), my mom only let me use my computer for like 3 hours a week (shared between gaming and everything else), my computer was old and slow, and so on. Now, I do have a high IQ, but I doubt we couldn't get 20% of the urban population to basic computer literacy by the time they are 24 years old. Heck, calculus is known by more people than coding is. Most non-poor people waste 16+ years of their lives in K-12 and undergrad and learn very few useful skills. Imagine what would happen if we taught people a curriculum that did something other than pure signalling.
> On one hand, I understand that blaming the tool isn't a good attitude to have. On the other hand, my job consists in building tools for other professionals, and I feel like I have way higher standards for the tools that I produce compared to the tools that I use.
I think your view of this is reasonably balanced. There is that element of someone without extensive experience not knowing what they don't know.
Well, can we blame them for that?
Thirty years ago, probably not. Today, I think the answer could be yes. A few days well spent searching the web, reading, and watching videos can take someone from complete ignorance of a subject to a very good starting point from which to grow. Today there's information on almost anything anyone might want to learn, free and widely available. What, generally speaking, isn't widely present is the willingness and dedication to learn.
I have friends my age who stopped learning twenty years ago, maybe even sooner. They just don't care enough. Or maybe they thought they were safe and did not need to. In at least one case I know, that was a huge mistake. He started life as a field service engineer with great prospects. He never bothered to learn anything new. Today he sits in a trailer at an oil field 24/7 manually logging various pressures and temperatures multiple times a day.
I also blame the educational system for some of this. Maybe I was fortunate to have gone to school when I did. We started with assembler. Actually, machine language, raw 1's and 0's. By the time I learned C I had designed a few industrial control computers and fully coded them in assembler. The transition to C was very easy. And nobody had to tell me where the dangers were...because, coming from assembler, it was obvious.
> "Why does it have to be so complicated? I just want to install a program"
> "Why would you do that in the command line? It's way easier using $Program"
A concerning observation that’s slowly dawning on me is that more and more programmers don’t know how computers work. They can write code, build software, and do lots of useful things, but they have no idea how computers work. They’re more akin to “lusers”, as we used to call them, than to the hackers of old.
Fantastic at their specialty and the tools they use. But move a button to an unfamiliar place or throw them into a new (but fundamentally same) environment and they’re lost.
The realization came a few weeks ago when someone shared The Missing Semester of Comp Sci on HN. It’s full of basic things you’d expect any programmer to somehow magically know … but they don’t learn this anymore. https://missing.csail.mit.edu/
Seeing that link shared connected the dots in my mind. I’ve been wondering for months, “Why does everyone at work have so many random local environment issues all the time?” … it’s been working fine for me for years. Same code and hardware. ¯\_(ツ)_/¯
Maybe I’m just getting old and grumpy. I’m sure folks older than me thought “wow kids these days know nothing” when we came on the scene …
> Who hired you?
Pre-internet, nobody. Professional computer nerds did hardware. I was a screwup kid only interested in bad ideas from other screwup kids with modems.
> What did you work on?
The internet. We had dial-up BBSs and some pay services, mostly to distribute text files. I spent most of my time writing peer-to-peer networking, distributed file storage, and p2p content discovery.
> How did you Learn?
The book that came with the compiler and/or IDE was usually excellent and all you needed. I tried reading Dr Dobbs, but that was usually closer to hardware, and way over my head. There were all these books full of prose that I didn’t care about. I just wanted code to get things done.
> How did you fix issues?
Change. Compile. Error message. Repeat.
> How did you find talent?
Kids that swapped floppies at school. Started with txtz, then pr0n, then warez, then c0dez.
> War stories?
Lots of screwup kid stuff learned from txtz like making bombs and drugs. Password stealing worms. Eventually pioneered the click fraud worm at the beginning of the internet and that was the end of it for me.
> The real question is why do we let people to abuse technology instead of using it well.
I am (still!) amazed by the gap between how I see a computer and how most non-geeks around me do. To them, it's like a set of different, often frustrating tools rather than, well, an amazing piece of technology that can be prodded to do whatever you want (to the point where an iPad with few to no apps can provide them with all they need).
To me, there's a joy to finding just the right framework, app, or library that can do what I need done. To the point where I might end up enjoying the process a bit too much and get nothing done. To them, it's all about getting stuff done as soon as possible, with whatever they're familiar with. And since they find computers often quite frustrating, they will abuse the hell out of the little bits they know.
And so, in the same way that we would use a lighter to open a bottle if there isn't an opener nearby, they 'abuse' technology by using Excel for everything, or storing their notes in an open notepad window without saving. And as long as that works most of the time, they feel absolutely no incentive to figure out a better way.
Until, of course, it all goes wrong. Then they call us to fix it :-).
> how will people know that this is what they need to do?
Why has the tech community at large completely failed to educate people on the basics of something used by billions of people daily and affecting increasingly-important parts of their lives?
Is it because we keep infantilizing them the way you seem to be doing, pretending they are too stupid to learn anything new? Or is it because, in an effort to simplify, everyone simplifies in a different way, resulting in even more complexity? Or is it because tech really doesn't care and sucks at explaining anything?
A bit of all three (and probably more), if you ask me.
> That he's never met anyone in 10 years to tell him drag & dropping is not programming is one thing.
It's not drag and dropping, it's using a 'visual programming' tool that the environment encourages.
> The problem is not that he's using an IDE, the problem is he's not a programmer.
I wouldn't disagree, I like IDEs myself. I even use automatic completion in my word processor. The problem is that he never learned the basics.
> Bloody luddites clinging to their consoles.
It's funny how things always come full circle - search is the big thing now, everyone has an interface based on search. Windows 8, Linux, OSX, Android, iOS, everyone.
We enter http addresses into our browser, and search for things using (gasp) text, but using the command line makes one a luddite. Great logic that...
> Personally, I think there's a massive economic explosion waiting to happen in making programming tools more accessible to average people (Wordpress/Squarespace/Salesforce/etc are beating on that door, but no one knows how to get through it yet).
If you look back, that was always the hope when personal computers got introduced and has been ever since. It never seems to pan out: BASIC on micros didn't do it, Excel/FoxPro/Access didn't do it, HTML didn't do it (RIP Geocities).
That's what worries me. There was a time when I could write low level C and even some assembly. Similarly there was a time when I could explain the ins and outs of IP packets and routing. However, none of that is what the market wanted and I don't remember any of that anymore.
Heck, with most of my work involving running around trying to hold understaffed systems together and juggling the unreasonable demands of management, I don't even have the programming and web development skills that I used to.