The Distribution of Users’ Computer Skills: Worse Than You Think (2016) (www.nngroup.com)
47 points by _Microft on 2023-07-06 | 63 comments




A related anecdote: I've observed that computer skills are dropping, and dramatically, in each successive generational cohort. I'm a millennial, but I was fortunate enough to grow up with parents who had no idea about tech but gave me tools and manuals to learn from. Most of the books I read before the age of ten were computer manuals; they were in the trash and easy to access. Tech was also fragile in the 90s and 2000s, so the fiddling led to a deeper understanding.

In stark contrast are gen Z and gen alpha (?), where many I've interacted with do not understand folder structures, startup sequences, or even how to type with more than two fingers. They are excellent at typing on a screen with their thumbs, but writing an Excel macro? Completely alien. Navigating folders? A struggle.

As tech matures, skills are dropping, and alarmingly so.

Here is the original report: https://www.oecd-ilibrary.org/docserver/9789264258051-en.pdf...


The differences between gen Z and older generations when it comes to computers are interesting. I'm a cusper (millennial/gen Z), so I grew up with computers, just like my parents, who are gen X, so we all understand them just fine. But then I'll help out someone who can't be any older than their late 40s, with a $2,500 laptop, who almost started crying when I opened a new tab in her browser because she thought I had deleted it. She didn't understand the concept of tabs. Or just last year, when I met someone who had never owned a laptop and had returned their school laptop and bought an iPad Pro instead, because they understood it.

It's so strange. I feel like I'm in this weird middle ground of understanding desktop and mobile OSs just fine, while the older generation and the younger generation are each siloed into one or the other.


As someone on the cusp from X to millennial, I get mobile fine, but the loss of screen space is why I'll never prefer it. And this is made worse by the natural degradation of eyesight as I get older.

I did have to make an intentional effort to become comfortable on a mobile device, though.


I find I'm almost mad when I have to actually do something on a mobile device. I can scroll through feeds just fine but if I need to start finding answers to questions, or editing text, it's just obnoxious and I'm mad the whole time.

I can type reasonably fine, but anytime there’s a skeuomorphic control (like a joystick) it all goes to hell. The screens are too small and my thumbs too large to get anything resembling accuracy. And so many huge developers are unwilling to put in the effort to support external controls.

Smaller fingers and better eyes are what’s required, I guess. So, not me.


Meanwhile I have small hands and have trouble with modern phones being too big. Have to use them two-handed or risk dropping the phone as I shift my grip around.

I can only one-hand my "standard" sized iPhone comfortably with a pop-socket. I love that they make those as mag-safe attachments, as an aside. Otherwise, I'm with you on two-handing it, even with my large hands.

On the editing-text front at least, I recommend one of the soft, foldable or rollable Bluetooth keyboards with a phone mount. I walked across India and Nepal a few years ago, writing posts and software whenever I hit hostels or hotels, and it worked quite well. There are some fairly lightweight (and fairly cheap) ones that also have touchpads, which may help with the feeds and finding answers.

We make products that intentionally hide any aspect of computing from the user, make up branded terminology for what they do, and vaguely threaten people who dig into them, or try to use them in ways they weren't intended to be used, with the police and financial ruin. We do this in order to lock them in. Over the past couple of decades we've done our best to keep users ignorant and fearful of switching from our products, because all of their skills and data will be lost. I think it started with the "ribbon" in Office, but it was full blown after the introduction of the iPhone (and Android, later).

A reckoning is coming. You can't intentionally make an entire generation stupid and not feel the consequences eventually. Who's going to teach their kids?


That's because our goal is to make it easier to use. A calculator also hides how multiplication works from the user; it just provides the result.

> A reckoning is coming. You can't intentionally make an entire generation stupid and not feel the consequences eventually.

There are more things to learn at school today. They teach feelings, for a start; that didn't happen in my day, which perhaps explains why skills in computing are dropping. Not only that, people are no longer only concerned with what part of town they are going to live in and settle down in; now they consider what part of the world they want to live in. The constraints of family and school friends, region and country are being battered by globalisation, which is mainly being driven by the internet.

Watching the BBC news here in the UK, you could be forgiven for thinking the population of the rest of the world are just criminally insane terrorists, but that's how the UK press portrays the rest of the world to the British people.

I think they are using Project Fear to keep people here, in a desperate attempt to maintain GDP amongst other things.

So all in all, besides computing, today I know more about finance and biology than I would ever have learnt if I didn't have a job that gave me easy access to the internet, and the GPs hate me for it. When I have to deal with them, rattling off info is like a game of Top Trumps, until they lose and storm off! But they only know what they have been spoon-fed. Look at Wimbledon champion Andrew Murray: if his nutritionists and the NHS were any good, he wouldn't be the only player at Wimbledon playing tennis with an artificial hip, and yet for some mad reason people think that's a good thing and something to be proud of!

I'm sure the women portrayed in the Hollywood film Hidden Figures have probably lamented how the skills to land a man on the moon were lost to mainframes. It's a human trait to point out problems and then discuss them, but when will we be discussing these things with AIs and getting credible advice back from the code?

Is it happening right now?


When you don't have to struggle to learn the basics of how to use something, you won't develop the mental model of operation you can carry forward.

We (late X, millennials) did have to struggle just to operate a computer. We had to build a mental model to work around inadequate tools in the OS and beyond.

Younger people have had adequate tools to interact with computers, so they've never been forced to build that mental model - to build an intuitive understanding of how a computer works. So, yeah, it's not too surprising to me.

They can absolutely learn such models, but it's going to take more effort and intentional digging for rough spots to do so. For the most part, it's not like we can just hand them a PC running Windows 95 and say "go for it."


This is an interesting angle.

On the topic of tools, I want to add something I rue: the loss of printed instruction manuals. My iPhone can do a lot and has many features, but most are useless to me because, one, there is no printed manual I can use (a screen is a sub-par replacement) and...

two, the discoverability is terrible. With more things moving mobile-first, the discoverability of features is just gone. There's a delicate balance between the capability of software and the ease of use. TeX comes to mind as a good example.


> the loss of printed instruction manuals

Even digital `man` pages would be nice sometimes. Not the watered down "Help" pages.
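
(For the unfamiliar, even something as mundane as this, on any Unix-ish box, gets you more real documentation than most modern "Help" panes; rsync is just an arbitrary example:)

    man rsync          # the full reference for every flag, in one place
    man -k network     # search installed man pages by keyword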

> With more things moving mobile-first, the discoverability of features is just gone.

I'm on the fence about this one. Curiosity will drive a lot of discoverability, but the "locations" of features change so quickly that all your prior discovery is rendered moot.

"I know this feature exists but can't find it," is little different from "This feature doesn't exist."


Consult the geniuses (no pun intended) that patrol the Apple help forums!!

But yeah, it's terrible. Whether intentional or not, ignoring "power users" because they'll figure it out regardless, while coddling the rest so as not to overwhelm them, sucks ass.


Fixing houses, crafts of different sorts, knitting, cooking, etc. have always been an old person's game. Computers will be the same. There was a moment in history when young people were on top, but in the future, if you have trouble with tech, your best bet will be to find the oldest computer person you know and ask for help.

It's insane. I don't wanna generalize, but anecdotally (I'm gen Z, or whatever is younger than millennial), it's staggering how illiterate most of my peers are. Several times I've explained memory in the very simple "you ever have a bunch of tabs open, Spotify, and some game, and now it starts buffering and your fans go nuts" way, and it's enlightening to them.

And not even having the willingness to learn the basics of these tools you use for hours daily breaks my brain. The profit optimization that tries to drive the barriers to entry for usability into the ground is a big problem too. It removes the incentive, and the requirement, to learn about it for those who would otherwise have needed to.

And personally I can't stand using my iPhone, while my generation seems perfectly content navigating pop-ups and unblockable mobile YouTube ads and profiling and no customizable workflow, etc. I find it sad as fuck that tech is trending toward the masses blindly accepting what's served to them, but I guess that's the way most things have gone.


I think this is interesting, and I come from a background like yours: building machines and discovering our router had wifi via a Sony PSP. Some thoughts: 1: Boomers and gen X kind of screwed later generations with incredibly awful computer education in schools. Most schools are basically marketing firms for Google and Microsoft Office. Kids are just not taught the basics at all and are instead handed endless document tasks that are absolutely meaningless to real life.

2: One can argue that all of what you define as 'tech skills' are actually just legacy thought processes that have gone by the wayside. Folder structures are often silly on modern devices with great search. Startup sequences don't need to be known because modern hardware solves its own problems when things go wrong (ChromeOS, for example, can just re-image and restore files/settings automatically on boot failure). They type with their thumbs because the full-size keyboard is no longer relevant and, once again, schools don't teach it. Microsoft will gladly have GPT-4 write a macro faster than you could.

3: I would also bet that later generations are better at some things, notably handling the social web, personal image, and expecting to deal with scams and fraud. Our generation got thrown head-first into a world where everything is there, always, and where your past follows you. They grew up with it; it is native to them. They know to be more careful about sharing. Their bullies sit at their dinner tables and on their nightstands, because the internet and social networks exist after school hours. General smartphone-use skills are likely better. Gen Z is also, according to other reports, less likely to smoke or drink and generally more willing to make stable life choices.

It's easy to point the finger at the next generation, but I think A: the problem tends to be the prior generation's fault. It is wild that so many generations will say 'kids these days' when it is their own kids. And B: the lens through which the generation is judged is not actually fair. I would also note this: the generation before us would argue that we don't know how to drive stick, read an analog clock, or balance a checkbook. The world moved on.


>Startup sequences don't need to be known because modern hardware solves its own problems when things go wrong (ChromeOS, for example, can just re-image and restore files/settings automatically on boot failure)

Yes, because nothing ever goes wrong or needs to be changed to fit your own process. Silly user, just do it my way.

This type of thinking is anathema to developing computer skills. Computer skills are the ability to decompose work processes into logical sets of instructions, such that you can get artfully arranged sand to do the work for you. That still requires knowing, at some level, how that sand works, and the thought that went into the tools you use.

I will never use an Android/iOS phone to the level I use my Linux systems. Why? Because the people who architected Android specifically did so with their own interests, and not mine, in mind.

When you need to read an asinine amount of literature to realize the mofos did everything possible to keep you away from the hardware, to save themselves from needing a "support division" because they weren't interested in empowering you beyond your capacity as a consumer of their service, it gets to the point where I reject anything going on under that paradigm as legitimate computing, tbqh.


Eloquently put. And regarding the iPhone, it's absurd that one needs to jailbreak it simply to access one's own phone's file system!

How could the scale be reasonably expanded? "Level 4: can create the tools needed to solve the task"?

PS: shout-out and thanks to ben_w who mentioned this article in the discussion about the Algerian internet shutdowns.


“Can open the command line”

I was tempted to expand that with "without pasting a malicious command in it", with Discord users being tricked into doing this in their Chrome console as the example.

However, most of HN is curl-bashing unverified scripts into their computers and servers...
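
(For anyone outside the shell world, the pattern in question looks roughly like this; the URL is only a placeholder:)

    # the convenient-but-blind way many install guides suggest:
    #   curl -fsSL https://example.com/install.sh | bash

    # the slightly less blind way: download, read, then run
    curl -fsSL -o install.sh https://example.com/install.sh
    less install.sh
    bash install.sh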


Lol, true, though that might be one step above. Anecdotally I know a lot of people who don't even know what a command line is, so that's why I went with that.

I was quite surprised to read that about a quarter of the participants weren't able to delete an email in their email app (the "Can't Use Computers" group). This was 2016, though, so maybe things have changed drastically since then. Does anyone know of a related, but more recent study?

I wouldn't be surprised if it got worse.

Most people are on phones now; who uses computers if not for work?

But I guess one could argue that using phones would be the same; even then, my personal experience is that most people don't know much about phones either.


Phones are pretty bad for the group who are scared of breaking something and so don't just try the first thing they guess.

Delete: maybe tap and hold, then three dots or an (i), maybe swipe right/left/either.

Close an image: probably pull down from the top but not always.

Undo: Shake the phone like an etch-a-sketch???

Whereas Outlook Express, for example, looking at screenshots, had a big red X with Delete written underneath it in the toolbar. And your mouse had a physical button dedicated to stuff that's now hidden away behind long-press or swipe or the (i) or whatever.


I had the same perception, but if you read the paragraph closely it says that folks at “Below Level 1” can delete an email, and that’s about as complicated a thing as they can do.

I was confused by that, too. But I think there are two classes here: The "Below Level 1" group, that is capable of deleting emails, and the "Can't Use Computer" group, that is unable to do so. The latter group contained 26% of all participants.

Ah very good, now we are all cleared up :)

I expect that this includes people with disabilities (physical, learning) that prevent computer use, and that it’s a large number of people?


Does anyone know of comparable, up-to-date research?

Even amongst engineers, some concepts cannot be understood until you decide to "implement a replacement" from scratch. Around 60% of the way in, you figure out your design and, coincidentally, understand the design of the thing you previously didn't understand.

However, during this journey you may discover a better method. The other group also came to realise this, but since they had invested so much into the original design, they continue down that path. True refactoring only happens when disgruntled former employees decide to quit and re-engineer the solution; this time they create a better product.


Totally agree, and well-run organizations will give new teams the authority to rebuild existing products, learning from the mistakes of the past while documenting their decisions along the way.

Sometimes it's better, sometimes it gets bogged down by the "second system effect" and is worse lol

It's the tools that suck, not the users.

If you have two folders open in Microsoft Windows and drag an item from one to the other, there are three possible actions (as of a decade ago) that might happen, according to a set of rules the user can't be expected to know:

It might copy the file.

It might create a shortcut to the file.

It might move the file.

This is just a tiny example of the inconsistency in one of the oldest and most stable parts of the desktop UI. To counteract this, I always taught right-click drag-and-drop, which lets you actively select which action should happen.

Throw in a phone, or the internet, and the UI du jour forced through the web... how dare they cast aspersions on the users. It's not their fault.


That's fair to say! But I think the takeaway is less about "who is to blame" and more about "what should you ask of users?" No matter what the precise cause, most users will struggle to do anything complex, and you should design your products with that understanding.


> there are three possible actions (as of a decade ago) that might happen, according to a set of rules the user can't be expected to know

At least the icon changed based on which action was going to happen. Knowing to look for that icon change, on the other hand...


You can also force a specific action by holding Shift, Ctrl, or Alt before dropping (Ctrl forces a copy, Shift forces a move, Alt creates a shortcut).

Or by dragging with the right mouse button.

How is someone who is not interested in computing supposed to know this?

That interface isn't designed for them. Not everything has to be for everyone.

The attitude that an interface has to work for novices as well as experts is one of the reasons that interfaces keep getting worse.


I found it by accident. Either by noticing the icon changing as I hit modifiers, or by coming across it in a thread just like this one.

As noted above, the discoverability of this function sucks, so "accidentally" is about the only real way to find it.


>according to a set of rules the user can't be expected to know

This is consistent, it's just never properly communicated. If the source and destination are on the same drive, dragging executes a move by default; if the source and destination are on different drives, dragging makes a copy by default.

macOS has this problem too, where you just flat out can't cut/copy/move files with the same shortcuts and verbiage you'd use for text (or the option in the menu is greyed out for no reason).


What's a drive? And how, and why, can a user be expected to understand this?

Up until about 2010, this was basic user knowledge. That's like asking "what's a web browser?" Ironically, in 1995 more people would have known what a drive was than what a web browser was.

What is this actually measuring though? Half of what makes these tasks difficult seems to be the lack of an explicit target. Failure in that task suggests more about critical thinking and reasoning skills than "computer skills", which is suggestive of something much more specific...

> You are not the user

This is where you will find your true 10x'ers. The developer who is so intelligent and experienced that they can accurately emulate an entire "level 1" user in their own brain for purposes of testing a UX/UI concept. Putting both a really clever design and a really stupid user into your head at the same time is very challenging. But, if you can figure out how to pull this off, you can rapidly arrive at designs that are intuitive for all without a bunch of painful iterations.

I've definitely seen the 10x who can't do this. Their UI design concepts usually turn out to look like something from a 4X game (and not a 'casual' one like Civ 5). A user interface that exactly one person could enjoy using. About as clever as it gets, and equally worthless.

Try constraining the hell out of yourself next time you build something that a typical person would use (i.e. any B2C product). Ask things like "If I didn't know how to read, could I still find my way around this interface?". Clearly, this is too low a bar to effectively use some applications, but simply trying to aim in that direction over and over can help evaporate the more "clever" concepts in favor of your less skilled users.


Why is the developer responsible for UI design?

I certainly think developers should be involved in UI design, but I don't think they should be the main decision makers in most cases, precisely because "putting both a really clever design and a really stupid user into your head at the same time is very challenging," and I wouldn't expect it to necessarily correlate with programming skill.


The developer must be fundamentally responsible for all aspects of the final delivered solution. Diffusion of responsibility will otherwise tend to lead the project towards a non-ideal user experience. If UI design is "someone else's problem", I might be tempted to take certain shortcuts at code time (e.g. assuming there is only 1 view of the data model - how could I know? Not my problem area).

Not saying you can't employ a UI designer who does this full-time, but the final software product and the entire experience around it is still the developer's full responsibility. With this in mind, perhaps we'd like the developers to be more involved with UI design meetings?

All of this gets me back to the front-end/back-end meme that seems to have emerged over the last decade. Same sort of problem. All you do with this "not my problem" game is make your own life harder the next iteration.

So to be clear - I am not saying you have to do the whole product yourself, but I am saying that you need to care enough about it or it's going to turn out like shit. Writing software that people use is still decidedly an art form. Imagine paying to go to an art gallery filled with pieces created by artists who absolutely hate to make art.


I would not put the needle that far over. The person responsible for the total user experience and the PM are responsible for the overall design.

The developer is not.


> I wouldn't expect it to necessarily correlate with programming skill.

What is programming skill? For loops and writing a recursive function?

If you ever talk to a lawyer, they're not just robots that regurgitate the law; they can give you good advice on mediation and alternative means of resolving conflict. They'll ask you things like "are you sure you want to go through with this, this is what it entails", etc. They don't just go "I only know the law, I don't have a background in psychology, anthropology, sociology".

If you're a salesperson, you should know something about the product you're selling and have some communication skills. They don't just go "I am only here to show you the car we're selling, I don't have a communications degree, I don't know how to offer you a deal".

Sometimes it feels like only in software are people so coddled that they think this stuff is normal. People will hire with leetcode as a strong signal and have stack ranking, and then somehow think that the stereotypes of devs only being good at logic puzzles and lacking human understanding are true.


"Ask things like "If I didn't know how to read, could I still find my way around this interface?". Clearly, this is too low a bar to effectively use some applications, but simply trying to aim in that direction over and over can help evaporate the more "clever" concepts in favor of your less skilled users."

This is very useful for people who are developing UX/UIs for the older population.

A quickly growing segment, where a lot of the users want to do things like use tech to keep in touch with their families and grandkids from a nursing home, and they may not be as sharp as the average user of tech products...


Users are not stupid. They just don't share your world view.

One of the biggest problems people have using computers is fear. It impedes any exploratory activity, and therefore learning how to use a computer. The first thing everyone has to be taught (again and again) is: try stuff out and don't be afraid to break things. It's just software. (Taking care of your data and backups is the topic of another lesson. :-)

The issue is that you have the skills to fix it, but if I, as ignorant user A, break something, I can't fix it. So I've just bricked an expensive piece of equipment that I may not be able to replace, and I may not even know who to talk to in order to get it fixed; and when I do know who to talk to, it's the neighbor who works in something called infosec, and I think that's computers, but why is he so annoyed when I ask for help?

Or, I can just leave it alone and not tinker with it so that I can still play my match three game and send funny pictures to my sister.


>try stuff out and don't be afraid to break things. It's just software.

When I was taking 3ds Max in college, my professor said something very similar: if you want to try something, save a copy and try it. Nothing lost. I've learned quite a bit in both 3D modeling and programming by not being afraid to break things.

Unfortunately that's not really an option for most business software, and it's also not always very apparent what various functions in business software actually do (in terms of secondary implications). My first programming job was developing tools to ingest reports from various front-of-the-house systems into back-office accounting systems. I would have users sit down with me and go through what reports they used and what data they wanted, and then pull the sources for that data from the system.

It's amazing how many users would not know how to do much of anything beyond clicking exactly where they expected to in order to perform the half-dozen or so day-to-day functions they needed, to the point of not knowing where individual reports were, because they were all run at the end of the day as a report package rather than on demand. Even clicking on menu items to see what the options were would put people on edge. Running reports should be an idempotent operation, but there would still be reluctance. That being said, there were also reports that did have implications when run, as they would either close things out for the day or reset various counters.

That's a long way of saying "just break things" isn't really good advice for users. Fixing things takes time and skill, neither of which the user might have. "Breaking things" might have far more significant implications than just going back to your last saved file. It also implies that the user knows when things are "broken". The longer they go without realizing things are "broken", the more likely there will be problems and the more work will likely be lost. And that's assuming they ever realize things are "broken" at all, which is a point some users may never reach.



I've recently been looking again at microSD cards at the popular electronics online shop in my country.

Every 1-star review I have seen is because the customer was unable to make the card show up on their computer.

I have to consciously remind myself that it's almost always because they don't know how to format a storage device, not because the cards are faulty out of the box.
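
(Not that I entirely blame them; even the "simple" fix assumes a fair bit. On a Linux box it would look roughly like this, with the device name purely hypothetical and double-checked first, since formatting the wrong device destroys data:)

    lsblk                        # identify the card, e.g. /dev/sdb with partition /dev/sdb1
    sudo umount /dev/sdb1        # make sure the partition isn't mounted
    sudo mkfs.exfat /dev/sdb1    # reformat it as exFAT (erases its contents)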


I learned to drive a car in the 80s when stick shifts were still common (and preferable for poor people like me because they got better gas mileage and cost less to repair, something you might need to do a bunch of yourself) so I learned on a manual transmission because that was the smart game to maximize my options.

I'm glad that today most kids will never have to deal with the fear of sitting at a red light on a steep incline, just knowing you're gonna spin the tires, stall the car, or roll back into the car behind you (while your girlfriend eyes you from the passenger seat). And not just once, but dozens of times over the better part of a year, just to get to school or a friend's house or my summer job, while I got used to that transmission.

Today, my car's engine compartment is hermetically sealed and it practically drives itself. Would a kid today be able to answer my questions about driving a stick, or demonstrate how, or pull a tranny, open it up, and swap in some junkyard gears? Probably not, but I don't think there's anything wrong with her not having any of those skills. In fact, I think it's wonderful that she should never need them.

Computers, the things I grew up with (boxes you could build or easily modify to make neat things happen) are lost, just like my then new-to-me but ten-years-used 1980 Honda Accord 5-speed. Sure, a few enthusiasts may still buy hot hatches with a stick today, but even those vehicles will be gone in just a couple more years.

The computer, with its user-serviceable hardware and software, is dead; long live the iPad.

