This comment fails to see the real reason that Apple, Google, and Microsoft lock down their hardware and software: ease of use.
A vast, vast majority of the population couldn't care less about learning about interrupts, tinkering with BIOS settings, etc. They want a device that is easy to use, and as simple as possible. Companies therefore abstract away 90% of the "creativity-inducing" components, because a computer is a lot less intimidating when the user clearly knows what they can and can't do with it.
Apple and Google are not sitting in a meeting room pitching the best ways to stifle creativity. They are selling a product to a population that wants their highly complex device to be as simple as possible.
From the earliest days of computing, hacker types have regarded ease of use and simplicity as indicative of stupidity and as a "dumbing down" of computing. This mentality survives to this day in the fetish that so many have for complexity and the fact that UI/UX is often the last thing to get attention in designs led by programmers.
Apple meanwhile took the opposite stance. Computers should be easy to use. Complexity is bad, especially at the UI level. Things should "just work."
Calling bullshit on the sentiment in that Jargon file entry above made Apple for a time the most valuable company on Earth.
The tax we now have to pay to them is a dumb tax for the cult of complexity and "real men do it manually" bullshit. To break the App Store and Apple semi-monopoly, start not by criticizing but by asking why it is so successful. Only by answering that question will any challenger including FOSS be able to successfully compete with it.
The entire point of computers is to make things easy.
The Apple question "why isn't this easy" is missing from 95% of UX in modern software. Stop acting as if software devs are the only users of software. And even then: computers should do work and not produce work or mental overhead.
That's exactly the mentality that keeps 'users' stupid.
Apple is basically saying: "users are stupid, let's protect them from thinking." Imagine if we said that about our kids: "Our kids don't know how to handle the real world. Let's protect them from having to deal with it." (never mind that most parents do in fact go through this phase).
The reality is that users are just kids who haven't learned how to use computers. Two factors make this hard: most computer software is generally poorly designed (speaking in terms of the number of poorly designed products vs the number of well-designed products), and most software is not designed to teach users how to use it.
Apple is busy buying fish for starving people. The company that teaches users how to fish is the company that will win big.
Most people don't want to reprogram stuff, they don't want to customize stuff. They want things to "just work", they don't want to choose which button controls the car windows and which knob controls the bass. This is the job of the designer. Most people want a finished product.
People don't want to "rewire" their products, they don't want to hack on this kind of thing. Except for geeks and technical people, who can also handle normal apps and interfaces.
The analogy with the Internet breaks down because the Internet is mainly successful because it connects people. That's what people care about, other people. The telephone was a success because it lets you talk to other people.
Unless this thing helps people deal with other people, it's too complex and uninteresting for the masses and probably too simple and tedious for technical people.
The problem isn't that most people don't care to learn, it's that platforms become more and more locked down to the point that people can't learn. Or even if they can, the device is so "sealed" that they can't dive in even if they knew how.
A big part of the reason I got into computers and software development was because I wanted to know how things worked. And I could do that: I could dig into things, write simple programs, take things apart, put them together, all that. Young minds these days have fewer opportunities to satisfy any curiosity they may have about these sorts of things.
That's not about keeping users stupid, it's about not making them worry about stuff they should not worry about.
The user may be the world's best neurosurgeon; does that mean they have to learn about filesystems?
most computer software is generally poorly designed
You are right about this one. But the thing is that Apple does exactly that: offering well designed software.
It's not about buying fish, it's about hiding unneeded complexity. How do you drive a car? Press the gas and it goes, press the brakes and it stops. Turn the wheel to the right and it turns right. You need zero knowledge about what's going on under the hood.
Now take the iPad: tap an app and it launches, press the home button and it stops. Swipe to the right, swipe to the left…
I think the big part of the problem is that people don't know what their device or software can do and what is possible to do with it.
Once you show people how to use a certain feature they get on quite easily.
An insane amount of productivity is lost because a lot of people don't feel comfortable with their digital tools and think they have to live with whatever the default settings are.
I'm not saying this is the case, but I think maybe the difficulty the tech crowd has with these new "easy" interfaces is exactly that: they are easy. If everybody can easily use a computer (tablet, phone, etc.), never loses their files, never picks up a nasty virus, then they don't need us anymore... We lose our exalted position of power as one of the enlightened who "know".
Don't get me wrong. I regularly drop back to the command prompt, and hate having the flexibility of a full system taken away from me. But I also hate that feeling I get when random relative calls up to ask for help with their latest tech disaster. I hate it that configuring operating systems so that they are reliable, robust and functional is almost impossible. I hate it that the operating systems still get in the way of just doing the job.
We're hackers, hobbyists and enthusiasts, we want the computer to be in our faces, that's what we do. The vast majority of users however, hate it when they have to put up with the crap we think is cool just to read an email, or write a document.
Can't remember who the quote is from, but it goes something like this: technology goes through three stages,
simple with limited functionality,
complex and unreliable,
simple and reliable.
I have a feeling that we have started to reach the end of the second stage.
"Tinkerable" and "steep learning curve" are orthogonal.
In fact, what the article's author is decrying is that this thing without a steep learning curve is completely un-tinkerable. That, per his argument, is what will cost us some chunk of the next generation of programmers -- that, in pursuit of ease of adoption (and/or control), Apple has lopped off tinkerability.
You're absolutely right: if I want to tinker, I have more options available to me today than anyone ever has before. But it's not me that's being cut off from tinkering. It's the novice computer user, who's only beginning her journey of discovery into the possibilities these incredible tools can offer -- because the tool she has in front of her, as it's been given to her, explicitly excludes those possibilities. She doesn't know what Arduino is; she doesn't even have a concept for it. As far as she's concerned, FPGA is something a golfer might join. Those are things for people already at least knee-deep in tinkering.
All she knows is, "Wouldn't it be cool if my iPad had ... ?" or "Wow, I wish I could ... " And the tool she has not only gives her no ability to explore those possibilities, it looks like it's designed to actively impede her from exploring them.
The point remains. Having to jump through a bunch of hoops is the opposite of user-friendly.
And to take the argument more broadly: how many of us here on HN became interested in computing because we screwed around on our PCs as children? What about today's children, who get a smartphone, a tablet, and maybe a Chromebook? What are we teaching them?
This is an excellent and understated point you've made.
It may be that the trajectory towards 'simple and accessible' has crossed a line, one that has itself moved due to people's increased knowledge of computers. It's moved, and it's stepping on people's toes. The vendor lock-in strategy via their ecosystem trappings just amplifies the perception.
Not the poster you were replying to, but I’ve come to the same conclusion and here’s my not-particularly-rigorous reasoning:
It seems to be equal parts users stuck in a local maximum of computing skill and the lax software engineering standards that this enables.
When's the last time you sat down with a user who isn't remotely interested in tech and watched them work on or use a computer? Most of the population's mental model of a computer is starkly different from the average Hacker News reader's. You can still hear the same complaints about how computers "don't do what I want them to do" that I remember my parents' generation making, and they were experiencing the first waves of computerisation in their offices.
The story became that the older generation just couldn't understand the new generation, but that kids are amazing with computers because they're growing up with them. Well, some of those kids are just as hopeless. It's partly an education problem (it's hard to learn computing from a teacher who doesn't understand it themselves), and partly because UI design trended towards simplifying everything as much as possible so that users who don't understand computing can still enjoy and use their devices. Now there's not a great incentive to learn more than you need to use the UI you're given, and computing skill tends to get stuck in this local maximum.
I won't go on about my other point in detail, as it's a perennial favourite for Hacker News discussion. But hardware gets faster so quickly, and our software is so hastily thrown together that it eats up all the gains. Users don't notice that the software they're using is crap because their mental model of computing isn't developed enough to know what's happening. Instead we get this casting of devices as somewhat malevolent entities ("ugh, my stupid computer keeps losing my stuff. I need to buy a new one that isn't so dumb").
We used to think this would be resolved with time and generational change, but it seems like there's just a more-or-less static percentage of the population that doesn't get computers. (Which is completely understandable: people have different interests, and it's hard to inculcate an appreciation of something in your entire population. Look at people's relationship with mathematics.)
People in this age are expected to read and write, but using a computer in powerful ways is considered magic and unobtainable, this is to everyone's detriment.
I think that's just it: it seems (and for most people, actually is) far harder to learn to work a computer in a powerful way than it is to learn to read and write. The number of facts you have to know to properly administer a modern OS is staggering. One could reasonably argue that the iPad has gone too far in the "easy" direction, but it seems difficult to argue against the idea that computers thus far have been "too hard". If we can provide the ability for people to achieve powerful results with their computers without having to learn the intricacies of their OS/hardware, we're doing good and unleashing precious talent. Every minute spent (by anybody who's not a sysadmin) dealing with file system errors and driver conflicts is a minute not doing productive or creative work.
It's an industry generality. Aesthetics over usability, imagined users over real ones, minimalism over functionality, naive users over power users, wizards over tools.
If you want something used by millions of people, you have to make it as simple as possible. Insisting that people learn about the invisible internals is more likely to put them off using it than to result in them learning.
Simplicity is what strikes the brightest of minds. Some prefer simple shortcuts and a ton of shoddy software that prevents the user from ever learning their tools (git, configuring an IDE); others prefer something they can master over the years, that is standard in the industry, and that doesn't induce tons of uncontrolled behaviours.
People don't have enough time in their life to learn about everything they touch and need to say no to some things to get anything done.
Maybe those professionals want to learn tools for their actual job instead of fighting with the platform underneath. Just like artists want to draw, not learn how to fix their graphic tablet's broken driver.
And I know how to do things from the command line faster than some people using a GUI, but this elitist way of thinking needs to stop. Consumer devices and their features should be easily accessible to everyone regardless of their tech skills.
That's why Apple is a multi-trillion dollar company. Because people want the easy way. If things require extra apps, extra steps and reading tutorials/instructions to use, you can bet most people will stop right there.
I'm not disagreeing with you, I'm just saying how things work for the masses.
History has shown that user experience is the most powerful force in computing, at least at the consumer level and increasingly in other areas too. People will trade privacy, security, cost, freedom, openness, and virtually any other quality for ease of use.
I hypothesize that the reason for this is time poverty, not lack of expertise or desire to learn. People are more time-poor today than even 20 years ago. Even people who know computers well and could figure out how to use more DIY systems do not have time to do so.
I have a rule of thumb when designing systems: each step required to install or use something halves the number of people who will try it. If 1000 people discover something with a 10-step install, only 1-2 of them will actually try it. Remove a few steps and that number doubles a few times. Most successful "viral" apps have three or fewer steps.
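That rule of thumb can be sketched as a quick back-of-the-envelope calculation (a toy model: the function name and the flat 50%-per-step drop-off are my assumptions, not measured data):

```python
def users_who_try(discoverers: int, steps: int, retention_per_step: float = 0.5) -> float:
    """Toy model of the rule of thumb: each install/usage step
    retains roughly half of the remaining audience."""
    return discoverers * retention_per_step ** steps

# 1000 people discover a tool with a 10-step install:
print(users_who_try(1000, 10))  # 0.9765625 -- barely one person tries it

# Remove three steps and the audience doubles three times:
print(users_who_try(1000, 7))   # 7.8125

# A "viral" app with 3 steps keeps a meaningful fraction:
print(users_who_try(1000, 3))   # 125.0
```

The exponential shape is the whole point: trimming even one step doubles uptake, which is why onboarding flows get obsessively shortened.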
Decentralized, federated, and generally more open systems have been consistently unable to deliver anything close to the ease of use of vertically integrated systems.