Why Jony Ive left Apple to the ‘accountants’ (www.nytimes.com)
357 points by mcone | 2022-05-01 08:01:55 | 609 comments




...an infomercial for the author's new book? Weak sauce, NYT.

I completely cut nytimes out of my life. I don't think I'm better informed, probably worse, just happier. Definitely happier.

Ignorance is bliss, then?

Since when do newspapers equal knowledge?

I quit viewing the NYT as a reliable news source after they sold all those lies about Iraqi WMDs to the American public on behalf of the Bush-Cheney-Rumsfeld neocon crowd. It's not hard to stay informed without relying on such corporate media outlets with obvious ulterior motives. For example, if I want to read about Apple's M1 iPad (I do like iPads, though mostly use Linux now), I just go to DuckDuckGo and enter

Apple M1 iPad "engineering" design

Flip through the first 50 results and you'll get a better view of what's up than the NYT provides, with its drivel about 'the soul of Apple' and similar fluff. You can try it on Google as well, but then half the page is just ads.

[edit: one thing I discovered by doing that is that the M1 iPad is going to be very difficult to repair, which is a huge negative for me]


So who did you choose as a reliable news source going forward, then?

I think you need to be mindful that NYT got duped just like Powell did. It doesn't magically mean they are all bad journalists. It doesn't. You can fantasize that it does if you like, but it doesn't.

NYT is not an in-depth tech reporting organ, so I agree with you on that. But if you want detailed global-scale investigations of complex new issues? Good luck, in the US at least, finding a substantially better source than NYT. You won't. It's not there.


"To your request of my opinion of the manner in which a newspaper should be conducted so as to be most useful, I should answer ‘by restraining it to true facts & sound principles only.’ yet I fear such a paper would find few subscribers. it is a melancholy truth that a suppression of the press could not more compleatly deprive the nation of it’s benefits, than is done by it’s abandoned prostitution to falsehood. nothing can now be believed which is seen in a newspaper. truth itself becomes suspicious by being put into that polluted vehicle. the real extent of this state of misinformation is known only to those who are in situations to confront facts within their knolege with the lies of the day.

I really look with commiseration over the great body of my fellow citizens, who, reading newspapers, live & die in the belief that they have known something of what has been passing in the world in their time: whereas the accounts they have read in newspapers are just as true a history of any other period of the world as of the present, except that the real names of the day are affixed to their fables.

general facts may indeed be collected from them, such as that Europe is now at war, that Bonaparte has been a successful warrior, that he has subjected a great portion of Europe to his will, but no details can be relied on. I will add that the man who never looks into a newspaper is better informed than he who reads them; inasmuch as he who knows nothing is nearer to truth than he whose mind is filled with falsehoods & errors. he who reads nothing will still learn the great facts, and the details are all false."

- Thomas Jefferson, 1807 [1]

That's one of my favorite quotes, and it becomes more relevant by the day. One reason, among many, is that it was written more than 200 years ago. It suggests that the window of media many of us grew up within was a weird, brief bubble of competence, integrity, and accountability that, for seemingly the rest of humanity's existence, was not the case. We're now simply returning to the 'good ole days.'

[1] - https://founders.archives.gov/documents/Jefferson/99-01-02-5...


Couldn't follow the link.

It is truncated for some reason; the document is:

From Thomas Jefferson to John Norvell, 11 June 1807

This one works, even though it appears truncated in the HN view:

https://founders.archives.gov/documents/Jefferson/99-01-02-5...


It's so good! I love it!

"defamation is becoming a necessary of life: insomuch that a dish of tea, in the morning or evening, cannot be digested without this stimulant [the newspaper]. even those who do not believe these abominations, still read them with complacence to their auditors, and, instead of the abhorrence & indignation which should fill a virtuous mind, betray a secret pleasure in the possibility that some may believe them, tho they do not themselves. it seems to escape them that it is not he who prints, but he who pays for printing a slander, who is it’s real author."


I see a disclosure right at the very top. And an article adapted from a book, that most of us won't get around to reading, seems like a perfectly reasonable topic.

ADDED: I'm actually more inclined to read an article written by someone who did the research/put the effort into writing an entire book on a subject.


Agreed. Both book authors and journalists face comparable financial incentives to distort facts for the sake of storytelling (and in this case the author is also a NYT journalist).

I understand the thought, but a weird reality about book writing is that, since the author bears the cost for fact-checking books, they're often less extensively fact-checked than longer reported pieces for newspapers or magazines.

> It was 2014, and Apple’s future, more than ever, seemed to hinge on Mr. Ive.

What?

I was there then and long before, and that's just nonsense, fabricated nonsense by whoever Tripp Mickle is. What a sensationalist assertion.

The engineering, hard-core engineering, was a not entirely hidden powerhouse, and still is.


That seems like a very engineering-centric view of the world. The engineering (both software and hardware) of Apple's products has certainly been important. But, especially as they were transitioning to increasingly be a consumer electronics brand, design was certainly a key piece that set them apart.

Design still is a key piece that sets them apart.

Agreed, and, to both your points, it's interesting that the teams doing the engineering to make the physical designs manufacturable never get trumpeted in the press. It's a super interesting world where mechanical engineering and aesthetic design crossbreed.

> whoever Tripp Mickle is

NYTimes journalist, before that, WSJ.

https://www.linkedin.com/in/trippmickle/


Are you sure? Engineers have no control over the isolated designers.

I read the whole thing and was really not convinced. Ive had some brilliant hits, but I think he became more of a liability in the end. Look at what happened to the MacBook Pro: losing most of its ports, and the thinness forcing a much worse keyboard into it that caused massive problems. Sacrificing a bit of thinness and going back on those changes with the newest iteration has been much better.

Honestly, to me the M1 era of Apple is more exciting than anything in years. The article linked is really negative (saying Apple only has “legacy products”) but with the M1 series they seem to be smashing it out of the park…


I don’t think that was due to Ive but more so that he couldn’t excel without Steve around.

It's also clearly stated in the article that this was one of the reasons that Ive was given the CDO role. So that he could do less of what Steve did for him.


This. It seems clear Jobs served a “you made it very pretty but it sucks to use” role in feedback.

There are many counter-examples, from the original iMac’s mouse to the iPod Hi-Fi. Jobs said no a lot, and that was a good thing, but he did not have absolute good taste. To his credit, he was good at learning from mistakes, even though he very rarely acknowledged them in public.

Don’t have to get every single one right to still play the role. The man was effective, not infallible.

But then it works both ways: you cannot cite a couple of failures under Cook to say that Jobs was irreplaceable.

You seem to be debating someone else; I haven't said that?

I think Jobs and Ive were a pair that complemented each other. I think when Jobs died, Ive lost that moderating influence, and "thin at the cost of good" and "we got rid of buttons" were the result for a while.


On the other hand, Ive without Jobs got you iOS 7 with all the fine Corinthian leather removed.

> Honestly, to me the M1 era of Apple is more exciting than anything in years.

Yes, I feel like this is somehow still massively underappreciated. They pulled off a major hardware transition without big hiccups (I'm sure someone's going to point out some I have missed in the comments /s) and launched a bunch of devices that are a remarkable leap forward. I mean, the baseline M1 Air starts at $1k and is an incredible piece of hardware for most usage.


The M1 iPad Air is $499; couple that with the Apple Pencil and the writing experience is near frictionless.

The M1 era is going to redefine the industry in subtle and not so subtle ways.


Intel’s failing will redefine the industry in many ways. ARM and AMD and other players are taking chunks out of them at the cutting edge. They seem to be redefining themselves as a “couture” fabricator rather than taking leadership on the design end of things … playing off their scale rather than their velocity. It’s a big change and probably has as much to do with why Apple ditched them. Remember, Apple did this before when they switched to Intel from IBM/Motorola, when they too had stagnated.

Stagnated? Weren't PowerPC chips pretty advanced compared to Intel whose chips were carrying a lot of baggage at the time? Given that PowerPC chips were based on RISC, I'd guess they're a lot closer to the M1 than modern Intel chips are.

My understanding is that IBM/Motorola's struggle with achieving volume is what doomed them not a lack of innovation.

This is all way outta my area of understanding though.


In general, IBM was just going in a different direction with Power than Apple needed them to be going in. IBM was and is focused on the highest-end, highest-priced segment of the server market.

If I recall, the scuttlebutt was that Motorola had promised Apple (meaning Steve Jobs) that faster clock speeds were just around the corner for a while, and when that repeatedly failed to materialize Apple (meaning Steve Jobs) got pissed and activated the Intel backup plan.

By the time Apple dropped PowerPC and went to Intel, Motorola was already out of the picture and IBM was making the G5.

The G5 was the desktop chip; it was Motorola's task to scale it down for laptops, which they were unable to do.

That's a good point, thanks.

Sort of. The G4 chip was still used in laptops until the Intel transition, and was produced by Motorola until they spun off their semiconductor division into Freescale, which continued producing the G4 until the end.

Well, it wasn't really clock speeds, it was performance per watt. The G5 was able to go into a (watercooled) PowerMac, but IBM couldn't get it to run cool and efficient enough to go into Apple laptops (or the Mac mini, IIRC). By 2005 Intel was, at the very least, probably prototyping multicore chips (I don't remember if IBM's processor offering to Apple was multicore at the time) that blew IBM (and previous Intel offerings) out of the water performance- and efficiency-wise, and Apple announced the transition.

The last generation of PowerMacs had dual-core G5s. They still ran very hot and it did not change much in the end.

Last generation had quad core with water cooling I think. They were really trying to get everything out of them.

It was two dual-core 2.5GHz CPUs. I have one sitting under my desk right now. ;)

Pretty advanced for 2005. Four 64-bit CPU cores, 16GB of DDR2 RAM, and liquid cooling. It's still usable today, 17 years later, and could handle 90% of what I do on a computer. It draws nearly 1000 watts under full load, though...


1000 watts, damn.

So what can’t it do? I’m guessing modern websites struggle on it.


Yes, websites with heavy JavaScript really slow it down. The newest browser available for it is based on a very old version of Firefox.

https://github.com/classilla/tenfourfox/

YouTube is extremely slow, and it can only play back 360p or lower video smoothly. There is no hardware h.264 acceleration. “New” Reddit is also very slow, but old Reddit loads fine. Hacker News is very fast; it loads as quickly as on a modern computer.


It was driven by power and heat for the portable market. Intel x86 laptops were the best at the time, and PPC couldn't compete, especially in thin-and-light designs.

> My understanding is that IBM/Motorola's struggle with achieving volume is what doomed them not a lack of innovation.

Before Apple announced their Intel transition, laptops were more than half of their Mac sales. Of their desktop sales, the iMac dominated over PowerMacs. So a majority of the systems they were selling had relatively tight thermal envelopes.

Neither IBM nor Motorola was willing (or able) to get computing power equivalent to x86 into those thermal envelopes. The G5 was a derivative of IBM's POWER chips they put in servers and workstations. They were largely unconcerned with thermals. Motorola saw the embedded space as more profitable and didn't want to invest in the G4 to make it more competitive.

Meanwhile Intel had introduced the Pentium III-derived Core series chips (by way of the Pentium M). Good thermals, high performance, and multiple cores. It offered better performance than Apple's G5 in the thermal envelope of the G4.

Neither IBM nor Motorola had general issues with production volume. Apple's switch was all about the future direction of the architecture. There was no market for desktop PowerPC chips besides Apple, and neither IBM nor Motorola really wanted to compete directly with Intel; they saw their fortunes in other segments.

So Apple went with Intel because they were making chips compatible with what Apple wanted to do with the Mac. The first Intel Macs ran circles around the PowerPC machines they replaced with no major sacrifices needed in thermals or battery life.

So Intel innovation in the 00s got Apple to switch to them and a lack thereof got them to switch away again.


> Weren't PowerPC chips pretty advanced compared to Intel

PowerPC had better floating point performance which was important for graphics and publishing workflows. Photoshop performance comparisons seemed to happen at every year's MacWorld during that period.

Unfortunately, IBM used Power as a workstation chip, and making a version of the chips for laptops was not on their radar. Of course, at the time, Pentium 4 chips weren't known for running cool either. The more popular laptops got, the more this was a problem.

After Intel transitioned to the Core architecture, Apple transitioned to Intel so they could make laptops with a much better performance per watt than PowerPC offered.


People weren't buying laptops for everything during the PowerPC transition. They were buying desktops. No one doing "serious" work bought laptops in 1994. Not for coding, not for photo manipulation, or even gaming.

It wasn't until the 20-teens (2013 - 2015) that Macs for coding caught on. Apple transitioning to PowerPC made perfect sense for graphics workstations.


>People weren't buying laptops for everything during the PowerPC transition.

That was the period where it became obvious that laptops would overtake desktops and become the most popular form factor for computers.

Neither PowerPC nor the Pentium 4 was a good fit for laptops, but once Intel transitioned from NetBurst to Core it was a new ball game.

Apple even transitioned back to 32 bit for it, since Core didn't offer 64 bit support until Core 2 shipped.


Most students in my engineering programs had Macs (2010-2015).

Right, but I didn't start seeing Mac laptops show up at work (software companies in the Midwest and Eastern U.S.) until 2014 when the MBP became a serious contender to Dells and HPs.

I had a Toshiba laptop in '95 for work, but it was a spare for when I was traveling. That pattern continued with Windows laptops as supplements for desktops in the office for the next 20 years. In 2015 my company went all MacBook Pro for everything, but they were trailing not trailblazing.

Students are always going to be the leading edge, because they have to be mobile; they get used to it, then they bring it into the workplace when they arrive. It's one of the benefits of hiring in new people.


The PPC processors may have been decent, but they were hamstrung by having to run an emulated 68K operating system, and Apple cheaped out with slower buses.

Perhaps in terms of sheer raw compute, but across every scale from data centre to IoT and wearables.

PPC processors in wearables and IoT devices?

This take totally misses the mark on the realities of the situation.

Intel made a bad bet on tech and wasn't able to shrink the node. TSMC made the right choice and was therefore able to ship better tech in the short term. Intel's designs are not directly related to that shortcoming.

TSMC makes way more chips than Intel. TSMC is therefore able to buy the fabs, equipment, engineers, etc. at a lower price per chip (since there are more chips). TSMC is therefore able to spend more on their fabs and invest more in research. Intel can't keep up on the manufacturing side even if they can on the design side. The only way to justify the research and fab costs is to amortize them across more chips, which means they need to manufacture for more than just Intel. It's basically the AWS model: you can be your own best customer, but you can drive prices down for yourself with extra customers. Amazon didn't abandon the revolutionary 2-day shipping when they became a data center provider. Assuming Intel still has good designers left, they won't abandon their own chips.


Multiple reports over the years indicate that it's not just a single bad bet that got Intel into this, but a corporate culture where engineers have less and less influence and MBAs more and more, leading to worse and worse tech. Similar to Boeing.

Yet Intel is still competitive with AMD, and now has an engineer in charge. Intel's problems feel pretty overstated: their products are still good, and they're launching new ones that will bring new competition to the GPU space.

I’m personally quite excited for what intel has planned and I work on M1 chips (opinions are my own). I think there’s a decent chance they’ll have a comeback in the next few years. Yes, AMD is doing awesome and ARM is bigger than ever. We’re definitely headed into a very interesting time for processors as consumers.

> The only way to justify the research and fab costs is to amortize them across more chips

Or make more money per chip, which is what Intel does: for an AMD chip that TSMC manufactures, TSMC takes a cut for manufacturing and AMD takes the rest, while Intel collects both portions.


In return it also has two sets of R&D to support and two sets of risks - architecture and manufacturing. If it falls behind on either of these it starts to lose.

TSMC, for example, can focus solely on manufacturing, assured that it will fill its fabs if it keeps pace.

Maybe Intel made super profits when x86 was the only game in town, but that's not the case any more.


>If it falls behind on either of these it starts to lose.

If TSMC falls behind on manufacturing they lose, and they don't have another business. Is that an advantage, as you seem to put it? If they make a wrong choice like Intel did at 10nm, they immediately become a non-entity with no 'other' business. Having two money-making businesses puts Intel at a big advantage in terms of financing and owning their own platforms.

If TSMC falls behind a node, all of their orders will disappear to whoever has a more advanced node. They don't have another business. Instead of two risks, they have one risk that's identical to Intel's, and their entire business depends on it. That's a lot less antifragile.

Intel has two sets of risks, and in exchange, on many fewer chips, they made basically the same amount of money last year, when they were behind on CPUs by almost every metric. That's resilient. People talking about the fall of Intel are talking about something Intel is actively maneuvering ahead of. TSMC carries no chip-design risk, but gets much lower per-chip profits in exchange.


Vertical integration is great if it generates synergies. It’s really bad if the tie into the internal customer hinders the development of each part of the business.

Intel is not remotely robust, as it's almost completely dependent on x86 and needs to catch up with TSMC. It lost smartphones in part because of x86. Now it's fallen behind AMD because of manufacturing weaknesses. Hence a P/E ratio of 9 vs. ~20 for TSMC.

TSMC on the other hand has a huge variety of customers at a wide variety of nodes.


Intel never had smartphones to lose. Intel can get all the same advantages as AMD by simply buying chips from TSMC if they want (and they already do for some chipsets), so there is no operational disadvantage. Intel is already mostly caught up to AMD and will be making significantly more per chip than AMD very soon.

TSMC is competing for something Intel doesn't want to sell. Even when Intel was in the lead, it wasn't fabbing its newest process for third parties. You're declaring TSMC the victor in a game Intel never played. And if, in a handful of years, Intel regains the process advantage, you will likely still declare them the loser for not playing the be-a-cutting-edge-fab-for-other-companies game they don't want to play.

TSMC is not playing the same game Intel is. In 2021, when by all accounts Intel was behind both TSMC and AMD, they still managed to make a profit similar to TSMC's and laugh at AMD's inability to buy enough chips to mount anything close to competition for either Intel or Nvidia.

Now they're also getting into graphics cards and have largely caught up with AMD designs. Their future is bright.


The Intel graphics offerings have consistently underperformed the rest of the industry, ending up with low-end to midrange GPUs: https://wccftech.com/intel-arc-alchemist-a770-gaming-desktop...

Intel has consistently tried to build a top tier GPU and failed year after year. Expecting them to suddenly break away from their history is extremely optimistic.


They're just starting to get into the business. It will improve. They don't have to have the best cards, they just have to compete in some segments from the start. It's all upwards from here.

That's the thing about experience - you keep accumulating it.


“This take totally misses the mark on the realities of the situation.”

No, this comment misses the mark.

Intel’s largest issues are not economic or technological.

It’s the bloated bureaucracy that squanders the best and brightest money can buy.


>Intel’s failing will redefine the industry in many ways. ARM and AMD and other players are taking chunks out of them at the cutting edge

Failing? Have you looked at Intel's 12th Gen CPUs? This trope was valid until the 10th Gen 14nm++++ era in 2019, but you might have slept through the last couple of years. Intel has improved massively since then, starting with 11th Gen and Xe graphics.

Intel's 12th gen big-little tech really shook up the market, and even AMD is now feeling the pressure.


Not to mention, they intend to compete with Apple on transistor density before 2024. Time will tell how successful they are, but I do get a laugh out of the people who are counting Intel out of the game right now. Apple doesn't sell CPUs, they sell Macs. They aren't even competing in the same market segment.

11th gen Intel chips were still 14nm, the top chip had fewer threads than the 10th gen because of thermals, and Intel Xe was, IIRC, only offered with the 11900/11900K (i.e. the top of the stack). Intel has had a stranglehold on integrated transcoders for a while, but AMD's integrated Vega cores (soon to be RDNA2) still wipe the floor with current integrated Xe offerings gaming-wise…

>11th gen Intel chips were still 14nm

Nope, 11th Gen was 10nm. You might be confusing it with 10th Gen, which was a mix of 14 and 10nm.


Maybe the mobile products were? Desktop was 14nm:

https://ark.intel.com/content/www/us/en/ark/products/212047/...


Intel adding big.LITTLE ten years after it appeared in Arm is an interesting development.

One could argue it took ten years for Intel to have enough competition from ARM to actually wake up and do something again.

I don't care; I got a 12th gen i7 with integrated graphics (in the weird time window and edge case where Intel was ahead of AMD again for a bit), which is super fast and way better priced than Intel used to be.

Competition is good for consumers.


Agreed - I think it’s indicative of a less insular attitude which can only be positive.

Well, desktop and laptop PCs didn't have the extreme power constraints that mobile devices had.

So why are Intel using it now?

The goal posts moved since then.

Which goal posts? Competition from Apple?

No, consumer demand. It takes years to design, test, and prepare a new CPU architecture for manufacturing, so Intel had their big-little design in the pipeline long before Apple came out with the M1, just as it took Apple over 10 years of iterations to get to the M1.

The real question is what is AMD gonna respond with?


It’s a strange argument that customers didn’t want better battery life from their laptops until now.

All credit to them now but lagging 10 years behind Arm in having this is not impressive.


I think you're mixing some things up. You couldn't just buy an ARM chip and plug it into a desktop PC or laptop, and besides, ARM chips, big-little or not, were terribly underpowered 10 years ago compared even to an Intel Celeron.

So calling it a 10-year lag because of a feature that had no relevance in the PC space back then is a misrepresentation.

Big-little made it to the PC market now because modern CPU cores are powerful enough that even low-performance ones can run a desktop environment and browsers without stutters. That was not the case 10 years ago, so consumer demand was optimized around maximizing raw performance.

So, the fact that ARM had this feature 10 years ago is largely irrelevant to this argument.

It's ARM's performance improvement at the top end in the last 10 years that changed the landscape for the PC industry to a degree, not big-little.


> Apple did this before when they switched to Intel from IBM/Motorola,

Apple + (Intel) "Core" (geddit)?


> M1 iPad Air is $499

$599, actually.


That's a lot of money to pay for a device that locks you into the App Store.

A lot of money for whom?

To be fair, the iPhone does as well and is way more expensive than that

No it's not. Even the multi-core era failed to do that with laptops. If you're going to redefine the industry you need a radical new idea, not just 'the same, but slightly faster, slimmer, and lighter.'

Whatever will redefine the industry will probably be laughed at and only adopted by nerds for a while. Like OS X back in the day. The only people interested in the first couple of versions were Unix nerds. Everybody else found the software they needed was still on OS 9.


Yeah I was one of them.

Unfortunately the Unix part seems to be very underappreciated by Apple recently so I've already moved on again. I was an early adopter of macOS and have 'converted' many more mainstream users that still use it. But for me it's become too locked down.


Which OS/distro did you move onto?

FreeBSD with KDE

Nixpkgs on Fedora w/ i3wm

Windows + WSL for open source stuff and PopOS for work.

Isn't WSL2 still IPv4-only? That's why I uninstalled it years ago.

Yeah, but you can use a custom kernel and WireGuard to get around that. https://withinboredom.info/blog/2022/04/02/finally-getting-i...

Amen to that. At work, one of my responsibilities is maintaining bootstrapping scripts for macOS so we can reliably develop on the platform. Getting things to "just work" the way they do on our deploy servers is an actual nightmare, especially once you toss Apple Silicon into the mix. Not only are we running different kernels, but different architectures; it's simply impossible to guarantee that something that runs locally will work fine in production, or vice versa. I definitely do my development on Linux where possible.

What is it too locked down to accomplish? There are many knobs to unlock it.

Changing sshd_config to only accept key authentication, for example. Since the recent locking down of significant parts of the OS, this keeps getting reverted to the default.

But there are many more issues; I've gone into them before (I used to be a Mac admin) but I don't want to bring it all up again.


You can sort of sidestep the issue by supplying your own launchd plist for OpenSSH and disabling Apple's, but it's a thorn in the side anyway: the fact that you even need to bother sidestepping it in the first place, while other systems go to great lengths to respect your configuration changes.
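
For the curious, the workaround looks roughly like this (a sketch; the custom plist name is a placeholder, and com.openssh.sshd is the bundled daemon's label on recent macOS, so verify yours before relying on it):

    # disable Apple's bundled sshd job so it stops claiming port 22
    sudo launchctl disable system/com.openssh.sshd
    # install your own plist that launches sshd with your config
    sudo cp org.example.sshd.plist /Library/LaunchDaemons/
    sudo launchctl bootstrap system /Library/LaunchDaemons/org.example.sshd.plist

Third-party plists in /Library/LaunchDaemons survive OS updates far better than anything under /System.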

System-level environment variables would be nice. It's a pain to use a YubiKey (arguably more secure than a plaintext key in ~/.ssh) for SSH key storage. I remember having to start certain UI programs from a shell just so they would have SSH abilities.

There has been a very nice trend for a number of projects to support a <config_file>.d directory to which local modifications can be added.

Current macOS (and Debian >= 11) has a non-standard sshd_config modification that does "Include /etc/ssh/sshd_config.d/*", although its placement early in the config file means some things cannot be overridden.

Current "sudo" on macOS also supports this via "#includedir /private/etc/sudoers.d". (the # has been swapped for @ in upstream sudo).

This neatly sidesteps the need to diff / re-apply changes on a SW update.
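
Concretely, the sshd case from upthread could look something like this (just a sketch; the drop-in filename is arbitrary, and as noted above, whether a directive sticks depends on where the Include sits):

    # create a drop-in that turns off password logins
    sudo tee /etc/ssh/sshd_config.d/100-local.conf >/dev/null <<'EOF'
    PasswordAuthentication no
    KbdInteractiveAuthentication no
    EOF
    # sanity-check the combined configuration before restarting sshd
    sudo sshd -t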


This is not always an option; like you say, it depends on the cooperation of the base config files (having the include, and in the right place) and the tools used.

It won't work for all cases either. I just want the ability to make modifications and sign or bless them somehow with a system admin key. Root is not enough, for understandable reasons. What is possible is to modify offline (through recovery) and then 'bless' my changes. But this reverts after every reboot.

There should be a toolchain where I can make legitimate modifications in a secure manner to system files. Like every other OS has. There should be some kind of user key to sign modifications with. Apple has just ignored this whole toolchain and replaced it with a "just trust us" blanket.


There's also a lot of commercial-service overhead, like Apple Music starting on boot (or even being installed at all) and asking if you'd like to subscribe or whatever.

15 years ago, Apple had people on stage bragging about how there was only one version of OS X while Microsoft had countless versions of Windows (Vista Home, Vista Pro, Server, etc.). I wonder if there should be a standard macOS and a macOS Pro that would be a relatively stripped-down Unix environment without all of the bloat that's been added to macOS recently…


Absolutely not. There should just be easy switches.

Out of curiosity, what do you mean by “too locked down”? What could you do on, say OS X 10.8 that you cannot do now?

I am still running and compiling the same open source software as I did 10 years ago and more besides. There have been a couple of rough transitions with the new security things, SIP, and whatnot. I disabled it for a couple of releases but now that’s not really a problem.


You have an increased number of hoops to jump through if you want your computer to be programmable.

At first, it was Gatekeeper. Yeah, appeared in 10.8. Then notarization. Now, on M1 you need to sign your binaries ad-hoc or they won't run. Custom kernel extensions? No way.

It's like slowly boiling the water in which you sit. Little things, but one by one they accumulate into quite a lump of red tape, and Apple seems to drive home the point that a developer is a different caste than the user, and there's some sacred knowledge that you should be in possession of, and update it every year for $99, so that you can program your computer. All the while the user is supposed to be clueless and in awe of this technological marvel the caste of engineers is bestowing unto them.

Oh, and Apple wants to be a bottleneck in each and every process connected to programming their computers. They also remind you that the machines you pay mad bucks for, you don't really own.

I like the pure Unix approach more, when the line between using the computer and programming it doesn't really exist, and where the system is basically your IDE too, and where you're going from being a user to being a programmer and back without really noticing. Mind you, it doesn't mean you have to go from one to the other, but when you want/need to, it's damn frictionless, and the system is as malleable as you want it to be.


Everything you list makes the Mac more secure and more stable.

For instance, with system integrity protection, a bad browser installer can’t wreck your entire computer.

https://arstechnica.com/information-technology/2019/09/no-it...


I like to be in control precisely over how hardened I'd like my system to be.

If I wreck it, I know how to reinstall it and restore my backups, thank you very much.


And you can do that. Just turn it off.

I honestly think the 'lock down' is so overblown.

Yes, there are 'more hoops', but you go through each hoop once. Seriously, if you're running a dev machine, turning off 3, maybe 4, things once and never touching them again is hardly the biggest hurdle.


It's near-impossible to brick an M1 machine. You can always reinstall from a second machine using Apple Configurator.

This is actually an advantage over PCs, which you can brick if, e.g., you erase the BIOS and the backup BIOS.


The problem is, not everyone has a second machine. Not being able to install from USB or internet is really annoying from a support point of view.


Try looking at it as a solution, not a problem. In dire straits you can actually recover the machine, vs having no other option like GP noted.

Normally, and for normal users, the recovery mode is just a startup key combo away to re-image the machine.


You know, all it takes to recover a hosed system on x86 is a flash drive. Because the bootloader on those machines doesn’t have to be a specially made macOS partition with a slimmed-down macOS on it (hah! and those people call GRUB2 bloated!) that must live on the internal storage.

Moreover, on x86, even if the internal storage is hosed completely, I can boot the machine off USB/Thunderbolt and have it live for years more. Try that with a dead SSD in your new MacBook. Talk about the e-waste problem and how „heroically” Apple is fighting it, too.


You can't boot it if you restored the BIOS wrong - it's usually on a writable NVRAM. The M1's initial bootloader is on immutable storage.

I muck with the OS and above all the time, and I can't remember ever needing to restore my BIOS. No. Never. Really.

But M1 Macs need the internal storage to work and be intact to boot even from external media. If the internal SSD on my Intel Mac or Dell XPS tablet (the soldered one, yep) dies, I boot from USB 3.1 and keep on keeping on. The M1 Mac is a brick after that, except for the new Mac Studio, where the SSDs are somewhat replaceable.


Understood, but I can't wrap my head around why they removed the internet recovery option. Until very recently I managed a large fleet of Macs, and it's already happened twice that a user managed to break their system so badly the built-in recovery wouldn't work. Neither had another system to hand to do the DFU thing either. Internet recovery as it existed on the Intel Macs would have saved them a trip to the office.

Different people have different annoyance levels with security restrictions. Personally, I'm all right if Apple's security model makes things I rarely do -- e.g., install privileged system extensions like Rogue Amoeba's Audio Capture Engine -- difficult but still possible. I understand why other people might make different choices.

Having said that, I do roll my eyes whenever I come across the phrase "walled garden" when applied to the Mac in particular, especially when people stridently insist that the Mac is just a year or two away from being locked down like iOS. (I've been hearing that prediction for over a decade, and find it less likely than ever in the era of Apple Silicon Macs.)


They have been warning about Apple requiring all Mac apps to come from the app store since 2011.

If Apple wants to ban me, specifically, from running software on my M1 computer now, they can do so. If China or the US government says so, Apple will probably comply. You are completely dependent on a network connection to Apple to be able to run an M1 now.

If I want to make an app for my iPhone that I don't want to publish, I have to reinstall it every week, and can only install apps with a network connection to Apple, as Apple gives my phone another one-week permission slip to run code that I have written.

There are no more offline updates, no more offline app installs.

Also, Apple cares about privacy, except for privacy from Apple. They transmit a shit ton of info from their devices to the mothership and effectively know when and where you have been running apps on their computers, constantly. They also do so unencrypted in some cases, so anyone spying on the network can know too.

You are not the owner of an Apple computer anymore; Apple is.

Ultimately in the end, if they really cared about giving their users ultimate ownership of their devices, they would. It would show up in the form of corporate MDM servers which make the ultimate certificate authority the corporate MDM server owner, and in personal cases you could launch and run your own or use Apple's.

Apple hasn't. They are game console computers and macOS is effectively legacy at this point compared to iOS.


How would Apple ban you from running apps on a Mac?

If the Mac were legacy, why would they spend so much effort bringing Macs to their own processors, specifically designed for them?


You should thank those people. They made enough noise to prevent what was and is surely apple's long term plan.

Yes, that long-term plan hasn't happened in the decade since people started predicting it with the introduction of the Mac App Store in 2011.

Any day now…


Still, it kinda is. You really have to go out of your way now to get full access to modify system files, and even then you're not able to do just anything you want. Think of installing another OS on the SSD of Intel Macs with the T2 chip, or choosing which iOS apps you want to run on an M1.

So you have to be purposeful and know what you’re doing to potentially corrupt your Mac…the horror.

Have you ever considered that making it hard for you to corrupt your Mac also makes it hard for malware?

Apple fully supports installing Windows on x86 Macs and there are plenty of guides on installing Linux on x86 and Mx Macs.


That logic is a bit circular, though, and not very convincing. Apple is known for being opinionated and stubborn about their long-term goals. If they really wanted to lock down MacOS, they’d just have done it, developers be damned.

Or, that's just part of their sales pitch. You know, like how politicians don't like to be seen as wishy-washy, it's very likely Apple responds to public opinion just as much as anyone else.

Apple has made no further moves to "lock down" iOS to force people to use the App Store since 2008. If Apple listened to public opinion, the iOS App Store wouldn't be the shit show it is today.

A lot of people look wistfully back on the good old days of futzing around with drama in their PC.

Time moves on. If you want computing to be an adventure, that’s what Linux is for.


I hazard to ask: have you tried Linux recently?

Apple wants to produce consumer appliances, not entirely but mostly locked down in the name of security and a smooth customer experience, and there seems to be a large market for these.


That was exactly my point above. They've moved on to a new market and the Unix power user market I'm part of is no longer in focus with Apple.

As somebody who writes code most days, and is constantly downloading and compiling others' code, your comment doesn't sound like we are on the same platform, even though I also use macOS. I wonder why our experiences are so different.

For example, I don't have a $99 developer certificate, and am not sure why I would want or need one.


Try and distribute binaries (not source) that other people can run (not compile) and you'll quickly find out why.

There's no overhead to ad-hoc signing. The linker does it by default and it's "ad-hoc" - it's literally a checksum, not a secure signature.

You might complain about MAP_JIT but that's pretty important for security.
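
To see what that means in practice (mybinary and otherbinary are placeholders; these are standard codesign invocations):

    # the linker already ad-hoc signed this at build time on arm64;
    # -dv prints "Signature=adhoc" for ad-hoc signed binaries
    codesign -dv ./mybinary
    # manually ad-hoc sign an unsigned binary; "-" means no signing identity
    codesign -s - ./otherbinary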


The developer/user caste split is not unique to Apple. That's just how every computer has been used ever since we stopped booting (micro)computers straight into ROM BASIC. The moment you have people using software they did not develop, you create the opportunity for malware; and once you have malware you have OS vendors trying to lock it away from you.

FWIW the biggest hoop an ordinary user would ever have to jump through on macOS is right-clicking an unsigned app bundle on first use to authorize it; which I've had to do on plenty of FOSS apps. This is not a problem, IMO - it is reasonable for the system to say "hey wait a second this app is new or different" and give you the opportunity to check twice before you jump. Code signing and notarization are things you only ever need to worry about if you want to distribute apps and not make your users right-click. Windows has similar problems with SmartScreen, except you can't even buy a code signing certificate to reliably make the problem go away, and Microsoft does an arguably better job at hiding the "no seriously this is just a FOSS app that hasn't jumped through all your hoops yet" option than macOS does.
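
For what it's worth, the right-click dance amounts to clearing the quarantine flag that browsers set on downloads, and the same can be done from a shell (Some.app stands in for whatever you downloaded):

    xattr -p com.apple.quarantine Some.app    # show the flag, if present
    xattr -dr com.apple.quarantine Some.app   # remove it recursively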


The problem with how Apple treats FOSS apps is that it uses scary messages like „this could be dangerous malware!” every time, all the time, if you want to distribute your software without paying Apple for what amounts to a protection racket.

Which leads to two outcomes: either you learn to right-click everything, and one day it's going to be malware too, or you „spctl --master-disable” and make the system turn a blind eye, and a drive-by zero-click kind of malware finds its way in.

Or, if you are not really familiar with the intricacies of software distribution in Apple's ecosystem, there's a third outcome: you believe Apple, and for you, all FOSS is dangerous malware.

Mind you, UAC nagging on Windows has the same problem, and sudo conditions you into just giving your computer password away left and right. I don’t know if a good solution exists, except maybe that you have to learn to not let your guard down in certain situations.


> You have an increased number of hoops to jump through if you want your computer to be programmable.

> At first, it was Gatekeeper. Yeah, appeared in 10.8. Then notarization. Now, on M1 you need to sign your binaries ad-hoc or they won't run.

You are confusing local software development with global software distribution.

Anyone is free to install development tools, compilers, and toolchains, and to compile and run anything locally on OS X, just like on any other UNIX system, including Linux. Gatekeeper and the notarisation enforcement, which are defaults that can be neutered, address a common use case: «my mom/little sister/grandfather/etc. has downloaded and installed a seemingly useful app from a dodgy website that has excreted ransomware that now demands a payment to give their data back, or leaked all of their contacts/personal data, or slowed their computing device 100-fold». OS X does give lay users a way to turn checks off, including the system integrity protection, and shoot themselves in the foot if that is what they want.

In fact, I would like to be able to selectively apply an even more restrictive security profile to any app, including the ones available through Apple's own app store. Who can tell what potential spyware/trackware a 500+ MB WhatsApp download from the official Apple app store contains? The WhatsApp desktop client is a mediocre (feature-wise) messenger that is not even as feature-rich as the 85 MB Telegram desktop download. And I, for one, don't have the capacity to reverse engineer and manually inspect every single WhatsApp download/update for my own peace of mind and sanity, and I would presume that very few people even from around here can. Anything coming out of Facebook nowadays should be considered trackware and needs to be airgapped with the utmost diligence.

Even business/enterprise software is riddled with all sorts of trackers. The Citrix Workspace client, for example, installs three "telemetry" helper processes that phone home non-stop.

> Custom kernel extensions? No way.

Yes way. They just have to be signed, for the same reasons as third-party apps. See https://support.apple.com/en-au/guide/security/sec8e454101b/... for details. I was recently scrubbing some legacy components off the file system and noticed, with a lot of bewilderment, what extra 'useful' USB drivers a 4G MiFi device I purchased over a decade ago had installed at the first connection attempt. I, for one, don't need those 'niceties' sneakily creeping into any of my file systems or lurking around.

In fact, OS X has become more GNU Hurd-like since 10.15 by pushing kernel extensions into user space instead, which reduces the chance of the kernel crashing due to a bug in a random extension. More kernel stability and security? I say yes to both, although your mileage may vary.


The thing is, you cannot harden it even more. If Apple Almighty hasn’t thought about your use case, it may as well not exist. There are two modes: either Apple nannying your systems like the overly attached yiddishe mame, or a mad ex breaking up with you, but also ripping out the locks from your doors before they go.

Which is my freaking point exactly: you are being constantly reminded that despite having paid mad bucks for the hardware, Apple still owns it, snooping over your shoulder, giving you slaps on the hand if you dare „misuse” what they gave you.


> The thing is, you cannot harden it even more.

Yes, you can. You seem to have never used Trusted Solaris, which implemented the B1 TCSEC level and ran on certified hardware where all system objects are labelled and MAC is enforced at the kernel level. That is an example of a much more locked-down system, which you are conjuring OS X up to be, but OS X is nowhere near that locked down or controlled.

The real trouble with system hardening, however, arises at the network level, where the traffic is fully encrypted, and it is impossible to tell whether a network connection, say, one WhatsApp has established to 'external.fbsb1-1.fna.fbcdn.net' or to 'scontent.fams1-2.fna.fbcdn.net', is legit or is used for tracking the user. It is impossible to harden that further, and that is an unfortunate side effect of the use of encryption.

App sandbox profiles can still be hardened further by anyone, not just by Apple at a whim; see https://reverse.put.as/wp-content/uploads/2011/09/Apple-Sand... and https://paolozaino.wordpress.com/2015/10/20/maximum-security... for details.
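
As a taste of what those links cover: a sandbox profile is a small Scheme-like policy file. The toy example below is illustrative only; a real program would need more (allow ...) rules than this to actually run:

    # write a deny-by-default profile and run a binary under it
    cat > restrict.sb <<'EOF'
    (version 1)
    (deny default)
    (allow process-exec (literal "/bin/ls"))
    (allow file-read* (subpath "/usr/lib") (subpath "/System"))
    EOF
    sandbox-exec -f restrict.sb /bin/ls /System

(sandbox-exec is deprecated but still ships with current macOS.)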

Otherwise, since OS X is a UNIX implementation, it can be scrubbed of undesired features if need be:

- SIP can be disabled.

- Gatekeeper can be deactivated peacefully (or via brute force, by setting appropriate host entries to 0.0.0.0 in /etc/hosts).

- Services deemed illegitimate or a nuisance can be scrubbed via /System/Library/Frameworks/CoreServices.framework/Frameworks/LaunchServices.framework/Support/lsregister.

- The rest can be unloaded and disabled via the routine application of launchctl.

- Security policies can be amended via fiddling with spctl.

Yes, all of the above is tedious, and it will break all vertically integrated Apple services and official app store updates, but it will give one a sufficiently 'clean' OS X install. If that is not enough, one can install Linux, NetBSD or OpenBSD to run whatever they please.
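
For reference, the kind of fiddling listed above boils down to commands like these (the launchctl label is hypothetical; csrutil has to be run from the Recovery OS terminal):

    csrutil disable                # in Recovery: turn off SIP
    sudo spctl --master-disable    # turn off Gatekeeper assessment
    sudo launchctl bootout system/com.example.nuisance   # evict a service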

> If Apple Almighty hasn’t thought about your use case, it may as well not exist. There are two modes: either Apple nannying your systems like the overly attached yiddishe mame, or a mad ex breaking up with you, but also ripping out the locks from your doors before they go.

> […] Apple still owns it, snooping over your shoulder, giving you slaps on the hand if you dare „misuse” what they gave you.

You are passing off an unsubstantiated strong personal opinion, or vague, generic hand-waving, as a collection of facts, when none of that actually exists. Apple is certainly not:

- Babysitting my own or my colleagues' OS installs; nor

- Slapping me on the wrist for running anything I want out of my own moderately wild zoo; nor

- Caring about the network traffic that passes through.

And Apple certainly does not own the laptop I have purchased from them; you might have a valid case with iPhones, iPads, or Apple TVs, but personal computing devices produced by Apple are not an example of such an instance.

Lastly, what might be a bug or a misfeature for you is a feature or a conscious compromise (that one has willingly accepted) for some.


This is the point, yes. You can turn most of it off, but then even regular software upgrades break before you even do any modifications. Running an outdated system is not viable in this day and age.

Part of the reason for the locking down is 'security' (though a lot of it is so easy to bypass through Apple's own backdoors that it seems to be more form than function), which makes some sense from the user's point of view. Part of it is DRM for Apple's services: Apple TV, Music, the App Store, iOS apps. I don't care about that, but there is not much I can do about it on their hardware. Soon we'll have a new category Apple will want to lock the system down for: CSAM detection. Of course Apple will be hell-bent on preventing people from tampering with that. It will come with its own host of protections, which I bet will even be active when SIP is off.

Right now there are already a bunch of folders which the root user can't even read. Try running a "find / 1>/dev/null" as root and you'll get a nice list of folders Apple considers too sensitive for the user even to look at.

The problem is not security. The problem is Apple enforces security without the user being able to override their choices. There is no way to make modifications to system files and "bless" them so it will still boot after the next update. You have to do it every time. There's no UI for such features at all. Apple forgets that I'm the owner and admin of my system and doesn't give me the keys to manage it.

It's fine for their new target audience, which is mainstream users with large disposable incomes. It's not great for power users with a Unix background like myself. I lament that macOS has moved away from supporting my use case, because I did enjoy it for many years.

But now I'm happier with FreeBSD with KDE. I really enjoy being able to trace a problem right down to its cause again. I love all the customisation options again.


> At first, it was Gatekeeper. Yeah, appeared in 10.8. Then notarization. Now, on M1 you need to sign your binaries ad-hoc or they won't run. Custom kernel extensions? No way.

I don’t know; from my experience, just building and running works fine. Ad hoc signing is something the toolchain does. I don’t write kernel extensions, so I cannot really comment on those (though I have a couple of them installed as a user and there was no significant hassle). The worst annoyance I’ve seen is having to authenticate every now and then to attach a debugger to a process.

> there's some sacred knowledge that you should be in possession of, and update it every year for $99, so that you can program your computer.

Programming your own computer has no red tape; the difficulties start when you want to distribute binaries, if you don’t want your users to have to right-click once. You can get compilers and things from Xcode, Homebrew, or MacPorts, and they work without you having to pay anything.

> I like the pure Unix approach more, when the line between using the computer and programming it doesn't really exist, and where the system is basically your IDE too, and where you're going from being a user to being a programmer and back without really noticing. Mind you, it doesn't mean you have to go from one to the other, but when you want/need to, it's damn frictionless, and the system is as malleable as you want it to be.

Yes, it’s nice. But my use of macOS and Linux is not very different in that regard (true, I spent quite a bit of time customising Xfce, which was fun). Also, to be a bit realistic, it does not really work for end users in general. For a mass-market OS, the fewer footguns the better.


For NeXT the UNIX compatibility was a means to compete in the workstation market, nothing more.

"Why We Have to Make UNIX Invisible."

https://www.usenix.org/blog/vault-steve-jobs-keynotes-1987-u...


I stopped using Apple products when they started spying on everything I did. Do I really want Apple recording the name and hash of every program I run? Why ask permission and click "agree" to write code for my own use? Checked right out of that ecosystem and I won't be going back.

Windows 11 with WSL2. Sometimes I forget I'm not in KDE.

You probably thought the original iPhone was pretty stupid, too, I would guess.

Didn’t everyone recognize the first iPhone as a revolution? I mean, the entire Android team took a day off, knowing they had failed.

No, there was widespread derision. The idea that people would accept a phone without a physical keyboard was nothing short of heretical in business circles.

Steve Ballmer famously went on one of the popular morning TV news shows and laughed at the iPhone. The fact that he still had a job when he got back to Redmond explains a lot about Microsoft's stagnation under his leadership, and its subsequent return to a successful path once he was gone.


My memory is that MS laughed because they did not believe the hype. The laughing stopped when they got the first iPhones in house and were able to see how much space Apple was able to dedicate to the battery.

Not really. I had a fairly recent Treo at the time. I certainly didn't buy an iPhone when it first came out. Come the 3GS I was definitely ready to go with Apple but it wasn't an instant switch.

Of course, I was also not an Apple customer at the time, except for an iPod from around then.


I was also a die hard Treo user at the time but as soon as I saw the iPhone it was obvious this was the future and I got in line on release day to get one and never looked back.

When the original iPhone came out, it couldn’t run apps and didn’t have GPS. Capabilities that my Blackberry and even feature phones had.

The iPhone wasn’t really good until the iPhone 4.


My first iPhone was the 3GS and I was wowed by it. Having used internet-connected pagers and then phones since ‘97 (if you count SMTP with my various procmail filters as internet-connected), I was totally blown away. My memory of watching a Lego pirates YouTube movie in my living room with nary a cord or keyboard still has a bit of awe attached to it.

Don’t get me wrong. I bought a first-gen iPod Touch as soon as iOS 2 was released in 2008. I also bought a 4th-gen iPod Touch when I was trying to avoid AT&T. I finally bought an iPhone 4 on Verizon.

The original iPhone /was/ stupid until they relented and decided to allow third-party apps with the iOS 2 update.

You say that, but wasn't the original vision the one Google has since adopted? Everything in the browser?

I didn't say it wasn't. I'm pointing out the parallel: some people had the vision to see that the iPhone would _someday_ be game-changing. Others felt smugly superior because they were blind to a future that was obviously very possible in retrospect.

The radical bit is they deliver holistic software and hardware that actually works.

No other vendor comes close.


I find the Google Pixel line fits that description pretty well, but very differently. I prefer the software over iOS, and hardware is good enough, although not nearly on the same level as an iPhone.

> Whatever will redefine the industry will probably be laughed at and only adopted by nerds for a while.

I suspect that virtual reality and augmented reality are in this classic position. Laughed at, but totally adored by a group of nerds. As someone in that group of nerds, the feeling is so much like the early days of the web -- it really is incredibly exciting. It amazes me that anyone can be pessimistic about it, especially given the astounding progress and investment being made. In my view, it's deeply obvious that augmented reality is the next paradigm shift we're all waiting for.


Even if the GP's claim were true, it does not follow that everything being "laughed at and only adopted by nerds" is the next big thing to redefine the industry.

Augmented reality has some nice properties in a number of contexts. In others, it's totally irrelevant. Whether someone will come out with a product that will make it irreplaceable is a different matter.

It's like touch keyboards. Sure, they are better than phone keypad-style keyboards for entering text, but actual physical input still performs better. They are a different paradigm that lets device manufacturers provide a more often desired feature (larger screens). But while they are winning over in places where they do not fit (like in-car infotainment systems), almost everybody prefers physical knobs for a number of functions (like climate control), which will likely lead to a reversal of the trend.


I didn't say that every thing being laughed at is the next big thing to redefine the industry. I only predicted that augmented reality really is one of those things. Further, the person I was replying to claimed that the next big thing to redefine the industry is something that will be laughed at. Neither party claimed that everything being laughed at will redefine the industry.

The claim that augmented reality is "like touch keyboards" is one you haven't justified. Augmented reality is not a technology centered around replacing physical input with non-physical input, so the analogy is not self-evident. You might potentially have a good point, but you'd have to explain how the analogy is relevant to the technology of placing graphics and information at any position in a person's 3D visual space. Remember, most augmented reality objects don't need to be controlled at all to be useful, and you can still control an augmented reality screen with a mouse and keyboard.

I'm interested to hear more detail on your thoughts about why you think augmented reality won't be a paradigm shift. I'm also happy to elaborate on why I think it will, if you like.


The biggest issue with AR imo (or rather, why it doesn't appeal to me as a universal paradigm) is that it requires a device in your field of view to "augment" the reality.

Sure, a HUD in a car with AR is nice. A pair of binoculars or a telescope with AR is nice. Even a camera works. As long as the device is already "there", AR is a nice improvement (look through a telescope and it identifies stars or constellations for you) — provided it can be turned off for when you want to enjoy simple reality.

But otherwise, it's an unnecessary gimmick that you won't bother to use, simply because it's not universal (you won't get a pair of AR sunglasses for the day, and see-through glasses for the night, esp if you are not wearing glasses otherwise).

With a lack of universality, I don't see it as a "paradigm shift".

And there are orthogonal concerns like privacy. As processing speed improves, privacy concerns will diminish (since you can have on-device processing), but just like with high-resolution screens, this requires a lot of processing (visual data takes a lot of bandwidth, and while it's quadratic growth for 2D data, it's even more for stereoscopic or 3D data), and we are progressing towards that only slowly.
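To put rough numbers on that bandwidth point, here is a back-of-the-envelope sketch (all figures are illustrative assumptions, not from the comment above):

    // Back-of-the-envelope bandwidth for an uncompressed stereo AR feed.
    // Resolution, refresh rate, and pixel format are assumed values.
    #include <cstdio>

    int main() {
        const long long width = 3840, height = 2160;  // per-eye 4K (assumed)
        const long long fps = 90;                     // common AR/VR target
        const long long bytes_per_pixel = 3;          // 8-bit RGB

        const long long per_eye = width * height * bytes_per_pixel * fps;
        const long long stereo = 2 * per_eye;         // two eyes
        std::printf("stereo raw feed: %.2f GB/s\n", stereo / 1e9);  // ~4.48 GB/s
    }

Even before any scene understanding, just moving that many pixels through the device every second illustrates why the privacy-friendly, on-device path is arriving slowly.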


The M1 iPad Air is actually $599 for the entry-level 64 GB, which nowadays isn’t that much storage. If you go for the 256 GB (there’s no 128 GB, for profit-maximization reasons) and add the Pencil plus maybe the missing charger, then you’re already at over $900.

What is the average person actually storing on their iPad, though? Photos taken with the iPad, maybe?

I think for most people it's just an expensive, nicer Chromebook. Everything they want to consume needs an internet connection anyway.

If you're a creative or just a nerd, then sure, you'll need to spend more money to get the specs you need.


Or photos/videos taken with the iPhone. The iPhone now starts at 128 GB, so it’s surprising they still start the iPad Air at 64 GB.

Downloaded videos for long flights! You have to get a storage bump to actually get a variety of content downloaded.

Also local copies of cloud storage are very valuable to keep on iPad.


One thing I've really appreciated on flights is the addition of streaming videos to your own device. If it saves fuel and maintenance costs, I'm perfectly fine with them ripping all the personal TVs out of the plane as long as they can keep a selection of movies available in case I forget to download my own and don't feel like reading.

I think you missed the “average person” bit.

The average user does not know what local copies are.


Why wouldn’t they? Most of the popular streaming apps have download functionality.

Because they are busy talking to their mice.

Might want to check your dosage on the "omg no bundled charger omg omg the world is ending" memedrugs.

The iPad Air ships with a USB-C charger. A nice one.


Thanks for the heads up, can’t edit the comment anymore.

Is it less hot when writing?

I have the original Pro, and I won't buy another iPad until they don't get so hot when taking notes.


I’ve had multiple iPads (though, admittedly, not the original Pro) and none of them has had a perceptible change in temperature when writing.

They've pulled off major hardware transitions twice before without hiccups. But this time they make the whole platform, which is quite impressive imo.

Gosh, I had to think back a moment to remember 68k to PPC. I wonder if that transition could be considered “botched” in that it happened at all instead of going directly to x86. Even without hindsight, I recall it was considered a questionable choice at the time.

It wasn't really though. Motorola was going nowhere. As to why not x86, that's another story but Intel has gone down the wrong path several times. Like with the Pentium 4.

Not really, PPC was a reasonable choice for a high-performance architecture back then, and arguably a better fit for former 68k coders than x86. And a move was necessary because the 68k was becoming a dead platform by then.

My recollection is that the PPC at launch was much faster than x86. Jobs talked about the road map a bit, and there was a lot of press about it too, but the road map didn’t pan out, and their partners dropped the ball. And many other companies made the transition to x86 (Data General was one I worked with) and subsequently died.

The biggest problem was that Motorola/Freescale saw their future in embedded, not the (relatively) high performance required for personal computing. So the chip provider for Apple's mobile lineup was no longer producing performant chips. Unfortunately, IBM's implementation of PPC leading a dual life as a server/workstation and desktop chip meant that getting its power consumption, heat profile, and performance optimized for a mobile device was an extremely difficult proposition.

It would be interesting to see where we'd be if IBM had assured Apple they could deliver a G5 laptop and had done so at a price and spec competitive with, or superior to, Intel.


> The biggest problem was Motorola/Freescale saw that their future was in embedded, not the (relative) high performance required for personal computing.

Ironically that's also the niche where PowerPC ended up when Apple dumped them :)


Yeah. The 601 was on par with the P54s, and the 604/604e were a bit ahead of the Pentium Pro and Pentium II of the era. The G4 vs. P3 and P4 is when the problems started for Motorola, and IBM eventually ran out of steam with the G5.

The striking thing about this time is that it is essentially transparent to most people. Probably more third-party programs have been broken by security changes Apple has made over the past five years than by the M1 transition. Yes, there are performance implications, but the M1 is sufficiently fast, and the most important performance-sensitive programs are being quickly ported, so it doesn't matter that much. (And, for most people, ultimate performance is mostly not a big deal on laptops these days.)

Transitive's tech, combined with processors that can afford some inefficiency, is pretty much magic to anyone who remembers what ISA transitions used to look like. (As a hardware product manager, I lived through a couple of them including a Motorola 88K to x86 one on Unix.)


I feel like more software died during the 32->64 bit transition than at any other time in recent history. Pulling the rug out is also par for the course for Apple, even artificially so with their latest crackdown on apps that don’t run well.

That mostly seemed to be video games (at least their developers complained the most), but I just had to wonder why they couldn't make their games 64-bit-ready with 15 years of warning.

Btw, 32-bit Windows binaries still run through WINE, just not 32-bit Mac binaries.


Games just don’t have a long enough tail to make this kind of effort worth it. They aren’t like business software where you try to keep people on the hook for years.

There’s little reason to keep updating a game past its first few post release patches. So you can assume most of their code bases have been untouched for years, were probably written by people who are no longer around, and haven’t been compilable by modern toolchains for a long time.

The tech debt is just insurmountable most of the time.


> but I just had to wonder why they couldn't make their games 64-bit-ready with 15 years of warning.

First off it wasn't just games. You can't run the old Microsoft Word anymore, for example. Why is this important? The old Word did everything I wanted. The new Word has fewer features than the old one. Also, I already owned the old one. Now I have to buy it again even though the old one was just fine.

For the games, I want to play my 20 year old games. Now I can't unless I fire up an old computer that is not upgraded and not on the internet (because it's horribly insecure). The companies that made those games don't exist anymore. And even if they did, it wouldn't be worth it for them to compile them again.

And on the iPhone I had a ton of awesome kids games. Those companies are also out of business, so again, I can't have the kids play those games unless I use an old insecure phone.

This is one area where Windows shines. For the most part you can still run 40 year old software on Windows.


The 68K to PPC transition was pretty good. But the operating system was running emulated 68K code for five years.

That was more or less intentional. The 68K interpreter was quite fast and 68K asm is smaller in memory than PPC asm, so converting all the cold code over would've made the system slower.

I'd like to see the same idea used today. One example I can think of is C++ exception unwinding, which uses a DWARF interpreter instead of generating tons of code to do it.
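As a rough illustration of that table-driven idea (a minimal sketch of how the common Itanium C++ ABI used by gcc/clang works, not anything Apple-specific): the compiler emits DWARF call-frame tables that a runtime unwinder interprets when a throw happens, so the rarely-taken unwind path costs table bytes rather than inline instructions at every call site.

    // The throw below is resolved by the runtime interpreting unwind
    // tables (.eh_frame), not by unwind code generated at each call site.
    #include <cstdio>
    #include <stdexcept>

    void inner() { throw std::runtime_error("boom"); }

    int main() {
        try {
            inner();  // unwinder walks the DWARF tables back to this frame
        } catch (const std::exception& e) {
            std::puts(e.what());  // prints "boom"
        }
    }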


You’re looking through rose-colored glasses. My PowerMac 6100/60 ran 68K apps slower than my LCII with a 40 MHz 68030 accelerator. The original first-gen 68K emulator was really bad.

Connectix became famous for shipping SpeedDoubler, a much better emulator. But my 6100/60 could still barely keep up with the 40 MHz 68030.

The emulator performed even worse on the PPC 603.


That could be; I think it was explained to me by a former engineer on Mac OS 8 or so. But there was still 68K asm in there up until Carbon.

It’s always interesting to me how Apple gets so much praise for taking things away, almost always needlessly and with some made-up excuse meant to sell more of something, only to later bring them back as if they’re some oracle of utility. Big surprise, Apple: non-arbitrarily thin computers, usable keyboards, and ports are useful.

The M1 chips are nice. But Apple also continually throws developers under the bus forcing them along their deprecation strategies.


I used to feel like you do but nowadays I applaud that (while still not really using Apple products much due to the walled garden ecosystem) because it's really hard to take something away when you know it's good but could be better. For all the things that they've "taken away" over the last 10 years, it feels like the newer laptops are significantly better than they could have been if they just kept on adding and making incremental improvements like most other manufacturers.

No kidding. I bought a reasonably high end Thinkpad for myself and the entry level M1 MacBook Air for my son to do his schoolwork on. The performance, battery life and screen are much better on the M1 despite being significantly less expensive.

Hard disagree. At least from a business perspective. The transition to Apple silicon was likely more about long-term cost savings and their ruthless vertical integration. Apple appeals to customers (most customers, HN users are outliers) as a luxury brand and their market dominance is based on their brand more than their products. The products need to look distinct with high quality finish. Like Louis Vuitton bags or Polo shirts. Otherwise they'd be back to occupying the niche of the laptop for graphic designers.

One thing that is particularly under-appreciated is that Apple have pulled off a textbook disruption play against Intel; it's supposed to be a small, scrappy new company that disrupts the slow behemoth incumbent, but in this case one of the biggest companies in the world has executed it perfectly.

In this case, of course, having multiple products to graduate their own silicon through, and enough cash to design and build a chip, requires you to be a huge company. But it shows strategic excellence and a long-term planning horizon to pull this off.

(Note I’m using the original strict definition of Disruption from Christensen, not the hand-wavey TechCrunch slogan.)


Was Intel really "disrupted"? The M1 designs aren't available as general CPUs for uses other than in a Mac.

They've taken a hit to their revenue by losing those sales. There's also the reputation damage of losing a prestige customer like Apple.

I work for a Fortune 500 company. About 70% of the employees have a MacBook Pro. Until recently, all of them had an Intel chip inside. Going forward they will have an M1. We are on a 3-year refresh cycle, so within 3 years the majority of company computers will be running M1. About 90% of company phones are iOS. If Apple starts using M1 in those…

iPhones have been running Apple silicon ARM for a decade. The M1 chip is quite similar to the chip used in the 2020 iPad Pro. It's done.

I also work for a Fortune 500 company. Unless the employees are doing something related to the Apple ecosystem, they will be carrying Thinkpads instead, or using classical PC desktops, with external contractors having virtual workstations via the various cloud offerings.

Overall, Apple's desktop market share across the world is still around 12%.


Yeah. At tech companies and tech conferences, you get a pretty distorted view of the sorts of computers most people are using. Especially taking into account the fact that you probably also see a disproportionate number of people running Linux on Thinkpads in a lot of places, one might assume that Windows is barely used by looking at laptops at the typical event I attend.

I don’t work for a tech company. I work for a major sporting goods company.

I'm sure there are exceptions. Nonetheless, something like 80% of the PC market overall is Windows.

It's not really Windows market share that matters; it's the instruction set. The x86-compatible instruction set is dominant today.

If Apple really wants to disrupt Intel (and, I guess by collateral damage, AMD), they will release the M1 CPU as a commodity. But they will also have to figure out how to get Microsoft to play ball as well (which I am not sure they will).


Apple is basically already using M1 in iOS devices. The A14 is to an M1 roughly what an M1 is to an M1 Pro.

I remember hearing that Steve Jobs had originally asked Intel to develop a CPU for the iPhone (Apple had a close relationship with Intel back in the days when they switched from PowerPC). I remember Intel being on stage at Apple keynotes, and Apple also got "first dibs" on a lot of new Intel CPUs back then. But Intel didn't believe it would be profitable enough for them to pursue.

Apple had been screwed before, back when IBM wouldn't make them a power-efficient G5 chip for laptops. Then Intel wouldn't make them a power-efficient phone CPU. So here we are today: Apple does it themselves. Had Intel made a phone CPU for the iPhone, the M1 might never have existed.


Yeah, Intel had the opportunity to avoid this outcome, and they fumbled on the mobile sector. Ben Thompson gives a deep analysis of the history here: https://stratechery.com/2021/intel-problems/

With the absurd efficiency gains Apple got, I imagine most premium Windows/Linux laptops will run ARM in 5 years.

For that, they'd first have to sell some. Outside of MacBooks, I'm not aware of a single premium Linux/Windows laptop that runs on ARM. Until HP, Dell, and Lenovo offer those, there won't be any uptick in ARM laptops outside of MacBooks. And most companies won't buy the first model running an ARM processor; they'll first want to see how driver support and virtualization of x86 apps (of which there are a lot in larger companies) work in reality.

The vast bulk of Windows laptops that go out the door go to companies that want them to be as boring as possible. This is probably the primary reason Linux desktops never made mainstream headway. Pretty much anything that increases support costs is largely a non-starter.

> was intel really "disrupted"?

Yes, Intel was disrupted. It was just disrupted by ARM. Originally the chips are too slow for normal use, but find a niche in low-power devices. Over time the rough edges are smoothed out, and the underdog can make inroads in the high-end market while the incumbent isn't structured to compete in the low end very well, and its attempts fail.

> M1 designs aren't available as general CPUs for alternative uses other than on a mac.

M1 is just the capstone of that long process. This is sort of a wrinkle, but Apple's strategy means they can build a high-margin product ("We don't strive to make the most popular product, we strive to make the best.") and not have to hand those margins over to suppliers. Given the high margins M1 chips command when placed in Macs, it doesn't seem likely to pressure Intel.

But make no mistake, Intel's margins are strongest on servers. Prices on Xeons are like 10x those of Cores. This is where the disruption is happening. Running macOS on M1 in AWS is neat but is probably for running build and test farms; Graviton is presumably the ARM chipset AWS customers might ditch Intel for. I've met teams that saved substantial money by switching, and that has to cut into demand for Xeons at some point.

The typical way a firm might survive an "attack from the bottom" is to chase after the high-value compute. In 2022 that's AI / TensorFlow, where Nvidia rules and, frankly, Intel underperforms. Hopefully Nvidia pays Mark Harris well, because they probably owe him a few billion.


> Intel's margins are strongest on servers

Not anymore.

Intel CCG is at 9.3b revenue/2.8b profit this quarter (30% operational margin). Intel DCG is at 6b revenue/1.7b profit this quarter (28% operational margin).
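For what it's worth, the margin arithmetic in those figures checks out (a quick sketch using the numbers as quoted):

    // Operating margins from the revenue/profit figures quoted above
    // (billions of dollars, as given in the comment).
    #include <cstdio>

    int main() {
        const double ccg_rev = 9.3, ccg_profit = 2.8;  // Client Computing Group
        const double dcg_rev = 6.0, dcg_profit = 1.7;  // Data Center Group

        std::printf("CCG margin: %.0f%%\n", 100 * ccg_profit / ccg_rev);  // ~30%
        std::printf("DCG margin: %.0f%%\n", 100 * dcg_profit / dcg_rev);  // ~28%
    }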


Intel was "disrupted" by their own doing when they missed the bus on the mobile revolution and the GPU revolution. And the M1 is very much a niche thing. It's an SoC, not a CPU, which is what Intel mainly produces, so comparisons of M1 and Intel processors are bound to be somewhat flawed.

It's definitely not the case that the full lifecycle of disruption is complete, since that entails the full replacement of the incumbent. But with the M1, Apple silicon surpassed Intel in performance after many years of inferior performance improving at a greater rate, which I think is one of the key milestones in a disruption story.

I’m not sure there are any case studies of one behemoth fully disrupting another, so who knows what the end state will look like.

One confounder for predicting the endgame is that due to their vertical integration with their OS, Apple won’t replace Intel where Windows is a hard requirement, and so they probably won’t completely decimate Intel. I suppose in this aspect it’s not a full disruption.

I’m not really clear how much inertia there is behind Windows these days, now that MS is committed to Office on Mac. Definitely substantial, but if the price-per-performance were 2x or better for Macs in a couple of generations, what % of corporate buyers would switch over?


Apple has pulled off several big hardware transitions: 68k -> PPC -> x86/x64 -> ARM. One reason those transitions are considered successful is that they did a masterful job of managing expectations. Apple showed that the consumer hardware market really doesn't care enough about backward compatibility to affect the bottom line. And thanks to Moore's law, supposing you can legally acquire a ROM, your M1 Mac will happily run 68k and PPC code in addition to the officially supported x86/x64 emulation.

The lack of hiccups is mostly by training the user base to assume that things will just stop working eventually. It's part of the deal with apple; if you're a legacy software enthusiast Apple probably alienated you long before the M1.

> They pulled off a major hardware transition without big hiccups

This is easy to do when you eschew any and all commitment to backwards compatibility. Every major OS update has introduced breaking changes, and I gotta be honest, everyone except the most diehard of Mac fans is getting pretty sick and tired of routinely re-purchasing software, not to mention Apple breaking their own applications. Example: obsoleting iTunes by force and coding hooks so it refuses to run without patching the app. (For those wondering, Apple coded a version check into the Mac version of iTunes so that it refuses to launch if the OS is "too new.") Ignore the fact that Apple Music is a shitty replacement that didn't replicate all the features of iTunes (like removing the XML library sync, which a ton of third-party apps relied on, breaking them all in one move), but don't let that stand in the way of progress.
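For illustration, a launch gate of the kind described might look something like this (a hypothetical sketch, not Apple's actual code; the version numbers and message are invented):

    // Hypothetical sketch of an "OS too new" launch gate, as described
    // above. Versions and messages are made up for illustration.
    #include <cstdio>
    #include <cstdlib>

    struct Version { int major, minor; };

    bool newer_than(Version a, Version b) {
        return a.major != b.major ? a.major > b.major : a.minor > b.minor;
    }

    int main() {
        const Version running = {10, 15};  // assumed: detected at launch
        const Version ceiling = {10, 14};  // assumed: hard-coded ceiling

        if (newer_than(running, ceiling)) {
            std::fprintf(stderr, "This app cannot run on this version of macOS.\n");
            return EXIT_FAILURE;  // refuses to launch, per the comment
        }
        std::puts("launching...");
        return EXIT_SUCCESS;
    }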

Show me one person that has never been affected by Apple breaking apps, and I'll show you ten people with Avid or Pro Tools rigs who "never update the OS" as a general policy. It's 2022 and I'm still waiting for official EOL/support dates published by Apple. Everyone from Microsoft to FreeBSD does this. Saying "well they traditionally support N-2" or whatever doesn't cut it. People need hard dates so they can plan adequately. Apple's commitment in this area is like depending on your stoner friend to follow through on a promise.


I rarely see this sentiment since the entire Tech class has moved to Spotify subscription streaming as their music platform of choice (slight hyperbole, only slight). But as one of the 12 remaining iTunes app users out there, I am shocked at how terrible the software has become.

The 2005 version of iTunes that ran on my PowerPC MacBook was strictly better than the app I have installed today on my Windows 10 gaming PC. The 2008 iPhone iTunes app was better than the "Music" app on my iPhone today -- when I open my phone in airplane mode on a flight to listen to music, it bricks itself for 10 seconds trying to get a wifi connection and prompt me to pay for their crappy music streaming service. There is no way to disable that self-bricking advertisement.

I suspect that the average Apple employee uses Spotify to listen to music and doesn't have a large personal collection of songs not available for streaming on common platforms. The lack of dogfooding shows through.


Funny you mention that. I recently accepted their offer for a free 3 months, mainly out of exasperation of constantly seeing the interstitial every time I launched the app.

The UX of Apple Music is downright horrible. It proceeded to completely hose my existing music library. I could no longer access songs I had purchased in the iCloud Library, and it threw up a dialog stating I need to sync every Apple device I own with Apple Music to be able to stream music from their service. I was on a trip at the time, so good luck with that. Tons of embarrassing UI glitches, like widgets overlaid on top of others rendering them unclickable. Did Apple fire all their QA staff?


As another point of anecdata: I still use iTunes 10.6.3 on Mac OS 10.13 for this reason.

It's also modded with coloured icons. I still use an iPhone 4S with USB sync, and iPod 5.5G video.

The laptop (2014 Retina 15") has a 4 TB Sabrent SSD upgrade inside, using an M.2 adaptor. The iPod has a 128GB Kingspec SSD.

It's an intentional choice to lag behind, which will probably happen until the Digital Hub model makes a comeback. Privacy is easy to control when it syncs over USB.

I actually downgraded the library from iTunes 10.7 to 10.6.3 so that I might be able to use it on an OQO Model 2, Mac OS 10.5 Hackintosh, or PowerPC. For now though, I still just keep it going on the Retina: the beautiful screen, weight/ports balance, and repairability still make that the best model of Mac IMHO.

When Apple brings replaceable SSDs to the M1 though, I may well consider leaping forward. Ideally with a 3.5" phone again too.


Lack of dogfooding is part of it, but I have to wonder how many product managers they've brought in to convert their player to a cloud music service.

Requiring you to sync all your devices with Apple Music is one of many changes meant to make the transition. It's a big one because a player is very different from a service (historically something Apple have been bad at), but they've had a lot of time. Apple have been able to get this far because going "all in" is accepted by Apple customers more than by those on other platforms.


> Yes, I feel like this is somehow still massively underappreciated.

No, it is overhyped (not by Apple), and it worries me. Apple's platform jumps are impressive (and this last seamless platform switch is more about macOS and good decisions NeXT made a very long time ago: developer-empowerment decisions), but let's not confuse that with how bonkers every user of M1 and family is going. We have a bit of anecdotal data coming in: no one can believe, and I quote 5000 new M1 owners, "how snappy" the M1 is.

But look at the benchmarks. Each M1 model is a typical and negligible increase in performance over the most recent previous Intel model. We are talking about 1.05x performance increases! For example, take the 2018 6-core Intel Mini with the 3.2 GHz processors and compare performance to the 2020 M1 Mini, and it is immediately apparent that the 2020 M1 Mini is really only a little bit more performant. And this is not bad news; it just means everyone is out of their minds, but it is typical of new Apple releases. The new models are always just a little more performant than the last revision (except 2010-2012, when models doubled in performance 3 years in a row and did not double again until 2018).

So the hype on M1 is overwhelming, and the M1 and family are not at all under-appreciated. People seem to think M1 is miraculous, and I admit it is pretty neat... even after comparing benchmarks with previous models and realizing this is not a quadrupling or even a doubling of performance; the increase in performance from Apple switching from Intel to ARM is merely incremental. This was a lateral move, not a giant leap forward, not yet. Go look for yourself. But don't be sad about it; again, this is entirely typical of new Apple releases... the new models are always just a little better than the last revision. Of course, performance is not the only metric.

So the hype says M1 walks on water, but misses the truth: M1 does what x86 does with less power. Again, M1 isn't what a lot of people are saying... it doesn't blow Intel out of the water, it merely keeps pace with Intel, and that is impressive enough... but add the efficiency gains, and what we have are chips as good as Intel's (more or less) that use less power. Anything evangelized beyond this is crazy hype.
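To make that distinction concrete, here is the arithmetic with made-up placeholder numbers (not real benchmark scores):

    // Illustrative performance-per-watt comparison. The scores and
    // wattages below are placeholders, not measurements.
    #include <cstdio>

    int main() {
        const double intel_score = 1000, intel_watts = 65;  // hypothetical
        const double m1_score = 1050, m1_watts = 20;        // hypothetical

        std::printf("raw speedup: %.2fx\n", m1_score / intel_score);  // ~1.05x
        std::printf("perf/W gain: %.1fx\n",
                    (m1_score / m1_watts) / (intel_score / intel_watts));  // ~3.4x
    }

A ~1.05x raw speedup combined with a ~3.4x perf-per-watt gain is exactly the "keeps pace, but with less power" story.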


MacBooks in particular went through a period with some notable downs--through some combination of design, engineering, and manufacturing missteps. Even my 2015 MacBook Pro had to get its display replaced (after Apple extended the warranty) to deal with a defect. But there was basically no MacBook between then and now that really tempted me to upgrade. (And the 14" M1 Pro is pretty much perfect for me.)

> the M1 era of Apple is the more exciting than things have been in years.

Abso-frickin-lutely. The 2020 MacBook Air M1 is the best laptop hardware you can buy on the market right now. The battery life is amazing, and this makes the laptop almost invisible since I am hardly ever struggling to find power. The sound is great as well. Price per pound, you cannot beat it.

My one small gripe is the black glass bezel, which turns into a distracting fingerprint magnet.

They do need to up their game on SSD storage, but I am sure the MBAs at Apple do not care, because this drives people to buy iCloud storage. And if that is the case, they really need to work on iCloud, because the syncing sucks.

I would certainly buy an iPad M1 if they let me run apps like LibreOffice, so they need to get their act together on software. Yeah, I have a lot of issues with their software. Software, IMHO, is where they really need to innovate.

Once Asahi Linux is stable I will probably abandon MacOS again.


Yet it still does not have a touchscreen, and I personally would prefer a 2-in-1.

Sounds like you might prefer an iPad Pro with a keyboard.

I would instantly go for an iPad Pro if it would run normal macOS, or things like VS Code and Docker, and games. I just can’t justify to myself the expense compared to an M1 laptop, just for the form factor.

Unfortunately, iPadOS is the limitation on the Pro. I tried to make it work, but it's back to its previous position as a media-consumption and music-production box (which is annoying to deal with due to the lack of audio outs) and occasional text editor.

Sure, the iPad has its downsides. But it also makes complicated things dummy-easy. Example: tossing a pixelated/blurred box onto a video. That's ridiculously complicated on Windows and requires a hell of a steep learning curve. No problem if you have the time. But a blocker if your time and your video-editing skillset are short.

Your example is one enabled by the application, not by the platform. You can find easier video editors on Windows or on Android as well.

Sticking a keyboard on an iPad does not a laptop make. Being limited to mobile app versions of web browsers is itself a big enough quality-of-life downgrade to make the setup much less convenient than a laptop for leisure-time media consumption, not to mention professional work.

Sticking a touchscreen on a laptop does not a tablet make.

GP wants a laptop with a touchscreen, not a tablet. My laptop has a touchscreen. It's not the primary input method, but it is quite handy sometimes.

I'm not really interested in a touchscreen on a laptop, but I'd buy a Pencil immediately if it worked on my Mac screen.

In what way?

Indeed, lack of a touch screen would be a deal breaker for me. The iPad is attractive, my kids have them, and I might convince one of them to let me use it for a week this summer to see if it handles basic things like Jupyter notebooks and talking to homemade hardware gadgets.

I tried. The limited software and lack of desktop OS made it painful. I wouldn't try it again, personally. It felt like an exercise in compromise after compromise after compromise.

If you’re asking to use macOS with your fingers, you have not realized how terrible that would be. I do not mean in the sense that a desktop OS is terrible for touch input. I mean that macOS specifically is not built for fingers and would require the kind of work that Windows has been doing for a decade at this point.

It's not like Windows 10 is touch ready in any real sense, either. Windows 11 fixes some of the basic problems, but the gold standard for a desktop OS that's productively usable in tablet-only mode might ironically be GNOME on Linux.

What I do right now is typing with the keyboard and extensive use of the GUI with my fingers. So I'd still want at least a detachable keyboard, but also use it in full tablet mode. Among its uses, I'd read sheet music from it at band practice.

I have a Dell with a touchscreen. I never use it. The 16x9 ratio is the wrong ratio in either portrait or landscape.

I also have a Dell with a touchscreen, and at 6yo it desperately needs a CPU upgrade. I use touch quite often.

How does a touchscreen work on a desktop? Given that nobody but Apple has made a decent touchpad in, what, 15 years, I'd first assume the hardware would suck. But OK, let's assume that works. Don't a ton of desktop interfaces rely on hover, scroll, etc.? For what purposes is a touchscreen superior, assuming you have a mouse/touchpad at hand?

I have a Lenovo Yoga 720 from back in 2018. There's a bunch of input methods: touchpad, touchscreen, pen on screen. The touchpad is better than a Mac's (just as responsive, but gestures are customizable). The touchscreen is fantastic, probably more responsive than most Android phones. The pen works really well, though it is worse than the Apple Pencil. Scrolling is done the same way as on a phone. As for hover: most websites are built with mobile in mind, so hover is very rare.

The Yoga line does 2-in-1s right. Try to check one out; you'll be pleasantly surprised.


That's great to hear; that's much better than I thought. Beating Apple's touchpad I find hard to believe, but I'll make sure to try one out if I get the chance!

I did like the move towards fewer ports, although it was inconvenient at times. I do wonder whether things could have gone better had Apple incentivized the ecosystem to move more to USB-C. If there were lots of monitors and TVs actually supporting USB-C/Thunderbolt it would be nice; it's a thinner, nicer cable and also has more bandwidth.

Let’s say it this way in another domain: no matter the rationalization, Mikey and Bob with Jerry is completely different than with John.

That’s all this is. Not that big of a deal, but without a doubt very different.


> Look at what happened to the MacBook Pro, losing most of its ports and the thinness causing them to put a much worse keyboard in it that caused massive problems. Sacrificing a bit of thinness and going back on those changes with the newest iteration has been much better.

Adding a useless Touch Bar and losing F-keys also doesn't do much to win over fans, and it should be stressed that the infamous MacBook Pro keyboards were a constant source of problems.


I love the Touch Bar!

I don’t love the Touch Bar entirely, but I do really like the slider for brightness and volume.

I’d rather have F-keys for everything else. Maybe they could give us a mini Touch Bar just wide enough for volume and brightness.


I thought I loved the slider for volume/brightness and was concerned about losing it, but then realized how little I cared when I went back to no slider.

I would love a full width Touch Bar right above the fn key row they just brought back. I don’t see why it has to be one or the other.

I’m surprised no one made an app that turns Touch Bar into fn (without having to press fn) and a button to switch to the apps choices.

Love Spotify with the Touch Bar, debugging with VS Code, etc. Shame it was hated.

A mini Touch Bar with fn keys above would be lovely!


> I’m surprised no one made an app that turns Touch Bar into fn (without having to press fn) and a button to switch to the apps choices.

That wasn't the issue. My main gripe with the Touch Bar is its lack of tactility; with real keys, I don't need to look down at my hands to pinpoint the location of "F5" for debugging in VS Code, nor do I need to make sure my finger is hovering over the escape key before I press it. On top of that, capacitive touchscreens just don't make good buttons; my fingers frequently bump against the screen and trigger mutes and screenshots that simply wouldn't happen with a button. It's something of a usability nightmare.


> I’m surprised no one made an app that turns Touch Bar into fn (without having to press fn) and a button to switch to the apps choices.

This has been a setting in Preferences since a year or two after Touch Bar was introduced.


I think the touchpad would be a better location for the Touch "Bar" than an additional row.

Brightness and volume are actually my two biggest Touch Bar annoyances… with older MBPs I could simply feel my way to where I knew the keys were via muscle memory and adjust them with a few quick taps (or one long press) without looking, or even with my eyes closed. Near impossible with the Touch Bar.

Always good to remember other people can have different experiences, of course, so ymmv.


The volume slider is useless to me. I adjust volume with the scroll gesture on my trackpad.

Coworkers always talk about how there are people out there who like them, but I've never met one. Glad that you like it; it drives me nuts.

Oh, I would love it too, it's a really cool idea—but not at the expense of the F-keys. It's like adding backup cameras in cars; it's a great idea—but not if it replaces all the old physical radio and HVAC controls.

Ah, but the DJ demo at WWDC using the touch bar.... /s

It is not uncommon for media to create controversy to sell the author's book...

You are not wrong. But in this case there was a significant amount of contemporaneous reporting when Ive left, predating this book.

It doesn't mean the investigation is completely false either; you just have to pinpoint the truth in between.

The funny thing is Apple probably transitioned to their in-house ARM architecture in part because the Intel chips ran too hot and throttled in the ultra-thin Ive products.

Agreed. I have looked at Apple products for years but couldn't make up my mind to switch, until my company gave me an M1 laptop, and OMG it is so good. The last time I had this feeling was when the iPhone 3 or 4 came out.

My previous company also gave me a MacBook Pro, but that was 2017 and I found enough quirks not to buy one for myself.


Apple's hierarchical organization based on technical expertise is key to its innovation: https://hbr.org/2020/11/how-apple-is-organized-for-innovatio...

Everything else is a complement, but they don't drive leadership internally or externally to stay ahead of the industry.


Thinness matters when everything else is an absolute chonk.

But there’s a point at which it becomes basically worthless to get thinner.


Small nitpick: the butterfly keyboard was problematic because of its high failure rate. Many, me included, actually liked the feel better, along with the smaller key travel.

That was one of many reasons.

I would say the majority of people disliked the small key travel; that's why Apple "fixed" it in the latest iteration, specifically calling it out as a benefit.

What do you like more about it compared to the newer one?


In particular, my main mechanical keyboard is a 35g electro-capacitive one, so I've become pretty adjusted to lower key weights. The new MacBook keyboards are just too stiff and have too much travel for me to type fast without getting fatigued.

I think that even if one is accustomed to heavier keys, all things being equal, learning to type on lighter, shorter-travel keys can make one faster at typing.


What keyboard? I've been using a low-profile Keychron with MT3 caps recently. Great feel with very short travel. Everything else feels so sloppy now.

NiZ Plum EC keyboard. The combination of 35g weighting and the dome switches is a great combo: light activation while still letting you rest your fingers on the keys without triggering them, because the activation point is at the top rather than linear.

On the other hand, I use pretty light switches on a desktop keyboard and still was thumping all the way down on the butterfly keys.

I didn't hate how they felt in the act of typing, but I hated that they hurt my hands after a few hours. And (as you note) if you breathed near the laptop, they'd jam.


It's a big deal because a) they never fixed it and b) you have to replace the whole top half including the touchbar, etc. Mine died twice under 3 years of Apple Care and is going out again now. I don't like the short key travel, but that's such a tiny detail to me. The 4 USB-C ports are fine with me except the external video compatibility is such a flaky mess.

I'm stuck using this "cool design" as a desktop now because it costs ridiculous money to repair. And I get to flip my monitor on and off twice to get it to come back from sleep.

This is my 4th macbook pro. Previously, I had 1 battery problem in my first 2008 model and it was replaced at the Apple Store same day. My old macs ran until the batteries finally swelled years later. They weren't just sleek and cool, but super high quality and amazingly sturdy.

The other thing that stinks is that the issue wasn't something accountants did to save some bucks, but a design feature that cost me more.

I'm honestly only buying an M1 because I know that they've left behind the sexy-at-the-expense-of-the-customer approach.

I think Ive sans Jobs got too focused on design and not customer experience. Apple made excellent hardware before Ive, and likely will after. Just maybe not as many fashion shows.


I agree the reliability was a massive problem. I had to replace the keyboard (and thus logic board) for every one of those MacBooks I owned, until I started using a keyboard cover to prevent the issue (which Apple actually recommends against). No doubt after the 1-year warranty the keyboard would have failed again had I not upgraded on an annual basis.

With my machine of that era I just gave up and started using an old Bluetooth keyboard I had around. Even though it was covered under AppleCare, by the third time my F key was showing the signs I just couldn't stomach the hassle.

No, it was problematic because most people hated the lack of depth, even if you didn't.

I didn't say most because I couldn't really prove "most". Do you have a source showing that most hated it because of the lack of depth? Most articles I saw were just talking about the reliability issues.

> Honestly to me the M1 era of Apple is the more exciting than things have been in years

The average person doesn't know or care about M1. If you are on HN, you are an enthusiast ("Pro" in Apple parlance). To everyone else, Apple just made their already pretty quiet and fast laptops, quieter and faster.

I think the article is right that the world is waiting to see if Apple's new bottom-up design org can deliver a new category-owning product. So far, they've proven that they can improve the legacy suite of products in meaningful ways and aren't afraid to roll back poor past decisions. I think the author is probably right that Apple's services org is getting much more attention than in the past.

When everything flows downwards, you get a singular vision, blind spots included. I think we saw that with Ive. This was true with Jobs' Macintosh too, before Ive joined. Today, we have fewer blind spots, but we haven't seen evidence that there are leaders willing to take big swings into new categories. Time will tell...


50% of customers purchasing a Mac in Q2 2022 were new Mac users: https://www.macrumors.com/2022/04/28/mac-users-q2-2022-new-t...

Which is actually incredible when you think about it. They might not know or care about what "M1" is (although I doubt that), but it's clearly a commercial success.


Great stat. I might be underselling the achievement... I bet some of that growth is driven by the insane battery life, which we know people care a lot about. To be clear, I own and love 2 M1-based machines :-)

I stand by my overall comment, though: They are better Macs. Not a new category or a new product for Apple.


They are powerful ARM laptops. I'm considering getting one purely because of that + Asahi Linux. My Pinebook Pro just doesn't cut it as a main machine, and I'm sick of x86.

x86 is bloated. Linux on a Mac is a dark forest, though. You might be better off sticking with macOS, even though it's gone downhill lately.

I stick to free software whenever possible, so I wouldn't seriously consider using macOS. I even run Guix System (one of the FSF-approved distros) on my current main machine (ThinkPad T440p).

This figure is useless without context. What was this number for previous generations? I suspect it's always super high because a huge demographic for Macs is college students buying their first laptop; obviously it's going to be their first Mac. Same with software devs: tons of Macs are used by software devs getting their work computer.

All you have to do is tell someone: 20 hours of battery life, and it doesn't sound like a rocket ship under load.

As a tech professional and coder, I can still say that barely 5% of my work is constrained by CPU. I'd trade all their hardware improvements for physical left- and right-click buttons instead of those awful trackpad gestures.

Have you used their trackpad, or are you looking in from the outside?

One finger left click, two finger right click is sooooo useful and easy to get used to, it's borderline natural.


I've been using Macs for dev for the last 12 years at least. It's mandatory if you need to support iOS or Safari users, due to the aforementioned vertical integration: building iOS apps requires an iOS device and a macOS device, and you still have to pay Apple a yearly fee. I had a one-year hiatus where I had a Dell with physical mouse buttons and a touchscreen, and it was marvelous.

Yeah, I don't know what he's talking about. The haptic feedback is so good, I'd never guess there wasn't a physical switch. I had to power it down just to be sure. Apple knows how to deliver a solid tactile experience.

I have never heard anyone complain about a MacBook trackpad before.

I have also never used a Windows laptop that comes anywhere near the level of perfection of a MacBook trackpad.


They just released a brand new computer, the Mac Studio. This is not a legacy product.

Global PC shipments down 4% in Q1. Mac shipments up 8%. The M1 Macs are the most exciting thing Apple’s done since the Watch.

https://www.counterpointresearch.com/global-pc-shipments-q1-...


People as talented as Ive still need a champion and a leader to balance their artistic sense against business constraints. Jobs was that champion. Without the counterbalance and support from Jobs, Ive became less effective.

That could just be a cycle thing. In an age where computers were at best boring, Ive and Jobs were what was needed to create the next great products. But maybe we hit a technological wall, and now we need strict hardware improvements.

We all complain about the thinness, but it's not really that, is it? It's the sacrifices they made to achieve it that piss people off, because we still need the things they threw away. If it all worked as imagined, we'd be all over the slightly thinner machines. In some ways, the M1 is going to enable the thinness again.

When we hit the next tech wall, you may need another Ive and Jobs to dream up things.


Ive is just the latest vehicle for the oft-retold "Apple is doomed" story. The media frequently portrays Apple as some kind of small company where genius individuals develop blockbuster products in isolation, rather than the hard-slog reality of their product development methodology.

I can definitely see that argument, but I feel the issue is that Ive lacked a product-focussed and customer-focussed CEO to rein in his "wilder" impulses (the Edition watches are another example of that).

Jobs may have managed Ive's drive in a way that would not have seemed bean-countery to him but customer-focussed.

In other words, Ive was only a liability insofar as he had no actual peers at the company to rein him in. And arguably, that's not really his fault.


I would have liked to see what Ive could do with the M1 and its thermals.

I think that your comment is insightful in that while Ive's time at the top of Apple product development was far from perfect, his success was undeniable.

Jony might not always have been right, but he was always wrong for the right reasons: principled in his vision for design and human interface. I think his worst successes (lol) were in software UI: removing information from the UI in the name of aesthetics, or the Touch Bar in the name of changing how we see keyboards.

His understanding of how to make a computer something you wouldn't be embarrassed to be seen with, but also something you can take with you everywhere (thin and light are good when the tech can match), has certainly made his devices more a part of our lives.

Jobs could match him on this stuff but I would imagine that it would go over a lot of business and engineer types' heads. Takes all sorts.


This is the problem with designing and re-designing where function follows form.

You have to keep changing and the only way to go, seemingly, is away from functional.


I agree. I’m really happy to be in the M1 era. Healthy companies have a good balance between engineering, design, and marketing. The M1 is the direct result of some really talented engineers, not the design team.

Ive is a brilliant designer, but Steve seemed to be able to rein in his worst instincts. From everything I have heard about Ive, he reminds me of a few designers I have worked with who were wonderful designers but struggled to understand the importance of practical constraints such as engineering, cost, user feedback, etc.

The Apple Watch is a great example. It is a wonderful device, but the whole launch seemed confused. Remember the 18k gold Edition watch? That was the least Steve thing Apple has ever done. Steve believed in well-designed and well-built products, but was almost allergic to excess.


That makes him a pretty poor designer, particularly in the industrial sense.

If you can’t balance those constraints you are not a good designer.


Ive reminds me of a pretty good designer elevated at the right place at the right time.

If the article is to be believed, Ive had already stepped away, at least mentally, by late 2014. Which does mean that the much-maligned 2016 MacBook Pro might not have been fully his fault. Tbh, that seems a lot more likely than the usual HN "Jobs kept Ive in line" argument. Ive had been leading design at Apple for a long time. It'd be a little odd if he didn't learn any sense of pragmatism from working with Jobs or from being the design leader.

Instead, it makes a lot more sense if the 2016 MBP was the result of the other designers being let loose with no supervision from Ive. Lacking leadership, they went all in on no ports and a gimmicky Touch Bar.


>Honestly to me the M1 era of Apple is the more exciting than things have been in years.

Not if you look at the style. It's another boring grey slab. Jony Ive's job was making everything fashionable.

While computer geeks tend to underappreciate fashion, it's incredibly important. Fashion is the main difference between a cringe Bluetooth headset and AirPods. Right now Apple's best offerings seem permanently stuck in 2010s fashion, and it's getting really tired-looking.

Frankly, I'd rather be seen sporting a tablet thin Samsung galaxy laptop with punchy colors, stylus, and 360 hinge than a Macbook now. I don't care if the M1 has 50 hours of battery life. It looks boring. I don't want to look boring. I don't want to look like I'm permanently stuck in 2013.


> It looks boring. I don't want to look boring. I don't want to look like I'm permanently stuck in 2013.

Your consumer gadget choices should not define you or your "look".

Nobody else cares what computer you use. It's fine.


For you they don’t, but for many if not most they do. To most consumers, a product's aesthetics are the only deciding factor between competing products.

The M1 arrived 13 years after the iPhone. A company the size of Apple not putting out new products in keeping with its size basically amounts to a lack of innovation. Some of my friends, who are Apple employees (and basically fanboys), can't stand to hear this, because they think that Apple is the greatest company on the planet (it may be, in terms of market cap, but that's debatable on all other fronts).

But it's pretty difficult to follow up the iPhone and iPad with something as revolutionary. There are people who think the watch is in the same category, but I doubt it. It does sound like Ive was burned out, which the article points out. Once you are burned out, you are really rationalizing, after the fact, a decision to leave that has already crystallized in your mind. The article does point out that Ive was burned out and could not manage the large group effectively because he was overwhelmed. In the end, the gods were cut down to size. It sounds entirely plausible that Ive did not produce much (aside from the campus, whose aesthetic and functional value is questionable) and made poor calls about the marketing of his first "independent" product, the watch (which was reoriented from fashion toward fitness by Cook), because he was burned out and bean-counters were running the show, sometimes rightly questioning extravagant designs in things such as the Apple campus.


Unpopular and potentially downvoted-to-oblivion opinion, but I have to say this: this kind of reverence for Ive makes me physically ill. He still has to be counted, alongside Apple, as one of the culprits of today's trend of irreparability (and the consequent planned obsolescence) in devices for the sake of 'minimalism' and Dieter Rams wannabe designs.

IMO, you're giving Ive too much credit for both shaping consumer preference and forcing competitors to follow his lead. For the most part, people like thin and light and care much less about upgradability and repair.

I agree but Ive was also a megaphone.

To what end? I don't think people care too much about diminishing returns. At that point it's a matter of which upgradability and repair tradeoffs to accept. Was there a recent line of products offering upgradability and repair that we can compare to?

Go back to about 2010 or so and MacBooks were reasonably repairable and upgradable by a fairly casual person. Batteries used to be routinely swappable in both phones and laptops.

The good ol' iPhone 3G days.

Can we stop this silliness? Computers do not become obsolete due to being difficult to repair.

They become obsolete because they become obsolete. Let's not just throw whatever we can find on the "right"-to-repair heap in an effort to desperately justify it. Justify it on its own terms. Or don't.


I have a 7 year old desktop that is not obsolete since I can keep replacing stuff in it and upgrading it.

When something breaks and you cannot repair it, it becomes obsolete in the most concrete sense: it's no longer usable for its intended purpose.

Maybe your intended message is just unclear to me, but your whole comment seems like pedantry without any substance.


None

No, but he's responsible for building a phone so thin it bends easily.

The Samsung S5 wasn't compact, fast, or energy efficient compared to iPhones at the time?

What were Ive’s most notable accomplishments after Jobs died? The author suggests he would have produced amazing things if he had retained more power, but it’s not clear to me why we should think this.

Apple Park and Apple Watch. I worked on Apple Park and was in meetings with Ive. He was amazing.

Thanks. Apple Park of course wasn’t a consumer product, and Apple Watch has only been modestly successful (and isn’t particularly beautiful or useful, in my opinion). I read half the article and didn’t see anything about the design of the Apple Watch being compromised. It mostly discussed how Apple did not follow Ive’s preferred marketing approach. Of course, he could’ve been right that it would’ve been more popular if they had marketed it as he recommended.

What did you find impressive about him?


> Apple Watch has only been modestly successful

Oh come on, no need to be an edge-lord with this sort of comment. They’re utterly ubiquitous and the next closest competitor is probably Garmin. I never see Android watches outside of the Verizon store.


[...] and Apple Watch has only been modestly successful [...]

It is literally the most popular and most profitable watch (not just smart watch) in the whole world.


Apple's Wearables business is a ~$8 billion per quarter business. That includes AirPods and a few other products, but Apple Watch is obviously a major part of it. To me, $8 billion per quarter is far more than "modestly successful". Of course, comparing anything to the iPhone's success will make it look modest.

I think the wearables revenue is overwhelmingly AirPods? I can’t find clear data easily, but the first hit on Google is $23B in revenue per year for just AirPods. It also looks like the number of Apple Watches sold is a tenth of the number of iPhones sold.

Like, the biggest contribution of Ive, Apple’s chief designer for over a decade, was a product that augments 10% of phones (and of course costs much less than a phone). If you were hiring a design czar, wouldn’t you expect more?
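
Back-of-envelope, taking both figures at face value (they may not cover the same fiscal year, so treat these as rough guesses rather than hard numbers):

    Wearables etc. revenue:   ~$8B/quarter ≈ $32B/year
    AirPods (reported):       ~$23B/year
    Watch + everything else:  ~$32B - $23B ≈ $9B/year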


Not only has Apple Watch been hugely successful by any reasonable standard, it has wildly exceeded any standards of success that were even considered possible at the time of its introduction. And this is obvious. It has basically obsoleted the entire Swiss watch industry and has replaced it as a status symbol except for itinerant rich watch nerds who can afford Patek Philippe and Rolex collections. It has also created an entire new category of watches as a fitness and lifestyle device with capabilities that simply did not exist before.

And of course in the smartwatch market, Apple Watch created that market as it exists today and dominates said market. (Yes, I am aware that a tiny, nascent version of this market existed before. That's now irrelevant.)


What measure would have afforded Apple Watch a better rating than “modestly successful”?

Anyone who has worked in Apple Park (as opposed to on it) can comfortably state that the place is terrible and ridiculous.

There wasn’t enough room for the people Apple already employed at the time it was finished, much less now.

The collaborative spaces that eat up huge square footage are never used, because (1) that’s not how most people actually work, and (2) they’re right outside people’s offices, and conversations in those spaces are hugely disruptive.

If you’re fortunate enough to be afforded a private office, it’s a glass fishbowl. You constantly feel the need to watch over your shoulder.

Most people are in large shared desk spaces. They’re noisy, distracting, and frustrating to be in, and they’re also a fishbowl.

Sure, Apple Park has tons of high-end mass market designer furniture. Looking like a DWR showroom doesn’t make it a nice place to work.

I genuinely believe Apple Park has had a measurable negative impact on the quality and value of the work being produced by people there.


Does this also play a role in many Apple employees wanting to continue remote work?

Take Apple Park off your list. Steve Jobs went to the city council to present it

Downvote all you want: https://www.youtube.com/watch?v=gtuz5OmOh_M The main designs were already done.

Apple Watch was an incredible achievement of hardware design given that it is essentially unchanged since launch.

The shape has been slightly tweaked and there are some new bands and sensors. But they got the basics of the design right the first time. It’s easy to overlook how rare and difficult that is.



Ive may have done great work under Steve Jobs, but his work since Jobs' passing has been disastrous.

Let's consider Jony's performance on software design first. This is what some prominent people have said about iOS 7: The Verge wrote in their review: "iOS 7 isn't harder to use, just less obvious. That's a momentous change: iOS used to be so obvious." Michael Heilemann, Interface Director at Squarespace wrote, "when I look at [iOS 7 beta] I see anti-patterns and basic mistakes that should have been caught on the whiteboard before anyone even began thinking about coding it." And famed blogger John Gruber said this about iOS 7: "my guess is that [Steve Jobs] would not have supported this direction."

And what about Jony's other responsibility, industrial design? The iPod, iPhone, iPad, MacBook Air and other Apple products from Jobs era are all amazingly well designed and breathtakingly beautiful. But these products weren't designed by Jony Ive all by himself. He designed them under Steve Jobs's guidance and direction. Steve was the tastemaker. Apple's post-Steve products are nowhere near as well-designed.

Consider iPhone 5c, for example. The colors were horrid, and when you added those Crocs-like cases it looked more like a Fisher-Price toy than like a device an executive would want to be seen holding. That the 5c didn't do well in the market shouldn't surprise anyone.

As an Apple shareholder and customer I am glad Ive is gone.


Personally, I found the 5c colors nice. I don't see much of a difference from those huge (ugly?) cases that people wrap around their phones today. I don't know many people who use naked phones today, and my feeling is that phones today are no longer designed to be case-less: the camera bump, razor-sharp edges, too-thin bodies.

The 5c flopped for other reasons: it was artificially made worse than the 5s. It started with 8GB of storage, which was already way too little back then; it had no Touch ID; it had a bad camera; and so much more - and yet the price was high. Apple learned from the mistakes and changed their segmentation strategy, and it has worked since then (with the exception of the weird XR thing).


I had an iPhone XR from work. It flopped…but it was a good phone, and it was not necessarily an obviously bad idea, in my opinion.

People like big phones. The idea of a big iPhone that has contemporary guts but lacks some of the really wild camera abilities of the Max models didn’t inherently seem like the wrong plan to me. But it just kinda went nowhere, probably because you can just go buy last year’s iPhone model.


It flopped because Apple did exactly that: "here is our standard model... and by the way, here is a cheaper, crappier model".

Now the approach is a full-price standard model - and a super duper "pro" model for the extra successful, professional, and rich people.

This approach makes more sense to consumers, and it's why the 12 and 13 sell the way they do.


Calling this "disastrous" goes beyond hyperbole into a realm of parody, so I can only assume you're kidding.

A lot of the iOS 7 changes were positive, and those which were missteps were far from "disastrous" and have already been reversed.

If you have to go all the way back to a fringe product like the 5c, that kinda proves the opposite of your point, doesn't it? The 5c is the exception that proves the rule.

Nobody should be glad Ive is gone. It was a tremendous loss. For every decision you might disagree with, there were a hundred others that were spot-on.


I’m also very glad that Ive is gone and I think iOS 7 was a disaster that gave up an enormous UI/UX lead.

His value existed solely when his extremes were moderated by someone like Steve Jobs.

When handed the reins, Ive ran roughshod over what made Apple — and his own work — worthwhile. He prioritized form over function, and ego over empathy.


The flat design introduced with iOS 7 absolutely has been disastrous for the obviousness of the UI.

> But these products weren't designed by Jony Ive all by himself.

This goes for iOS 7 and the iPhone 5c as well, no?


I think the OP is saying that they’re basically a refresh of a design that was conceived while Steve Jobs was alive.

E.g. the clean aluminum case, the black keys, etc. It looks fundamentally the same.

Contrast that with the butterfly keyboard, which was new in the post-Jobs world. And universally a disaster.


Not “universally”. I have a stock of 2016-18 era laptops because they are the only thing I can comfortably type on for extended periods. I prefer them to every laptop that came before and every one that has come since.

I entirely agree that Ive's design brilliance required Jobs's meticulous attention to detail and taste to be really successful. He did make some questionable choices after Jobs's passing.

And the article doesn't make much sense to me. While I don't agree with everything Cook does, recent Apple computers are traditional Apple at its best — extremely well designed, minimalist, functional, with best-in-class performance, displays, and portability.


>Consider iPhone 5c, for example. The colors were horrid, and when you added those Crocs-like cases it looked more like a Fisher-Price toy than like a device an executive would want to be seen holding.

The iPhone 5c was not a device for executives.


Exactly. It was the device well-heeled parents might get for their teens, or twenty-somethings might buy for themselves.

I, for one, very much prefer the aesthetics introduced in iOS 7. The first version was rough and sometimes inconsistent, but that can be expected when the changes are so radical in such a big project.

Why do we only ever read stories about Ive, and not nearly as often about the people who made software design decisions, who work on the OS or security, or who made strategic decisions such as Apple's walled garden?

The only technocrat who triumphed seems to be Ive.


Because Ive is a public figure and he was showcased during the SJ era. I am not trying to diminish Ive's contributions, but the software guys will never get the mainstream love that the design guys do.

If stories about the software guys were going to be written, it would be by the tech media, which is heavily dropping the ball: they would rather have some product to unbox, try it for a day, and give it some useless score.


You must mean software vs. hardware, not software vs. design.

Software is designed. Hardware is designed. Design is not how something looks on the outside. Design is how something works.


I agree with you, but it's worth noting that shortly after Jobs' death, Scott Forstall was kicked out, and Jony Ive took over software UI design as well. The result was iOS 7 and its flat UI.

The bottom line is Apple products post-Ive have improved dramatically in terms of practicality, efficiency, and even design. Without Jobs, the guy should be without jobs.

HDMI ports and SD card readers ftw

I love not carrying around a dongle for my new MacBook Pro. I can’t believe they removed those in the first place. It’s a Pro model.

I'd really like to have a list of engineers I'd need to wine-and-dine to add two USB-A ports for the good of the people.

If that’s true, shouldn’t one of the underlying concerns - Apple’s shift in priority from hardware to service products - be even more disappointing? How many innovative and user friendly gadgets are we being deprived of so the company can focus on US-only credit card and content streaming services?

What a testament to ego.

Rather than the Apple Watch being a Vogue-celebrated product for the 1%, it’s an attractive and high-quality product for many people. I see it on the wrists of fashion icons and on the wrists of people working at my local grocery store. Billionaires and ordinary people can have basically the same phone, watch, and AirPods. That is the true genius of Cook: reviving the “computer for the rest of us”.

I do think Apple’s design has become a little stale, but if there has to be a choice of one over the other, they are currently picking the right one. Let’s not forget that the vaunted focus on design has given us a mouse with the charger on the bottom, as well as innumerable other botches over the years. I am glad Apple was willing to push the envelope, but some of that stubbornness has worked against them.


Stale is not a bad thing, design for design's sake is.

Apple had some pretty nice stuff before Ive, I see no reason it won't have amazing stuff in the future without him.


And the new MacBook Pros seem to indicate they’re steering out of that “thin and sleek above all else” whirlpool.

I wonder if the push for thinness helped drive Apple to produce their own CPUs?

Despite complaints about too thin iPhones, I think they've been getting thicker again since around iPhone 6.

I think this highlights how not-black-and-white this is. Some of the focus on design is clearly what made Apple so popular, and tech people can be dismissive of that.

The magic mouse is so much better than any other mouse I've used (because the scrolling is so much better) and the charge port being on the bottom has not been an issue a single time over multiple years.


I’m curious: how is the bottom charge port not an issue for you? What happens if your mouse runs out of battery while you’re working?

For me, in 6+ years of daily use this has only happened twice or so that I can recall. I do check the level from time to time so I can be sure to charge it overnight. But even if you forget, the trick is that the battery charges surprisingly quickly from 0 to "enough to get through the rest of the day". A quick coffee break: 10, maybe 15 minutes. So I do keyboard-only stuff for a bit and/or step away for a breather. It's been a non-issue in my personal experience.

> I do check the level from time to time so I can be sure to charge it overnight.

So that's the rub, isn't it? Consumer mice from decades ago did NOT require you to check battery levels.

There's no good reason for a charging/wired port on the bottom of the mouse. It may only be a problem rarely, but it could just not be a problem AT ALL.


You get a low-battery notice when there are only a few hours of charge left. For me, there's always been enough time to finish up my work and let it charge overnight.

The only time it was ever an issue when I used one: I plugged it in, went to the bathroom, and got a coffee. That charged it enough to use until I could charge it for hours later.

A mouse microprocessor spends 99.9%+ of its time asleep. A few minutes of charge will get you hours of use.
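
A rough sanity check (these numbers are illustrative guesses, not Apple specs): if a full charge takes on the order of 2 hours and lasts roughly a month of daily use, then:

    10 min charge ≈ 10/120 ≈ 8% of a full charge
    8% of ~30 days ≈ 2-3 days of use

So even a coffee-break top-up comfortably covers the rest of the day.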


That's what I was paranoid about with wireless mice: what if the battery runs out while I'm in the middle of a competitive, time-sensitive game? With my Logitech mouse, I can plug it in from the front, so it's a minor inconvenience.

Of course, just thinking about using the Magic Mouse for FPS games hurts my wrists, so it in particular was never a real consideration anyhow :P


I get a battery notification when the power is running low. That means I usually have a couple of days of usage before it's completely out. So then I charge it while I go to lunch or overnight.

I never remember to plug it in after I’ve finished my work and am ready to leave for lunch or for the day.

Short of putting a post-it note on my screen, I don’t think I ever will. My brain doesn’t work that way, and the point of technology is to support me, not for me to support technology.

Requiring users to perform their own async mental scheduling of a trivially forgettable pending task like “plug in the mouse” is bad design.

Perhaps some people have an innate ability to perform that kind of task scheduling without it being a significant cognitive load, but many others do not.


I think you must be exaggerating a little bit, because if true then it doesn't really matter if the charging port is on the bottom. You need a corded mouse. Apple doesn't make any, but I've yet to see any 3rd party USB mouse that a Mac won't work fine with. I've got a Logitech connected to a USB-C dongle for the infrequent times that I want a mouse on my MBP. Mouse+Dongle is also cheaper than a Magic Mouse.

I’m not exaggerating at all. I can try to drill “remember to plug in the mouse” into my head, but it doesn’t matter. I’ll forget.

The previous magic mouse worked fine; if I ran out of juice, I just swapped the batteries immediately and kept going.

If the current Magic Mouse supported charging while in use, I’d just do that. Problem solved.

The turtle-mode charging is a ridiculous design constraint for those of us for whom “remember this trivial and stupid task to be performed at some arbitrary later time” does not come at all naturally.

My (ridiculous) solution is two magic mice. When one dies, I swap it for the other. No cognitive load, no breaking flow — but it’s silly to have to keep a spare $99 mouse around to solve this problem.

My employer has plenty of spare magic mice floating around, or I’d probably just buy myself a Microsoft mouse that uses AA batteries.


>I just swapped the batteries immediately

So even if I'm at home where I (almost) always have batteries, this still involves going downstairs, digging a couple batteries out, and swapping them. (In an office I probably wouldn't have batteries handy.)

I won't defend a rechargeable mouse you can't use while plugged in; the Logitech mouse I generally prefer lets me do this. But "just swap the batteries" isn't clearly better to me than "can't use the mouse while it's charging". And with a laptop I'm actually fine with using the built-in trackpad 90% of the time.


I always had batteries at my desk. Either way, an instant fix is preferable to “remember to do this later, and if you forget, your mouse dies at a most inconvenient time”.

I don't think it's the design, it's the usability. I might argue that it has always been more about usability than design.

Sure, the iconic designs of the G4 Cube, iMac, metal PowerBooks, and later MacBooks were what drew people in, but the usability is what kept many people with Apple.

Is the better scrolling a design or a tech thing? Probably tech, but I'm sure it's only that good because someone really understood how people use it and what it really needs to do well and then insisted that the tech worked flawlessly.


If I had to start from first principles and I had to either be the tech person who figured everything out or the person who insisted it work flawlessly, I'm pretty sure I know which would be the easier job.

Usability is a sub-discipline of user experience, which is a sub-discipline of... wait for it.. design.

Design, from designare - to mark out, to prepare the plans for, especially to plan the form and structure of

Usability is just one area of emphasis for design. Apple simply included too few elements of usability in their efforts.


Definitely. It's also mostly Steve Jobs'/Apple's definition. But in the context of discussing Apple, many often solely talk about the shiny exteriors and looks of Apple products while neglecting usability.

We may disagree, but usability used to be, and maybe is again, the main selling (or at least "staying") point. I think this was lost a bit during Apple's "high design" years before Ive's departure. It was often form over function.


For me, the charge port at the bottom is a clear case of design being more important than functionality. Of course, if you're the kind of person that always remembers to charge their mouse before the battery runs out, it's not an issue for you, but for the "rest of us", having the option to charge it and use it at the same time would sure be handy...

This topic comes up again and again, and what I’ve noticed is that folks who use the mouse often find that it lasts for months and months without recharging. Folks that like the mouse (myself included) agree that the frequency of charging is very low, while folks that don’t like the mouse will say that it’s unacceptable.

I use the wireless Apple keyboard and trackpad, which both have similar battery lives, but when the battery runs out it always somehow happens to be right before a meeting where I really need to plug it in and use it right now

(and these days when everything else is USB-C, instead I'm scrambling to find where my damn lightning cable has gone)

Although I get why they would not put the port on the back - their Lightning cables are not built for the repeated motion/strain of a plugged-in mouse. Anything more strain than plugging your phone in and setting it down causes the cables to break down in months.


The cables don’t last because of pretty but ineffective strain relief. Another failure squarely imputable to Jony Ive.

I can only recommend BetterTouchTool for adding more gestures to the mouse - and it comes with a nice feature: it reminds me when the charge of the mouse goes below 25%. This gives me several days to recharge it before it actually gets low.

> I use the wireless Apple keyboard and trackpad, which both have similar battery lives,

?

The keyboard has 3-5x the battery life of the trackpad. Not sure how they can be similar to the mouse if they’re not even similar to each other.


Similar as in you're not recharging it on a daily or weekly basis, but only very rarely. A 3 or 6 month battery life doesn't make a big difference on that scale.

Some of us who use the mouse want to keep it plugged in all the time, because the extra cable on the desk is less important to us than (a) having it not occasionally glitch due to Bluetooth and/or (b) not having the extra burden of charging it. I guess scheduling the charging is minor if you don’t have ADHD, but it’s still one more stupid thing to take care of that really isn’t necessary.

I have ADHD, but it presents itself differently. For me, a more minimal desk free of the sight of cables (all the time) and free of annoyance of the tension of a cable or sound of a cable grazing against the desk (all the time) trumps plugging in a mouse twice a year when I stop working for the day. But I’m the type that can’t focus when I see a battery icon is low, and so I’ll just take care of it when I notice.

I, too, think that an extra battery isn’t worth it, but the fact that some people want a mouse that is always plugged in doesn’t mean a wireless mouse that can’t always be plugged in is badly designed.

It almost feels like the silly charging port was placed there on purpose. It tells users that you're not meant to use it plugged in, stop worrying about the battery, it'll be fine. And conversely it forces the engineers to build something that doesn't need to be charged constantly.

...I still prefer my Logitech mouse that works plugged in though.


In my experience it needed charging much more frequently than that, and often at completely unpredictable and inconvenient times. I resorted to keeping a spare plugged in at all times when I worked in an office.

People who complain about the charge port on the bottom don't use the mice.

It's a stupid location for a charge port but the battery lasts for several weeks if not months; it's just not that much of an issue in real world use.


> but the battery lasts for several weeks if not months

In my experience, this is exactly why it's an issue.

All my less tech savvy family who use this mouse forget to charge it because "it never needs charging" ... and then it runs out of juice whilst they are in the midst of something and now they can't use the mouse.


But macOS warns you that the batteries are low several times before they run out.

macOS literally nags for days on end - easily an entire week - when the battery is running low before it actually runs out. It starts notifying when the battery level is 10%.

I have no idea how this is a problem for anyone.


The battery degrades.

My Magic Mouse's battery only lasts 5 to 10 minutes... if I could use it while plugged in, I could still use it. As it is, I can just toss it.


How long have you had it?

I don't know. I have several Magic mice, and I switched them between various computers often enough that I don't know which computer the broken one came with. The oldest one is probably from 2016.

I have several Magic Mice, including one that still runs on replaceable batteries. I've never had any of the batteries go bad - my oldest one is at least 5 years old and still goes like a month on a charge.

>People who complain about the charge port on the bottom don't use the mice.

People who don't like pink bags don't buy pink bags.


With the example of the charger on the bottom, where does the breakdown come from?

Was this never even considered an issue? As in, did it just never get flagged as something bad? Did it never get focus-grouped with a wide enough range of users for someone to question it? Or was it noticed, questioned, and decided that it works as intended?

It just seems like, for a company the size of Apple, where everyone works under draconian NDAs, why not have an internal employee focus group? Hubris of the design silos?


Have you used it and actually run into the problem? If you remember just one random time a month to plug it in when you go to lunch, it's effectively a non-issue. Consider that it was well thought out.

The few times I've been asked to help my neighbors on the computer, it has been a non-starter because the mouse needed charging and we were unable to do anything. :(

Yeah, but how long did charging actually take? 5 minutes?

? I don't actually know. It's an elderly neighbor, and we couldn't find the cable at the house while I was there.

This sounds like a comedy of errors.

You're not wrong. :)

Wouldn't know, as an issue like this makes it a non-starter: I'd never buy one.

Much more likely is that they studied the situation, understood the problem, and concluded that the trade off was worth it.

We may disagree—I think that surely there must have been some way of solving the problem—but it seems very unlikely to me that they missed it.

They just exercised judgement that I think was suboptimal.

Of course, they are selling millions of products and making billions of dollars doing it, so not sure I can claim I’m unambiguously right.


What’s so wild to me about this is that from the Magic Mouse design it’s 100% evident there was meant to be a flush charging-dock sister product that got nixed, just like the recent wireless charging pad — there’s an intention and continuity one can follow, but it has lately been cut off at the knees for operational efficiency.

Because pedantry is king at HN. Design was not the most important aspect of the Magic Mouse, _aesthetics_ was. A key difference in this conversation.

Aesthetically the mouse is great, very beautiful, especially for the time it came out.

The design of the mouse is amazing in several ways, but very lacking in others, specifically ergonomics and charging. In fact, had they made the mouse larger in all dimensions, they could have killed two birds with one stone. The mouse is too small for most people to use comfortably; making it larger would fit the human hand better and reduce the amount of contortion your hand has to do to use it. And with the extra height they could have put a charging port on the front! A win-win on the design side of things!

There are several products you can buy that help with this; I have both wings and a palm bump added to my Magic Mouse, which makes it much more comfortable. I once tried to take one apart to explore making a new case that addresses some of these concerns, but I ended up breaking it.

I think that final bit really is the crux: the scrolling and gesture support is so good that I'm sufficiently motivated to try to solve the problems with the mouse. A beautifully flawed product.


> Design was not the most important aspect of the magic mouse, _aesthetics_ was. A key difference in this conversation.

On point. People too often confuse aesthetics with design.


Thanks for pointing that out! I'm not a native speaker, and around here (Germany) the loan word "design" refers only to designing the aesthetic side of things. Again what learned (https://www.nicolabartlett.de/again-what-learned/)...

And thanks for the article. Funny that it ends with the very recent, diplomatically infamous "liver sausage" :)

That’s astute. Also the exact reasons I use the magic trackpad instead of the magic mouse.

> Because pedantry is king at hn. Design was not the most important aspect of the magic mouse, _aesthetics_ was. A key difference in this conversation.

Be as pedantic as you want, but aesthetics is an aspect of design, not exclusive to it.


The comment doesn't assert that aesthetics is separate from design.

If I say "the world was not the most important aspect of the policy, _USA_ was", then that does not imply that USA is not part of the world.


I saw an article where someone rigged the Magic Mouse so that the cord was not on the bottom. They found that the mouse could not be used while charging anyway. So putting the charge port on the bottom may have been driven by larger design decisions that made use-while-charging impossible.

https://9to5mac.com/2022/03/25/unnecessary-inventions-magic-...


> having the option to charge it and use it at the same time would sure be handy

That’s the theory. In practice, charging it whilst getting a coffee gives it enough battery life to finish the day. Sure, you need to remember to charge it then. But you never have to choose between using it and charging it for more than a couple of minutes in real life.


How do you “forget”? You get plenty of warning notifications.

Its battery lasts for several weeks, so I never notice its battery getting close to zero. And unluckily it has hit bottom several times at the beginning of a work day. It wouldn't have hurt if they had made the thing usable while plugged in.

And the screen doesn't pop up a toast? It's an awkward fix but straightforward.

It does, but at the most inconvenient time always :D

I prefer the wheel on the Logitech MX Master 3.

Me too. It would be even better if we could disable mouse acceleration on macOS; it's terrible and drives me nuts.

Since someone brought up trackballs: the scroll wheel on CST trackballs is, in the literal sense of the word, awesomely smoother than any other I've felt.

Apparently there's a Terminal command that can do this: https://www.isiko.de/how-to-disable/
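If it's the command I'm thinking of (untested on recent macOS versions, so treat this as an assumption rather than gospel), it goes something like:

    # disable macOS mouse acceleration; log out and back in for it to take effect
    defaults write .GlobalPreferences com.apple.mouse.scaling -1

Deleting the key or setting it back to a positive value should restore the default behavior.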

One little trick I use is to turn the tracking speed all the way down in System Preferences, and then set it as high as I need in the Logitech settings. This makes the acceleration pretty unnoticeable for me. Also could use something like this: https://github.com/ther0n/UnnaturalScrollWheels

The MX Master 3 has built-in mouse acceleration; even if you set acceleration to zero in software, it's still there in the hardware.

The Magic Mouse doesn't work at all for me because I want to be able to right-click while the left button is down, and vice versa, for some special operations in some applications. The Magic Mouse only lets you click one button at a time.

I'm not a fan of the Magic Mouse because the ergonomics aren't good for your hand over extended use. I wish Apple would make a "magic" trackball.

You could use the trackpad. The appropriate mouse size is different for different people (like headphones) because it depends on your hand shape.

> because the scrolling is so much better

Unless you use any application that requires more or less precise scrolling. Like any game, or any map in the browser. The experience of using Google Maps / Apple Maps / OpenStreetMap with the Magic Mouse is unbelievably shitty. Almost every time I want to click, it interprets it as a scroll and zooms in or out. I eventually replaced the Magic Mouse with a regular mouse when I use an Apple computer, and it is much, much better.


The free and horizontal scrolling benefits can also be found on much better mice. Logitech has some nice ones that allow both ratcheted and free scrolling with the click of a button.

>reviving the “computer for the rest of us”

That's just the spirit of Apple and the place where Steve Jobs wanted it to be.

The genius of Tim Cook lies in doing that while keeping 30-40% margins on the products they sell.



Cook scaled up the iPhone (and its broad ecosystem, which includes the iPad, iWatch, iServices), maximizing its global potential.

I don't know if it qualifies as genius, but it's plainly obvious that Cook is an exceptionally competent manager and logistician. He has also been a steady hand on an increasingly sprawling behemoth that easily could have gotten out of hand (~$400 billion in operating profit every five years - in the wrong management hands, that's a disaster waiting to happen). He's successfully operating a mass consumer product company with the profit of Saudi Aramco, a once unthinkable outcome (Aramco used to tower over the profit of nearly every other corporation globally). Ten years ago Microsoft was one of the few global profit giants, with ~$20B in annual profit; Apple is near six times that scale now. Cook has done a relatively good job of keeping a leash on Apple's financial behavior.


This. I had doubts about Cook until the new M1 Macs came out and I realised he’s doing just fine. Cook is no Steve Jobs; I think of him more as a Bill Gates figure - relentless at execution. But Cook has made a better and more ethical company than Gates ever did.

It’ll all change when the AR revolution kicks in, and it will all depend on how Apple deals with it. If they succeed there again, then Cook will be one of the best CEOs in the business. He’s got his own way, but underestimating him after the M1 chip and the successful transition seems foolish to me.


The AR revolution has been promised since the 80s.

And?

I think this is a good analysis. He has an incredible set of talents, they just aren't the same as Steve's.

> >reviving the “computer for the rest of us”

> That's just the spirit of Apple and the place where Steve Jobs wanted it to be.

Eh, nothing points in the direction of Steve Jobs wanting Apple to be the computer for the rest of us. Apple was always marketed as expensive and exclusive. Which is ironic, because Apple devices are clearly mass products, with designs that have little variation across a quite limited product line, and almost zero variation between the Apple products people own. Somehow Apple made people believe it is hip to own the same thing as everybody else, and to feel privileged about it, which defies all logic.


Steve Jobs literally said that several times. But he wasn’t referring to cost; he was referring to ease of use. Especially in the 80s, the difference between graphical and text interfaces was a massive gulf. 90% of people simply could not be productive with a text-only interface, even the good ones they had on the Apple IIs. The GUI changed everything. That’s what Jobs meant by a computer for the rest of us.

I can only quote Steve in response to this silliness:

”Most people make the mistake of thinking design is what it looks like. People think it's this veneer — that the designers are handed this box and told, 'Make it look good!' That's not what we think design is. It's not just what it looks like and feels like. Design is how it works.”

Citing the mouse thing is a prime example of this silliness. First, it ignores that it doesn't matter whether the charging port is on the bottom, because that mouse cannot operate when it's plugged in. Second, it ignores that you can charge the thing in a few minutes, so, again, it doesn't matter that the port is on the bottom.

Only people who don't actually use an Apple Mouse think that this matters.


But to be fair most wireless mice _do_ work just fine when plugged in, so the failure to make a mouse that for some reason can’t work while plugged in is also a part of their design.

Apple’s own wireless trackpad and wireless keyboards work while being plugged in.

I think the sleek visual and ergonomic design specifically resulted in the charge port being put on the bottom, which resulted in their decision to disable the mouse when it’s charging. Not the other way around.


Many people I know leave their trackpad and wireless keyboards plugged into the Lightning/USB ports all the time. This is basically fine for a keyboard or trackpad, since very little of the functionality of those devices benefits from mobility.

By contrast, a wireless mouse is fundamentally a different animal than a wired one. If (say) 90% of your users are going to leave it plugged in 100% of the time (to offset against the ~3 hours per year when they'll actually need to charge it), then a huge percentage of your users will experience an inferior product. It's like buying a mobile phone and then encouraging users to leave it plugged in by their bedside all day.


I’m not sure I understand this argument. Why can’t users simply use the mouse wirelessly even if it’s possible to use in a wired way. Are you claiming that people would end up leaving the mouse plugged in, like they do with the trackpad? If the point is that using a wireless mouse is a better experience, then why wouldn’t people use it wirelessly except to charge?

I have an Apple keyboard and trackpad that I use at my TV wirelessly but have benefited from the ability to use while charging a few times when the battery died. Doesn’t stop me from using it wirelessly 99% of the time because the experience is better.

By the way, I have an older phone I leave plugged in all the time by my bed, the battery life on it has deteriorated but it works perfectly for music.


> why wouldn’t people use it wirelessly except to charge?

For some users, the point of a wireless mouse is to be a portable mouse. You can take it on the go with your laptop without worrying about cables. When they are at home, they don't care about cable drag, and they want the peace of mind that the mouse will always have full battery when they need to go. They leave the mouse always plugged in when they can, so they can use the mouse unplugged when they need to.

I think most modern mice have long enough battery life now that more people can have that peace of mind without constantly docking their mouse. If the battery lasts weeks, it feels pretty secure.


By the same token, you could make it impossible to use your phone while it's charging, but that's rightly understood to be a terrible idea that's not even considered. Rechargeable game controllers pretty much all allow you to plug them in if you want while you use them, and in some cases to just drop the battery entirely and use them as wired. I recognize that neither of these are perfect analogs, mostly because both of these need to be charged much more often.

I can't really decide if the Magic Mouse thing was primarily an aesthetic decision to have no visible ports, or a decision that the user couldn't be trusted to use it "correctly" if it allowed itself to be plugged in normally, or an equal mixture of both. Either reinforces some typical Apple stereotypes which is why it keeps getting brought up even though it needs to be charged relatively rarely.


No one would assume that a mobile phone needs to be plugged in to work. With mice it is not as clear-cut. And looking at my Magic Mouse, I wouldn't know where they would put a charge port without making its wireless use worse. So I prefer it as it is. It would be nice, though, if wireless charging were possible.

Yes, but to make one that works when plugged in takes some extra considerations and might cost a bit more

For example, having the port in front would change the overall design of the magic mouse, would mean that it would be thicker (at least in the front) and might also fight for space with the click switch

But yeah, given it charges to usable in a couple of minutes, it's a non-issue


In practice, there is no problem to be solved here. I have used a Magic Mouse for at least five years. It has never run out of juice while I’m using it, and I simply plug it in at lunch or overnight or on a Zoom call for a bit when I start getting notifications about the battery. It runs for three weeks of daily use without need for a charge.

Five years doesn't seem like a very long time. My intellimouse is 20 years old. The battery will never wear out because there is no battery.

I know plenty of Apple-ecosystem-centric people who used an Apple mouse and hated that design. So much so that they gave up and eventually got another one. It’s clearly a bad design, because there is an obvious alternative with no downsides.

A better example though might be the mess that is the design, or rather lack thereof, of the notification system and haphazard gesture meanings in iOS. I use both a Pixel Android and an iPhone, but mainly the iPhone these days and it’s clear that Apple don’t get everything right.


So your argument against the bottom charger being bad design is that the mouse is also designed so it can’t charge and operate at the same time (contrary to the keyboard, for example)? That’s kind of circular :)

>Design is how it works.

But having the charge port somewhere other than the bottom would not negatively affect how it works. In fact, the port on the bottom goes against the "how it works" philosophy.


I own a Magic Mouse 2. Honestly, the charge port is the least of its problems. Apple does not like to put a second switch in their mice, so right-clicking is done entirely in software and touch. The problem with this approach is that it's not possible to click both buttons at the same time; it also makes the right-click harder to trigger, because there's only one switch and it's on the left side. Plus it's just uncomfortable, and it has these plastic rails that make horizontal movement harder on some surfaces. It's just an all-around awful mouse.

Apple has frequently ignored how it works in favor of looks, and they still do. For the longest time they would stick all the USB ports on the left-hand side of the laptop, very close together, such that a larger USB stick or a 3G dongle or whatever would block your only other port. They do things like awkwardly hiding the power button on the back of iMacs, no front-facing or side-mounted ports, etc.


The Magic Mouse is by far my favorite mouse, and I prefer it to the fancy Logitech MX I used before. No other mouse gets scrolling as right as the Magic Mouse does. Why would you want to be able to have a simultaneous left and right click? Be sure to check out BetterTouchTool to set up more gestures for the mouse, like a 3- and a 4-finger click.

> Why would you want to be able to have a simultaneous left and right click?

Video games.


Chording mouse clicks was common on machines at PARC and on Lisp machines, long before the Macintosh was even designed.

You can do that with the Magic Mouse: you can have separate settings for 1-4 finger clicks, and equally for swipes with 1-4 fingers.

What do you like about the scrolling? IMO it's not better than a free-scrolling wheel, and you don't get ratcheted scrolling, which is handy for scrolling through dropdown boxes or anywhere you need to scroll an integer number of times.

You need both mouse buttons at the same time to play pretty much any modern FPS and many other types of video games. When a dollar-store mouse can do it, I kind of expect an $80 one to. Ratcheted scrolling also helps there. There are probably other applications that need it.

I shouldn't have to spend money on a third party app to fix an $80 mouse.


Because it scrolls in both the y and the x direction. A lot of software supports more than one scroll direction, and the Magic Mouse makes that a very natural swipe across the surface.

With BTT, I can define all kinds of gestures on the mouse (some of this can be defined in the macOS settings, but BTT gives even more freedom): 1-4 finger clicks, 1-4 finger swipes, you name it. No other mouse offers this.

For gamers, I probably wouldn't recommend the magic mouse, though I am fine with it for StarCraft. But for all other tasks, I just love it.


> Only people who don't actually use an Apple Mouse think that this matters.

I actually would use an Apple mouse and I think this matters. I don’t use one because of this issue. See my response above; tl;dr: I don’t ever want to leave the mouse unplugged.

> First, it ignores that it doesn't matter whether the charging port is on the bottom, because that mouse cannot operate when it's plugged in.

Yes, and this is a fundamental design flaw and it should be fixed.

> Second, it ignores that you can charge the thing in a few minutes, so, again, it doesn't matter that the port is on the bottom

Interrupting my flow for a few minutes is really unacceptable.


> Only people who don't actually use an Apple Mouse think that this matters.

You got that backward. The people who are annoyed by these shortcomings don't use an Apple Mouse.


> That's not what we think design is. It's not just what it looks like and feels like. Design is how it works.

Which is exactly why many Apple fans had become so frustrated with the bizarre "form over function" design of the waning Ive years. Even if the butterfly keyboard hadn't been vulnerable to a tiny speck of dust rendering an entire computer unusable, the "tapping on glass" feel of it was the prime example of "usefulness be damned" at the altar of Ive's bizarre obsession with thinness. If anything, the most recent MacBooks, where they essentially undid all the bad decisions of the previous 3-4 years, were an admission that they had chased weird design obsessions at the cost of "it just works".

The other commenters have already pointed out the absurdity of a mouse that can't be used while it is charging.


> Only people who don't actually use an Apple Mouse think that this matters.

I use one (several, actually — that’s what work provides) and I absolutely think this matters.

I forget to charge it. I don’t check the battery level. I don’t remember to plug it in at the end of the day.

I also shouldn’t have to. That’s an annoying, unnecessary cognitive load.

If it dies in the middle of something important (and it does), stopping for some unknown number of minutes to let it charge is not a suitable option.

Currently, my fix is to have two magic mice; when the battery in one dies, I swap it out for the other one.

This is the only solution that doesn’t offload the cognitive burden of remembering to charge something at the end of the day onto me.

This is also a ridiculous solution to that problem.


> Let’s not forget that the vaunted focus on design has given us a mouse with the charger on the bottom as well as innumerable other botches over the years.

If the charging port was not on the bottom, a huge percentage of users would just leave it plugged in. Then a wireless mouse would functionally be a wired one.


What are you basing this assumption on?

The fact that everyone I know leaves their Apple wireless keyboard plugged in most of the time. (Sample size N=~5.)

Perhaps his personal experience. Which is the same as mine.

When I was still working in the office pre-pandemic, everyone with a cordless mouse kept it plugged in all the time, presumably out of fear that it would discharge.

They turned a cordless mouse into a corded mouse. The only difference being that instead of plugging it into their computers, they were plugged into their monitors or hubs.


Yeah? So? This is the hubris of the designer: assuming the thing will always be used exactly as the designer intended. In the real world, people tend not to know the designer's intended use, or just don't care. Users will use things however they want or need. There's nothing wrong with that. In fact, I chalk the "wrong" up as a "hmm, didn't think of that" on the short-sightedness of the designer.

It should not surprise you that Apple has hubris and strong opinions about how their products should be used. You see this in everything from the design to the features they support (and disable). This certainly existed in the Jobs era and it still exists in the Cook era, and they’re not “design features” in the sense that it’s only about appearance. The allegation of the previous post was that the charging port was purely a design feature rather than a product-experience one. And further that it’s a “botch” rather than an intentional feature that may actually be achieving a useful goal that the post author just doesn’t assign value to. To be a “botch” would imply that people dislike the device and it’s a sales failure.

When the CEO of the company gives "You're holding it wrong" as an official response, I was never under the impression that Apple was concerned with anything other than its own opinions.

My personal experience with AirPods on a MacBook was so bad that I gave up and went back to a wired headset. Always having to fiddle to make it... you know, make sound, the most basic of its functions. The same company builds both the devices and the OS. What a shame.

Interesting. I assumed this would be a problem, and was astounded at how well the new wireless buds work (using Pixel Buds, but I imagine if they're any better than AirPods, it's not much).

What happened? Never had issues with basic AirPods. Works for me on the iPhone, iPad, MacBook and on a Windows PC with basically zero configuration.

Bluetooth on desktop OSes has been extremely hit or miss with me.

On Windows, none of my headsets or controllers would automatically connect, most of the time requiring me to pair them manually every time.

On Mac, pairing would generally stay remembered but I'd typically need to manually select the device I want to connect to in the Bluetooth menu.

On Linux, prior to Pipewire, I had to close all of my audio apps to connect Bluetooth headphones. Now with Pipewire, the connection is so "sticky" that I have to turn off Bluetooth on my desktop if I want to use it on any of my other devices, since it always steals the connection.


The AirPods on MacBook are definitely fiddly but switching sound output to AirPods always fixes all problems if switching audio doesn’t work for any reason, and the ability they have to sync to watch (for workouts), iPhone, iPad, Mac, and AppleTV at the same time is totally bonkers even if the end result is somewhat flaky.

I also appreciate that Apple leaves a highly amplified headphone jack on their MacBook pros because there’s tons of Pro applications where Bluetooth headphones just don’t work due to latency.


If you wanted to give wireless bluetooth headphones another try... In my experience, the better Airpods are made by Xiaomi (their top-of-the-line 'Airbuds 3 Pro' work out of the box, are relatively comfortable to wear, and are inexpensive, coming in at around 50 dollars).

I agree with you on all points. The Apple Watch is an especially significant, universally appealing product for people who can afford both it and a monthly data plan. For me the killer feature is being able to leave my cell phone at home and still get messages and calls, and see email headers. I don't know about the rest of you here, but I struggle a bit separating my human-ness from technology. Not having the temptation of looking at a cell phone when I am running errands or hanging out with friends - but still being reachable - is really good for me. Yeah, if I had more willpower and self-control, then the Apple Watch wouldn't be as valuable to me.

Maybe they will fix the Magic Mouse the way they fixed the original Apple Pencil, which was really weird in that the way to charge it (unless you had a power brick adapter) was to connect it to the iPad's Lightning port.

The next Magic Mouse revision could address this, but I don't think that can really be counted upon. It would make sense, though, if a new Magic Mouse, or a total revamp of their peripherals, got rid of Lightning ports and charged either wirelessly or over USB-C.


It was stale when Ive was there. Normally aesthetics should follow function, not the other way around. I'm glad Ive's gone. Apple is finally making great stuff again.

I think it got stale when Ive didn't have Jobs as a counterpart. Or vice versa. Once Cook & Ive didn't seem to have the same relationship, things stopped working. But I agree that Apple is doing quite well again with the M1-era Macs.

Yeah he def. needs the counterpart. Design needs constraints ;-)

I would refine the above by saying:

Apple's biggest and most persistent strength has never been visual design, it's been holistic UX design. Those often get lumped together but they are different. Letting either visual design or engineering take priority over the user experience is the big pitfall, and while Apple doesn't have a perfect track record here, they're not even in the same galaxy as most of the tech world. I would even say that in the post-Ive world, they may be at an all-time high.


There have been some notable failures, where they got the tradeoffs between form and function badly wrong:

* Butterfly keyboard
* iPhone headphone jack removal
* Single-USB MacBook Air
* Non-removable battery (this used to really bug me, but the M1's incredible battery life has sort of mitigated it now)

The surprising thing to note is that since Ive's departure, the design decisions have generally been better, and led by great engineering. The M1 MacBook Air is a beautiful bit of industrial design that's a joy to look at and use; the last clunky bit, the screen bezels, is a job for the engineering team too. I think the Apple of today wouldn't have ditched the iPhone headphone jack; that was the design team, drunk on their earlier successes (removing floppy and CD drives just slightly ahead of the technology curve), making the wrong call about where the technology was up to. They won't undo it now (it's embarrassing, and there's good money in selling the only Bluetooth headphones that you don't spend your life pairing and unpairing), but the UX (for someone who sits at their desk 99% of the time) is clearly worse than a headphone cable.

>Rather than the Apple Watch being a Vogue-celebrated product for the 1%, it’s an attractive and high-quality product for many people. I see it on the wrists of fashion icons and on the wrists of people working at my local grocery store. Billionaires and ordinary people can have basically the same phone, watch, and AirPods. That is the true genius of Cook: reviving the “computer for the rest of us”.

Reminds me of that famous Andy Warhol quote from 1975:

What’s great about this country is that America started the tradition where the richest consumers buy essentially the same things as the poorest. You can be watching TV and see Coca-Cola, and you know that the President drinks Coke, Liz Taylor drinks Coke, and just think, you can drink Coke, too. A Coke is a Coke and no amount of money can get you a better Coke than the one the bum on the corner is drinking. All the Cokes are the same and all the Cokes are good. Liz Taylor knows it, the President knows it, the bum knows it, and you know it.


The same is true of Bovril

> the tradition where the richest consumers buy essentially the same things as the poorest.

That's possibly one of the best "fooling you while telling the truth" in print.

Just think of the travel experience of the poorest consumers versus the richest.

(The trick in the Coke thing is the petitio principii that you'd want to drink it.)


Travel being cheap is good. You don't get on a plane for the experience of being on a plane; it's to get to the destination without losing time and money on the way.

Rich people have higher security needs, which is one of the reasons they don't fly coach.

Of course, sometimes they do get places a lot faster on private jets.


Lots of folks pointing out how the design is now “pragmatic”, which is exactly why it’s not at all interesting anymore.

Current Apple is never again going to give you something like the “Luxo Lamp iMac”.


Good point about Apple creating great products that broad swaths of society can afford.

I do take issue with people thinking Apple’s designs are stale. I mean, to an extent this is a matter of taste. My take is that at their best Apple’s designs are so good they are timeless. Is a vintage Leica stale? Is a 1964 Porsche 911 stale? No. People who think these are stale have bad taste I would argue.


<looks at the broken keyboard on his 2018 emoji macbook pro>

... and you're missing Ive... why?



"a shift in strategy that has made the company better known for offering TV shows and a credit card than introducing the kind of revolutionary new devices that once defined it"

I nearly stopped reading here. Does anyone think that's true? Sure they make much more money than they used to but to everyone I know apple is still very much a hardware and product company.


The sheer ubiquity of AirPods in ears I see at the gym would sure suggest that people know they are a hardware company, and have in fact bought new hardware products introduced post-Jobs.

The title's mention of "technocrats" reminded me of Bret Victor referring to Apple's design culture as a "designer aristocracy" - http://worrydream.com/#!/DynamicPicturesMotivation (2011, mentioned later on in the essay)

--

The 2011 Bret Victor essay: "I spent a few years hanging around various UI design groups at Apple, and I met brilliant designers, and these brilliant designers could not make real things. They could only suggest. They would draw mockups in Photoshop, maybe animate them in Keynote, maybe add simple interactivity in Director or Quartz Composer. But the designers could not produce anything that they could ship as-is. Instead, they were dependent on engineers to translate their ideas into lines of text. Even at Apple, a designer aristocracy like no other, there was always a subtle undercurrent of helplessness, and the timidity and hesitation that come from not being self-reliant."

--

The 2022 NYT article "technocrats triumphed at Apple": "[In] Mr. Ive’s absence, the designers say that they collaborate more with colleagues in engineering and operations and face more cost pressures than they did previously."


> the designers say that they collaborate more with colleagues in engineering and operations and face more cost pressures than they did previously

Great! This is how it should be IMHO.


This is actually a thing in design, and it's inexplicable. Too many designers think like architects. They design the case to look good, and aren't interested in the much more demanding process of productising an item so it looks good and isn't feature constrained.

A lot of design is pure fantasy. There were some renderings of Mars habitats being sent around a year or two ago, and they had wood panelling - pretty for sure, but not exactly easy to find on Mars in industrial quantities.


If you’re living on Mars in the first place, you’re going to need to grow some plants just to have food. I don’t think it’s inconceivable to also grow some bamboo.

> Too many designers think like architects.

How would you know how architects think? Are you an architect?


"I will have you know I have seen every episode of every series of grand designs"

> But the designers could not produce anything that they could ship as-is. Instead, they were dependent on engineers to translate their ideas into lines of text.

Why is this scenario presented as a bad/uncommon/exploitative thing? It is exactly how architects and engineers work on buildings, etc. and is considered a functional (as opposed to dysfunctional) mode of working. It's certainly not a characteristic unique to Apple.


Is the way architects work on buildings functional? It appears architects have no influence on most buildings out there in the world, until one day they design a giant multi-billion dollar sports stadium that looks like a body part[0]. There's nothing in the middle, mainly because in the US we essentially banned building anything interesting around 1970.

"A Pattern Language" was supposed to be about designing buildings that the inhabitants could then adapt for their own purposes, but it's unclear if this actually ever happens.

[0] https://www.theguardian.com/commentisfree/2013/nov/18/qatar-...


Exactly the same complaint could be made about the civil and structural engineers that this website so idolizes, right? All they can personally make are plans.

The plans produced by the electrical and civil engineers I work with are a set of instructions for electricians and construction workers to follow that results in a power plant. I have not worked with an architect but I don’t think their drawings are ready for trades.

This article is from some clown who wrote a book with an opinionated story (this same one) about Apple.

It’s an advertisement for his book and dovetails with NYT’s crazed and rabid need to attack the tech industry that diluted their chokehold on discourse.

There is nothing serious going on here.


On the flip side, one wonders what the departure of Scott Forstall meant for Apple, and where it would be today if he continued to be in a decision-making position.

I think Forstall’s trajectory after Apple shows that he was ready to be done with being a technology leader. If he had wanted to stay, he could have.

> In the wake of Mr. Jobs’s death, colleagues said, Mr. Ive fumed about corporate bloat, chafed at Mr. Cook’s egalitarian structure, lamented the rise of operational leaders and struggled with a shift in the company’s focus from making devices to developing services.

I think this is particularly clear when you compare Apple's current product introduction keynotes to ones from the Jobs/Ive days: nowadays, they tend to forego the bit where they talk about how the device was made.

Ex: https://www.youtube.com/watch?v=5SjIuzhdd_g (Apple Watch Steel introduction video, with a heavy focus on how it's made)


I had completely forgotten about these. I have to say I do miss this kind of detail in the presentations these days, even if I like the products of late way more than the ones that came in the latter half of the 2010s.

Jony Ive was Steve Jobs’ extension. Steve needed Ive at Apple but Cook didn’t really need him because Apple fast became something entirely different since the introduction of iPhone. In my view, Apple will always be in its best form guided by engineers. I think there was a period (a couple of years since Jobs’ passing) where Apple was playing “What-would-Steve-do?” and I am so glad the executives leapfrogged that mentality of chasing after an icon’s shadow.

Car design is much more interesting than the wholly homogeneous world of mobile and computing. Hasn't anyone come up with a nice idea for 15 years?

Franz von Holzhausen has been doing great work over at Tesla, albeit as Elon's editor, rather than the inverse of the Ive/Jobs relationship.

Elon: let's make a truck out of folded steel.

Franz: OK, I'll make it look decent.

As opposed to Apple, where it was more like

Jobs: let's make a nice laptop.

Ive: OK here's a block of aluminum

Jobs: bro, it needs a screen

Ive: Fine I guess the users aren't evolved enough yet.


The article leaves out some of the consequences of focusing on design over practicality and usability. The MacBook Air keyboard had become a disaster. It was as close as one could get to just typing on a hard surface. It was around this time that I switched to a Surface laptop because it felt to me Apple was giving up on their own laptops in favor of the iPhone. Fortunately, they’ve now fixed this.

I think Apple lost its mojo when they started to sacrifice function to form, to make their laptops even thinner. Remember the "revolutionary" butterfly keyboards that could not survive outside a clean lab? That was the moment.

Btw, the watch rightfully struggled, because the battery still does not last a day. I can imagine how hard it was to sell it as a fashion accessory. It needs daily care, charging at least once a day. Staggering! I have a different smart watch product that lasts a week! Now that's a fashion accessory!


I think the problem with Ive's designs is that he kept trying to out-do himself in the same direction. Ive was important with regard to making really nice products that felt really good. Even a plastic iPhone 3G felt really solid compared to the creaky Android phones that would keep coming out years later, never mind an iPhone 4 with its amazing metal and glass feel.

However, Ive kept wanting to push things in the same direction. Apple made wonderful and thin MacBooks that were solid with unibody enclosures. I remember the thick, creaky, plastic PC laptops of 2008 and the MacBook Pros were just amazing in comparison. Later, Ive wanted to shave 0.25mm worth of keyboard space and we ended up with MacBooks that no one wanted.

I think labeling this as "the technocrats won" is way overstating the case. Ive's legacy is all around Apple's new products. It's in the Mac Studio which is a small and quiet machine made out of nice materials. It's just a tad more balanced with the practical implications of managing heat. Instead of trying to make the Mac Studio as small as humanly possible, they've made it small and nice. It isn't anything like the mini-towers that are typical. The new Apple Watch really pushes the display to the edge. It's amazing.

I think part of it was that Ive didn't have a lot of places to go. He'd won. Apple had moved over to his way of thinking almost entirely - with tiny exceptions like "I'd like a functional keyboard." The industry has moved over to his way of thinking a lot. Android phones aren't creaky plastic nearly as often - you can get ones with nice materials and build quality. Once everyone is won over to your way of thinking, where do you go?

In fact, I think a lot of people really like attention. For a long time, Ive got attention. He'd get positive attention from Apple fans who loved his nice designs and negative attention from those who would complain that the iMac didn't have a floppy drive or whatnot - but he was sure he was correct. Fast forward to 2016 and what was Ive really doing that would garner such attention? Apple's product line was all Ive'd. The industry had copied him in a lot of ways (even if they were potentially bad copies). In a way, he wasn't a thought-leader anymore because people had all accepted his thesis. If Newton were around today talking about gravity existing, we'd all be like "yea, we know...got anything new?"

As time went on Ive would either need to find some amazing new way of pushing things forward or his work would just be passé. Oh, another unibody MacBook Pro. Oh, another computer like the last one. He didn't have a battle to fight anymore.

Back in 2000-2010, he could be telling engineers "you need to make it this way because it's better" and most of the time he was right. Once he'd proven out the fact that he was right over that decade, everyone was on board because they saw the value. What would the next thing be that he was right about? Maybe there wasn't a next thing. Maybe they'd taken computers to the right level of design.

Apple's whole lineup is basically Ive's legacy - with a tiny bit of extra room for a decent keyboard or cooling.


What if I want a "creaky Android phone" so I can, like, replace the battery? There are advantages to alternative designs, too.

Rock star designers leaving big companies is nothing new. I happen to like "Bangled" BMWs like the original Z4, before it was watered down to a retro-ish design. But I also know I'm in the minority.

Apple's primary advantage is gaining unique capabilities and protecting them through domination of the supply chain. Not all of these succeed (vide large sapphire crystals) but these kinds of competitive moats are actually more important than unique designs.


I’ve always gotten the impression that Ive may be a truly great designer but he needs an editor. He had that in Steve Jobs.

Once he lost his editor the designs Apple shipped moved more and more towards being perfect designs at the expense of thoughts of usability.

And now he’s known as the guy who helped “ruin” Apple’s products until they kicked him out.

Unfortunate. For the lack of an editor.


I think your comment on Jobs being Ive's editor is spot on. They worked as a team and complemented each other's shortcomings (at least to a degree). There was apparently never this kind of relationship between Cook and Ive.

But to say Apple kicked Ive out seems wrong to me. They rather wore him out and he left.


We can speculate why he left. But I think a lot of people agree on the observation that he had probably been irrelevant for a good while by the time they parted ways.

Not so much irrelevant as superfluous, and possibly a negative influence.

To give credit to Cook he listened to the chorus of complaints and acted on it. The results aren't as pretty as the Ive era designs. But they give users a lot more of what they really want, instead of a nice case with missing useful features.

Ive actually had a lot of misses, from the gradients and flat look in iOS 7, to the infamous butterfly keyboard and missing USB ports, to the early versions of Watch, to (at a guess) the touchbar. There were also Jobs-era failures like the hockey puck mouse and the Siri Remote for Apple TV.

And personally I'm not a huge fan of the current Apple typography and branding.

So - not really missing his influence. I'd love to see Apple find a new design head who could inject more personality than the current products have, but I don't think Ive's departure was a terrible loss in any way.


>The results aren't as pretty as the Ive era designs

Not saying Ive didn't have some great designs in his time at Apple, but saying the new designs aren't as pretty is highly debatable. From what I can tell, I don't think many people found the Touch Bar to be particularly "pretty".


If there is one thing the Touch Bar is, it is pretty.

Aesthetically it changes the entire look of the keyboard and wraps it together.

My next computer is having function keys though, so make of that what you will.


> Ive actually had a lot of misses, from the gradients and flat look in iOS 7
Lots of people thought that was a major win, compared to the childishly skeuomorphic interface (largely a product of Jobs' taste) that it replaced!

Once it settled down, agreed. But iOS 7 initially swung too hard at a look that was not as discoverable or accessible. Thin font, UI elements that were hard to distinguish, low contrast.

https://www.macworld.com/article/221357/why-ios-7s-design-is...


> The results aren't as pretty as the Ive era designs.

I'm not sure they're supposed to be. The design aesthetic of the new MacBook Pro is...chonky. And that has to be intentional. They've made a work machine. It kinda looks like one. And despite being very clearly of the same design lineage, the MacBook Air doesn't have that vibe, to me. It's a much lighter-feeling thing that feels closer to the older designs.

> And personally I'm not a huge fan of the current Apple typography and branding.

The font kills me. It's just not nice to read.


Yeah, Apple would have made him irrelevant so he could go without tanking the stock.

> There was apparently never this kind of relationship between Cook and Ive.

That seems obvious, Cook was never the "taste" person, which is what Jobs was (brutally so). Cook has always been the ops guy (his original role was SVP for worldwide ops, then EVP for sales and ops, before becoming COO).



This is not how the design process works. There is no single genius outputting designs that get implemented. There are teams of people making and testing thousands of variations. Don't think these teams don't know about the issues with their designs. They've dealt with and solved questions we can't imagine.

Ive, as design lead, already is the editor. That's the job of an art director.

What Jobs did for Ive is that he probably pushed many other departments to do what Ive and his team came up with.

It's very likely Ive's "fails" were organisational issues and his inability to push things to completion.

I am not a big fan of Ive, but Apple turned their trajectory completely around because of designs coming from the department he was leading. And the big turnaround with the old iMac came mostly from him.

I wouldn't give so much credit to Jobs, in the same way as I wouldn't give so much credit to Ive. They are just on top.


I think OP means that without the influence/editing of others this led to aesthetic-centric thinking rather than the right balance of product/function-centric thinking, and the 'editing' was that Jobs could influence him to ensure the right balance was in the products.

That’s an excellent way of putting it.

No, you can read my other comment. I don't agree.

And I think this is a very narrow view and not how it works. Industrial designers are not some artsy-bitsy people who try to draw beautiful things. They are mostly engineers who try to deal with the people problems of the product.

If your experience with designers is some UI juniors who got the job after being in the field for less than 2 years, it might seem like it.

But designers doing those products are really juggling hard problems and requirements that other departments give them. Everything is a tradeoff and they are the ones 100% on the side of usability.

I am sick of this perception that products made by engineers are functional and products made by designers are pretty. It's designers who make products functional for people, and it's engineers who whip out the specs and clever tech solutions.

The aesthetic is a side product.


> Everything is a tradeoff and [designers] are the ones 100% on the side of usability

Designers shouldn't be just 100% on the side of usability, they should be balancing multiple factors such as cost, aesthetic, durability, portability, accessibility, ease of manufacturing as well as usability (there are many more).

All of these things have trade-offs - making a product more durable might increase cost, making it easier to manufacture might hurt its aesthetic, adding more ports might improve functionality but decrease the aesthetic and increase cost, etc.

The job of a designer is to balance all of these factors. The argument is that the balance Jony had was skewed more towards aesthetic which resulted in trade-offs in other directions.

> The aesthetic is a side product.

IMO the aesthetic definitely isn't a side product for Apple - it's clear that aesthetic is a core part of the design process (see the iMac G3, G4, Trashcan Mac, latest iMac, HomePod, etc., where a significant portion of product engineering is driven by the external aesthetic to a large extent). The design of the iMac G4 was no accidental side product of the tech.

The criticism of Ive is that he weighted aesthetic too high at times, which for example resulted in an obsession with product thinness that compromised the usability/durability of the keyboard, and with unobstructed sides that led to the removal of ports and a charge port on the bottom of the Magic Mouse.

You can still be a great designer and have a bias towards form over function - and arguably the best designs are done where there is some sort of creative tension between form and function (as tension often breeds creative solutions to release the tension).


We will never know how things were decided. Ive might have been an idiot forcing the thinnest MacBooks and ports on the bottom of a mouse. Or these things were requirements from somewhere (make the thinnest laptop; make a mouse without ugly ports that's going to be cheap), and the final solutions were the best tradeoff.

By side product I meant that nobody starts with aesthetics. It starts with the tech and its limits. Good designers can whip most things and shapes into looking good.

I just don't think Jobs was a mythical design hero because he took a typography class.


Yes, it is true designers take all those factors into account. However, in this situation the products were likely developed fully and then stripped feature by feature in revisions requested by non-design departments. That's what I've read typically happens in corporate design environments when the rest of the company shifts focus to general cost cuts.

> I’ve always gotten the impression that Ive may be a truly great designer but he needs an editor.

You aren't a brilliant designer if your design choices make a product less functional. And there's a long track record of Ive doing exactly that.


> being perfect designs at the expense of thoughts of usability

This is an oxymoron. Being a designer doesn't mean that you focus on making things look good while someone else comes along and reminds you about "usability". A design isn't "perfect" if it has usability problems. Any designer worth their salt will be the first to tell you that.


I agree with your comment and would add that aesthetics impact usability.

If a design is for some reason repulsive to a user then it is not good design.

I think Ive and Jobs together raised the bar for tech so many times that it's easy to forget what computers used to look like.


I sometimes feel like authors get this problem once they become famous enough -- the power dynamic between them and their editor shifts, the author gets to stand firm for their creative vision, and they end up publishing novels that could have been trimmed by a couple hundred pages.

they still were the best machines imo. Have to give them some credit. Build quality was always top notch, and it was "just" the keyboard that ended up having too-high failure rates and a very meh typing experience.

it’s not like the machines were dumpster fires. I think we are too harsh sometimes.

but yes obviously the keyboard sucked and i hated that part of the laptop.

otherwise it was as good a machine, great even. Slimness is still nice in a laptop, and it still performed as well as any other laptop I considered. 4x USB-C was a great move too, especially charging from either side.

So undoubtedly new ones are better. but we are overly harsh on that era and Ive.

it’s not like he was the only one in on the keyboard decision either. Apple thought the keyboard was a competitive advantage at first!


You can tell finance took over.

They are deleting soft(ware) cultural relics because not enough people use them.

The many pennies saved!


This frames his departure as being about conflict with the 'accountants', and it doesn't mention the laptop thinness debacle. I have zero inside information, but I understood Ive to have been pushing for thinness, even at the expense of functionality. The company went down his path for a few years, but it has recently made an abrupt about-face. It seems strange not to mention this dimension of Ive's work and the possible conflicts it could have caused.

OTOH, this guy apparently conducted hundreds of interviews, and I'm just some guy who's been watching from the outside! Maybe I'm way off-base.


This is a “debate” that exists only on HN. Laptops are not what propelled Apple to become the most valuable company in the world and their importance inside the company reflected that until fairly recently.

I used to worry that Apple was slipping. Apple TV+ especially made me question the direction of the company. Then I got a MacBook Pro with an M1 chip. It’s honestly stunning.

Question: After reading some comments here, I still think I might be in a bubble? I think it's all becoming better now that Ive is not responsible for the Mac design any more.

All the Mac hardware (mini, iMac, MacBooks) became smaller, less repairable, and overdesigned to the point that usability suffered, after Steve Jobs died and Ive was running without a counterweight:

The Magic Mouse you could not use while charging. The horrible keyboard that died from merely a few crumbs. Having only two ports on a computer so you would always need a couple of dongles. Just to name a few.

Sometimes clever design has to be combined with boring choices, like still having an HDMI port and a microSD slot. A decent, resilient keyboard. Now I long for a MacBook that has replaceable/repairable memory (RAM, SSD) again, as well as a battery that is not glued in place ...


> The magic mouse you could not use while charging

Do you think this just slipped past everyone at Apple?


> The magic mouse you could not use while charging.

I totally think they did this on purpose to prevent anybody from using the mouse while charging it. That way, nobody ends up using the mouse continuously with the cable attached, making the product look wired instead of wireless and breaking the minimalist design they want.

Apple is a company that designs products not only to look good in a shop window or an ad, but also with an eye to how people will use the product and what image it projects when it's in use.


> That way, nobody ends up using the mouse continuously with the cable attached, making the product look wired instead of wireless and breaking the minimalist design they want.

Is this a common problem? I only use wired trackballs but I notice other major brands of rechargeable wireless mice don't seem to care whether you use them plugged in or not.

What I think would have been better would be to put the charging port on the back instead of the bottom so you can still use it if you absolutely need to while charging but it's just awkward and uncomfortable enough to discourage doing so on a regular basis.


I haven't really seen other rechargeable wireless mice. They're usually battery powered so there is no cable to begin with.

I think the Magic Mouse's greatest design flaw is that it's entirely un-ergonomic. I swear I get carpal tunnel just looking at it.

It charges like 45 minutes of use-time in like 30 seconds, so I really don't think the port on the bottom is a bad idea. I've never used one long enough to truly say tho.


> I totally think they did this on purpose to prevent anybody from using the mouse while charging it. That way, nobody ends up using the mouse continuously with the cable attached, making the product look wired instead of wireless and breaking the minimalist design they want.

There's no way Apple is that petty or restrictive. The wireless keyboard can be charged while in use, so your theory kind of falls flat.

I'm certain that apple just doesn't care. There were probably some tricky engineering problems during the design of the mouse which pushed the product team towards having the charging port on the bottom or it was a purely aesthetic decision — a recess / inlet for the lightning charging port probably didn't look great, and the designers rejected it.

So, they went with the port on the bottom. I'd wager the way Apple sees it, having to wait two minutes to get 9 hours of battery life if your mouse dies unexpectedly isn't the end of the world, and the battery lasts ~2-3 months on a full charge, so what's the problem?

I sort of agree with the hypothetical take I presented above. Yah, it would be nice to be able to use it while charging for those rare situations where you forget to charge it, but it's not the end of the world. Even if you disagree, it's astonishing to me how much ink has been spilled roasting apple for the magic mouse 2.


> The magic mouse you could not use while charging.

This is the biggest nonissue in computing. It takes literally 2 minutes to charge it enough for 8 hours of use so even if you forget for a month and are actually in danger of it dying, just stick it on the charger the next time you need to go pee and that will get you through the rest of the day.


It's one of those issues that nobody who's actually owned the bloody thing ever worries about. A couple minutes charge, it's ready. Then if you need more juice, charge it up again overnight or a lunch break.

Same as the first generation Apple Pencil: it spends 30 seconds in the Lightning port, it's ready to go for long enough. The only people who think it's a problem are those who've never owned one and assume Apple wants people to leave it there all the time.


Jony Ive is a hack without the firm hand of a Steve Jobs to keep his self-indulgent tendencies in check. He is responsible for the butterfly keyboard fiasco. Scott Forstall was famously fired for refusing to apologize for the Apple Maps first release woes, but AFAIK Ive never has for those garbage key switches, and I’m sure there are multibillion dollar class-action lawsuits working their way through the legal system.

It is notable Apple has not named a new Chief Design Officer. Ive’s failure is so manifest he’s destroyed designers’ seat at the table in the company that was the poster child for design.


I don't disagree with anything you've said; however, I think if Apple were to name a new chief then that person would have to be as good as or better than his/her predecessor. That's a lot of pressure from the public on day 0.

Well, Cook made sure designers never again hold sway and cause damage the way Ive did: the top design position at Apple (which is not at the C-suite or SVP level, even as relatively minor specialties like Machine Learning are represented there) has been split between two people, Alan Dye and Evans Hankey, and they report to COO Jeff Williams, a mechanical engineer by training:

https://appleinsider.com/articles/19/06/28/who-are-alan-dye-...


Chief Design Officer isn't a "best designer" position, it's a "promoted through the roof" position where you actually do less work. Phil's current Apple Fellow title is the same.

The "realest" position is SVP, which admittedly no designer currently is.


How was this ever going to work?

If I worked for 20 years with one of history's most influential entrepreneurs, and we had a deep working partnership sustained by mutual respect and trust, and then that person died, I simply don't know how I'd continue showing up every day.

Keep in mind that when Jobs returned to Apple, Ive was not at all influential within Apple and on the verge of leaving. To go from that place to one of the world's most influential industrial designers — gosh, I'd have some ego too.

I'm grateful for his contribution. And I'm also grateful that his departure seemed to have opened new avenues of creativity and flexibility of thought at Apple.

After all, it was Jobs himself, in the Stanford commencement address, who said that death (or, thought of another way, departure) is life's change agent.


No one talks about the mega disaster that is Apple Park. When you're inside, it mostly feels like a hospital or an airport. The glass cleaning is a major pain to do. Apple bought a quarry in Italy so that the wall tiles can line up or something. It will be fun to see what happens when it gets damaged. The sinks are all carved out of one block of stone. Last but not least, the chairs are $7k and are the worst ever to sit on. There are so many Apple buildings in Cupertino, and collaborating with another team is quite painful. Ive without Jobs was definitely a disaster. After spending those billions on that building that looks like a tomb for Steve Jobs, I wonder what purpose was achieved.

It has always puzzled me: why did Steve Jobs choose his ops head, over product people, to succeed him?

It's interesting that the article attributed the shift in Apple Watch's marketing strategy to Tim Cook. I am pretty sure that Jeff Williams led that change. He is a smart guy who doesn't get the press he deserves, in my opinion.

Not much information, very verbose. The only valuable information is that Apple's strategy under Cook has shifted from devices to more profitable services.

I think he created the Apple look and feel, which will stay with the company probably forever as something to distinguish it from all the other products in the marketplace. For this reason he should be celebrated. Sure, we can blame him for major blunders like the terrible butterfly keyboard and the nearly port-less MacBook, but in the end, without Ive, Apple hardware would not be distinguishable from a generic laptop running Windows.

Without a rigorous and function-first engineering mindset, it's dangerous to let designers wreak havoc. An Ive without Jobs was doomed from the get-go.


Jony Ive's "design is everything, function is last" approach.

I think Apple "grew up" a little bit in the last few years and realized that they cannot deliver any more such substandard products:

- Like the Mac Pro (2013), which was thermally limited even in the launch configuration and could not be refreshed, because more power would mean less performance through throttling

- The Magic Mouse 2, which, well, you could not use while charging

- The MacBook Pro Touch Bar, which is there because there was nothing else to "innovate"

- The MacBook Pro keyboard, which is so thin and good-looking that the owner has to replace it every 6 months


Back in 2017, many of us were frustrated with Apple's Mac offerings (price/performance/features). In other words, Apple seemed to be stagnating hard at the time, forcing us to move to Linux on better hardware.

I also remember people throwing out lines at the time, such as "Apple needs to focus on not making things thinner" or "Price with Apple is only an issue in the absence of value".

Since the introduction of the M1, Apple has regained all of that lost momentum, in my opinion. And judging from Internet commentary, most others wholeheartedly agree. People wanted speed/performance/battery/ports rather than another millimeter of thinness. The only gripe I hear today with Macs surrounds the pace of innovation with macOS.

I think this article would have been better received if it were released before the introduction of the M1.

Moreover, the ending paragraph states "the designers say that they collaborate more with colleagues in engineering and operations and face more cost pressures than they did previously. Meanwhile, the products remain largely as they were when Mr. Ive left." I can't see how this engineering collaboration and cost accountability would be a negative thing for consumers or Apple, and the products are definitely a lot better (and faster!) than before Mr. Ive left.


Ive seems to have left because he was no longer powerful enough. He was likely burned out. He got a good amount of money out of leaving as well.

I don't know what any of that has to do with Apple being run by technocrats. I don't know if the person who wrote this truly understands how much creativity is needed to pull off the engineering feats that Apple has been pulling off for the better part of the last two decades.


I listened to this author's interview on the A16z podcast. An important detail is that Scott Forstall brought him up through Apple over the years. Scott Forstall, the head of software for Apple, later lost a power battle after Steve Jobs' death to Tim Cook & Jony Ive. One wonders if this is coloring his opinion.

Jony Ive leaving Apple was the best thing that could have happened. The horrible keyboards, the form-over-function trash can Mac Pro, the removal of ports, the one-port MacBook ("the adorable"), the gold Apple Watch, are all ultimately the fault of the design team under Ive.

Shirky's review is much better than the excerpt: https://www.nytimes.com/2022/05/01/books/review/after-steve-...

It was the Jobs/Ive combo that worked. I think Ive was making too many form-over-function decisions towards the end, which benefitted neither Apple nor its customers. Cook is a bean counter.

That does not need to be all bad, as much as I hate to say it. That is why I liken the new Apple to Microsoft, with all of the pros and cons that entails:

https://erik-engheim.medium.com/apple-is-turning-into-the-ne...


The article doesn't delve into its title claim that technocrats thrive at Apple. It only focuses on what it calls "accountants".

However, is it possible that Ive himself was just done? Artists, whether composers or visual artists, usually have a stock of good works they churn out, and after that the output is usually a repetition or a hodgepodge of their previous work. It's possible Ive reached that point too, as the iPhone became an all-in-one computer that killed a lot of accessories, so any idea for a new kind of device is killed automatically. In that scenario, I don't know what else there is after a watch.


> [...] struggled with a shift in the company’s focus from making devices to developing services.

Boy can I ever empathize with that.

This shift is the worst thing that ever happened to Apple. Personally, it was maybe the biggest reason I got disillusioned working for them and quit (I should note I was only a Retail employee, nothing big).


This is where most companies are trying to go if they can, though. Unfortunately, it has been proven that people will willingly pay for a service every month rather than purchasing it outright, and shareholders love it because it means that revenue from each user is consistent.

The future is one where you don't own anything and companies nickel and dime you for features and services.


Don't they do extensive user testing on these products? Some of the design decisions they make are surprising, given the amount of resources at their disposal.

What a weird take.

A lot of the changes the "accountants" have made recently are responsive to user requests, to my eye.

So so many more ports on the new products. I absolutely love this. I hated the pure design direction things were going under Ive.

Battery life once again going big. I love this. Usability again the key.

The trashcan Mac Pro? Too much design. Give folks a square box if you need to crank power up.

Self service repair program? Only one out there from what I can tell? I like it.

iPhone mini - yes, I was one of the few folks who bought this. Sorry to see it go away; I like the small phones. But giving a range of options was great.

And this list goes on. My wife keeps her stuff (iphone / macbook) around FOREVER - software updates on phones going back 5+ years? Fantastic.

Their run rate is now something like $400B++ per year? With nuts gross margins (maybe getting close to $200B in gross margin?!!)

They've done this with almost no acquisitions (relatively speaking). And throwing off $150B in operating cash flow (6 months). I think stock buybacks are lame but what can they spend this on really? A moon launch and base?

This is the reward for being trusted (in my view) far FAR more than many other players in the market despite the complaints you see on HN.

I know they are considered a "ripoff" but if someone has a monitor / keyboard, dropping $700 for a mac mini, you can basically do all the programming / video editing / photo editing you could want.


Right... but the take presented in the article is that Ive believed Apple should be innovating more in product categories and moving further and further into people's lives through devices, rather than milking their customers for subscription services.

I think it's a salient point that Apple has moved more into a phase (not permanent) where getting us to give them $$ per month is more important to them than creating the future.

I was relieved when things got a bit thicker, batteries got better, thermals got better, Apple Silicon came to the Mac etc etc but that's not independent of what the article states Jony was getting at.


The positive spin on the services focus is that Apple wisely decided to compete with the software companies on the phone (Netflix, Spotify, Google, Facebook, Twitter). The way they know how to compete, and how they believe the relationship should be, is that you give them money for a service. Even when the competitor is free, they charge us. This is an opinionated decision on their part NOT to offer free services subsidized by intrusive ads.

I think my personal opinion is between what I just said and what you said.


I also like this.

I hate the "with ads" nature of a lot of services.

With Apple TV for example, you pretty much get... Apple TV.

I include the box and the subscription.

That much more expensive smart TV starts sticking ads all over the interface (ugh!). Same thing with Roku (those buttons etc). Yes - short term gain I'm sure for those folks, but long term loss of trust by users in my view.

Charge me, and do what I would want.

One problem with all of this - their service execution has been somewhat poor, in my own view? Apple Card doesn't integrate with QuickBooks Online / Mint type products. I still find myself using Google Maps. iMessage is not cross-platform (could I pay $1/month to get cross-platform iMessage, or something where others could message me?).


I wonder how much of that shift to services was a necessity due to the pandemic. It’s very difficult to ideate new software products when everyone is wfh, I imagine that’s doubly difficult with hardware. Perhaps Apple’s drive to get people back in the office is because they’re worried the product pipeline has stalled.

> And throwing off $150B in operating cash flow (6 months). I think stock buybacks are lame but what can they spend this on really? A moon launch and base?

It would be great if they could spend a bit of that (dunno, 20B? 40B?) in fixing bugs and improving documentation.

It's not cool or fancy, but would surely help cement Apple as an even greater development environment, no?


NO question! Haha.. Even $2B more I think would go a long way :)

I'd also love to see the attack surface brought down. For new messages from unknown contacts, limit the iMessage payload to ASCII. Yes. No emojis, no anything :) At least as an option. Etc etc. Re-write stacks on the internet-facing side in safer languages if needed? I'm kidding but...


Wanting to cut down two dozen trees to put up a white tent perfectly encapsulates everything I dislike about Apple.

I'm not sure that a $25 million tent & tree relocation really counts as a triumph for the accountants. Objecting to that expense seems fairly reasonable. The fact that Ive had to fight to get it done doesn't really tell me that the accountants have won out over quality product design. I certainly don't have any recollection of the trappings of the watch launch, nor any coverage by Vogue.

In general this conveys to me that Ive was growing increasingly out of touch with what was truly important rather than bean counters winning on $$$.



I want a phone that knows I'm going to put a screen protector on it and put it in a case.

So what if it's thinner? I'm going to put it in a case that makes it huge. Design it with the case in mind. You put a stupid camera bulge on it. The whole phone could be the thickness of the bulge.

Why do I have to put a screen protector on it? Why doesn't it come from the factory with a screen protector?

I don't know how to design a phone that knows it will be wearing clothes, but it should be obvious to designers that many people do not use a naked phone.


I don't use a case or a screen protector, but I know cases have a cutaway for the camera bulge so it doesn't actually increase the size. If you added thickness to the entire phone to remove the camera bulge then your case will be thicker too, it just means the camera will be 'sunken' into the case which would be a less efficient use of space. In fact the camera bulge disappears when you use a case.

I can't wait for Apple Glass. I wonder if Ive has contributed to that product.

In 2016, the Apple Watch was selling so badly that Apple was giving it away to its employees at an 80% discount. If any engineer at the company were responsible for such an epochal failure, he would have been tarred and feathered and then hanged, drawn and quartered. Instead, the manager responsible was allowed to let his RSUs vest for 4 years, and then let go with a $100m parachute deal.

How many billions of dollars do you think Jony Ive cost Apple with his bad design? $100 billion?

If he was the reason why the MacBook Pro had that terrible keyboard, or if he was the reason for removing the MagSafe connector, I say good riddance.

The intro story soured me on the guy. Millions of dollars to move a few trees for a party feels like a waste of resources. That's not the sort of mentality I expect from design leadership at any company, even Apple.

And tbqh Ive always represented the worst of form over function to me. The puck mouse was ridiculous, as was a bunch of stuff he created or sponsored.

Good riddance.

