
Read the whole thing, and I was really not convinced. Ive had some brilliant hits, but I think he became more of a liability in the end. Look at what happened to the MacBook Pro, losing most of its ports, and the thinness causing them to put a much worse keyboard in it that caused massive problems. Sacrificing a bit of thinness and going back on those changes with the newest iteration has been much better.

Honestly, to me the M1 era of Apple is more exciting than things have been in years. The article linked is really negative (saying Apple only have “legacy products”) but with the M1 series they seem to be smashing it out of the park…




I don’t think that was due to Ive but more so that he couldn’t excel without Steve around.

It's also clearly stated in the article that this was one of the reasons that Ive was given the CDO role. So that he could do less of what Steve did for him.


This. It seems clear Jobs served a “you made it very pretty but it sucks to use” role in feedback.

There are many counter-examples, from the original iMac’s mouse to the iPod Hifi. Jobs said no a lot, and that was a good thing, but he did not have absolute good taste. To his credit, he was good at learning from mistakes, even though he very rarely acknowledged them in public.

Don’t have to get every single one right to still play the role. The man was effective, not infallible.

But then it works both ways: you cannot cite a couple of failures under Cook to say that Jobs was irreplaceable.

You seem to be debating someone else; I haven't said that?

I think Jobs and Ive were a pair that complemented each other. I think when Jobs died, Ive lost that moderating influence, and "thin at the cost of good" and "we got rid of buttons" were the result for a while.


On the other hand, Ive without Jobs got you iOS 7 with all the fine Corinthian leather removed.

> Honestly, to me the M1 era of Apple is more exciting than things have been in years.

Yes, I feel like this is somehow still massively underappreciated. They pulled off a major hardware transition without big hiccups (I'm sure someone's going to point out some I have missed in the comments /s) and launched a bunch of devices that are an incredible leap forward. I mean the baseline M1 Air starts at $1k and is an incredible piece of hardware for most usage.


The M1 iPad Air is $499; couple that with an Apple Pencil and the writing experience is near frictionless.

The M1 era is going to redefine the industry in subtle and not so subtle ways.


Intel’s failing will redefine the industry in many ways. ARM and AMD and other players are taking chunks out of them at the cutting edge. They seem to be redefining themselves as a “couture” fabricator rather than taking leadership on the design end of things … playing off their scale rather than their velocity. It’s a big change and probably has as much to do with why Apple ditched them. Remember Apple did this before when they switched to Intel from IBM/Motorola, when they too had stagnated.

Stagnated? Weren't PowerPC chips pretty advanced compared to Intel whose chips were carrying a lot of baggage at the time? Given that PowerPC chips were based on RISC, I'd guess they're a lot closer to the M1 than modern Intel chips are.

My understanding is that IBM/Motorola's struggle with achieving volume is what doomed them, not a lack of innovation.

This is all way outta my area of understanding though.


In general, IBM was just going in a different direction with Power than Apple needed them to be going in. IBM was and is focused on the highest-end, highest-priced segment of the server market.

If I recall, the scuttlebutt was that Motorola had promised Apple (meaning Steve Jobs) that faster clock speeds were just around the corner for a while, and when that repeatedly failed to materialize Apple (meaning Steve Jobs) got pissed and activated the Intel backup plan.

By the time Apple dropped PowerPC and went to Intel, Motorola was already out of the picture and IBM was making the G5.

The G5 was the desktop chip; it was Motorola's task to scale it down for laptops, which they were unable to do.

That's a good point, thanks.

Sort of. The G4 chip was still used in laptops until the Intel transition, and was produced by Motorola until they spun off their semiconductor division into Freescale, which continued producing the G4 until the end.

Well it wasn't really clock speeds, it was performance per watt. The G5 was able to go into a (watercooled) PowerMac but IBM couldn't get it to run cool and efficient enough to go into Apple laptops (or the Mac mini, iirc). By 2005 Intel was, at the very least, probably prototyping multicore (I don't remember if IBM's processor offering to Apple was multicore at the time) chips that blew IBM (and previous Intel offerings) out of the water performance- and efficiency-wise, and Apple announced the transition.

The last generation of PowerMacs had dual-core G5s. They still ran very hot and it did not change much in the end.

Last generation had quad core with water cooling I think. They were really trying to get everything out of them.

It was two dual-core 2.5GHz CPUs. I have one sitting under my desk right now. ;)

Pretty advanced for 2005. Four 64bit CPU cores, 16GB DDR2 RAM, and liquid cooled. It's still usable today 17 years later and it could work for 90% of what I do on the computer. It draws nearly 1000 watts under full load tho....


1000 watts, damn.

So what can’t it do? I’m guessing modern websites struggle on it


Yes, websites with heavy JavaScript really slow it down. The newest browser available for it is based on a very old version of Firefox.

https://github.com/classilla/tenfourfox/

YouTube is extremely slow, and it can only play back 360p or lower video smoothly. There is no hardware h.264 acceleration. “New” Reddit is also very slow. But old Reddit loads fine. Hacker News is very fast, loading as quickly as on a modern computer.


It was driven by power and heat for the portable market. Intel x86 laptops were the best at the time and PPC couldn’t compete especially with thin and light.

> My understanding is that IBM/Motorola's struggle with achieving volume is what doomed them, not a lack of innovation.

Before Apple announced their Intel transition, laptops were more than half of their Mac sales. Of their desktop sales, the iMac dominated over PowerMacs. So a majority of the systems they were selling had relatively tight thermal envelopes.

Neither IBM nor Motorola was willing (or able) to get computing power equivalent to x86 into those thermal envelopes. The G5 was a derivative of IBM's POWER chips they put in servers and workstations. They were largely unconcerned with thermals. Motorola saw the embedded space as more profitable and didn't want to invest in the G4 to make it more competitive.

Meanwhile Intel had introduced the Pentium III-derived Core series chips. Good thermals, high performance, multiple cores, and (with Core 2) 64-bit. It was better performance than Apple's G5 in the thermal envelope of the G4.

Neither IBM nor Motorola had general issues with production volume. Apple switching was all about the future direction of the architecture. There was no market for desktop PowerPC chips besides Apple. Neither IBM nor Motorola really wanted to compete directly with Intel, and both saw their fortunes in other segments.

So Apple went with Intel because they were making chips compatible with what Apple wanted to do with the Mac. The first Intel Macs ran circles around the PowerPC machines they replaced with no major sacrifices needed in thermals or battery life.

So Intel innovation in the 00s got Apple to switch to them and a lack thereof got them to switch away again.


> Weren't PowerPC chips pretty advanced compared to Intel

PowerPC had better floating point performance which was important for graphics and publishing workflows. Photoshop performance comparisons seemed to happen at every year's MacWorld during that period.

Unfortunately, IBM used Power as a workstation chip, and making a version of the chips for laptops was not on their radar. Of course, at the time, Pentium IV chips weren't known for running cool either. The more popular laptops got, the more this was a problem.

After Intel transitioned to the Core architecture, Apple transitioned to Intel so they could make laptops with a much better performance per watt than PowerPC offered.


People weren't buying laptops for everything during the PowerPC transition. They were buying desktops. No one doing "serious" work bought laptops in 1994. Not for coding, not for photo manipulation, or even gaming.

It wasn't until the 20-teens (2013 - 2015) that Macs for coding caught on. Apple transitioning to PowerPC made perfect sense for graphics workstations.


>People weren't buying laptops for everything during the PowerPC transition.

That was the period where it became obvious that laptops would overtake desktops and become the most popular form factor for computers.

Neither PowerPC nor the Pentium IV was a good fit for laptops, but once Intel transitioned from NetBurst to Core it was a new ball game.

Apple even transitioned back to 32 bit for it, since Core didn't offer 64 bit support until Core 2 shipped.


Most students in my engineering programs had Macs (2010-2015).

Right, but I didn't start seeing Mac laptops show up at work (software companies in the Midwest and Eastern U.S.) until 2014 when the MBP became a serious contender to Dells and HPs.

I had a Toshiba laptop in '95 for work, but it was a spare for when I was traveling. That pattern continued with Windows laptops as supplements for desktops in the office for the next 20 years. In 2015 my company went all MacBook Pro for everything, but they were trailing not trailblazing.

Students are always going to be the leading edge, because they have to be mobile; they get used to it, and then they bring it into the new workplace when they arrive. It's one of the benefits of hiring in new people.


The PPC processors may have been decent. But they were hamstrung by having to run an emulated 68K operating system, and Apple cheaped out by having slower buses.

Perhaps in terms of sheer raw compute but across every scale from data centre to IoT and wearables.

PPC processors in wearables and IoT devices?

This take totally misses the mark on the realities of the situation.

Intel made a bad bet on tech and wasn't able to shrink the node. TSMC made the right choice, and was therefore able to make better tech in the short term. Intel's designs are not directly related to that shortcoming.

TSMC makes way more chips than Intel. TSMC is therefore able to buy the fabs, equipment, engineers, etc. at a lower price per chip (since more chips). TSMC is therefore able to spend more on their fabs, and invest more in research. Intel can't keep up on the manufacturing side even if they can on the design side. The only way to justify the research costs and fab costs is to amortize them across more chips, which means they need to manufacture for more than just Intel. It's basically the AWS model - you can be your own best customer, but you can drive prices down for yourself with extra customers. Amazon didn't abandon the revolutionary 2-day shipping when they became a data center provider. Assuming Intel still has good designers left, they won't abandon their own chips.
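
To put some concrete (entirely made-up) numbers on the amortization argument — the fixed and marginal costs below are invented for illustration, not real Intel or TSMC figures:

    # Rough sketch: a leading-edge fab plus its process R&D is a huge fixed cost
    # that gets spread over every chip produced, so whoever ships more volume
    # pays less per chip for the same plant.
    FIXED_COST = 20_000_000_000   # hypothetical fab + process R&D, in dollars
    MARGINAL_COST = 40            # hypothetical per-chip manufacturing cost

    def cost_per_chip(volume):
        """Effective cost per chip once the fixed cost is amortized over `volume` chips."""
        return FIXED_COST / volume + MARGINAL_COST

    for chips in (100_000_000, 500_000_000, 1_000_000_000):
        print(f"{chips:>13,} chips -> ${cost_per_chip(chips):,.2f} per chip")

Same plant, same R&D bill; the outfit with 10x the volume ends up with a fraction of the per-chip cost, which is exactly why taking on outside customers helps.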


Multiple reports over the years indicate that it's not just a single bad bet that got Intel into this, but a corporate culture where engineers have less and less influence and MBAs more and more, leading to worse and worse tech. Similar to Boeing.

Yet Intel is still competitive with AMD, and now has an engineer in charge. Intel's problems feel pretty overstated; their products are still good, and they're launching new ones which will bring new competition to the GPU space.

I’m personally quite excited for what intel has planned and I work on M1 chips (opinions are my own). I think there’s a decent chance they’ll have a comeback in the next few years. Yes, AMD is doing awesome and ARM is bigger than ever. We’re definitely headed into a very interesting time for processors as consumers.

> The only way to justify the research costs and fab costs is to amortize them across more chips

Or make more money per chip, which is what Intel does: compared to, let's say, an AMD chip that TSMC manufactures, TSMC takes a cut of the chip for manufacturing and then AMD takes the rest, while Intel collects both portions.


In return it also has two sets of R&D to support and two sets of risks - architecture and manufacturing. If it falls behind on either of these it starts to lose.

TSMC for example can solely focus on manufacturing, assured that it will fill its fabs if it keeps pace.

Maybe Intel made super profits when x86 was the only game in town but that’s not the case any more.


>If it falls behind on either of these it starts to lose.

If TSMC falls behind in one of these they lose, and they don't have the other. Is that an advantage, as you seem to put it? If they make a wrong choice like Intel did for 10nm they're going to be immediately a non-entity with no 'other' business. Having two sets of money-making businesses puts Intel at a big advantage in terms of financing and owning their own platforms.

If TSMC falls behind a node all of their orders will disappear to whoever has a more advanced node. They don't have another business. Instead of two risks, they have one risk that's identical to Intel's, and their entire business depends on it. That's a lot less antifragile.

Intel has two sets of risks, and in exchange, on many, many fewer chips, they basically made the same amount of money last year, when they were behind on CPUs by almost every metric. That's resilient. People talking about the fall of Intel are talking about something that Intel is actively maneuvering ahead of. TSMC has no chip design risk but much lower per-chip profits in exchange.


Vertical integration is great if it generates synergies. It’s really bad if the tie into the internal customer hinders the development of each part of the business.

Intel is not remotely robust as it’s almost completely dependent on x86 and needs to catch up with TSMC. It lost smartphones in part because of x86. Now it’s fallen behind AMD because of manufacturing weaknesses. Hence a P/E ratio of 9 vs c20 for TSMC.

TSMC on the other hand has a huge variety of customers at a wide variety of nodes.


Intel never had smartphones to lose. Intel can generate all the same advantages as AMD by simply buying chips from TSMC if they want (and they already do for some chipsets), so there is no operational disadvantage. Intel is already mostly caught up to AMD and will be making significantly more per chip than AMD very soon.

TSMC is competing for something Intel doesn't want to sell. Intel, even when it was in the lead, wasn't fabbing its newest process for third parties. You're declaring TSMC the victor in a game Intel never played. And in a handful of years, if Intel gains back process advantages, you will likely still declare them the loser for not playing the cutting-edge-fab-for-other-companies game they don't want to play.

TSMC is not playing the same game Intel is, and in 2021, when by all accounts Intel was behind both TSMC and AMD, they still managed to make similar profit to TSMC and laugh at AMD's inability to buy enough chips to make anything close to competition for either Intel or Nvidia.

Now they're also getting into graphics cards and have largely caught up with AMD designs. Their future is bright.


The Intel graphics offerings have consistently underperformed the rest of the industry, ending up as low-end to midrange GPUs: https://wccftech.com/intel-arc-alchemist-a770-gaming-desktop...

Intel has consistently tried to build a top tier GPU and failed year after year. Expecting them to suddenly break away from their history is extremely optimistic.


They're just starting to get into the business. It will improve. They don't have to have the best cards, they just have to compete in some segments from the start. It's all upwards from here.

That's the thing about experience - you keep accumulating it.


“This take totally misses the mark on the realities of the situation.”

No, this comment misses the mark.

Intel’s largest issues are not economic or technological.

It’s the bloated bureaucracy that squanders the best and brightest money can buy.


>Intel’s failing will redefine the industry in many ways. ARM and AMD and other players are taking chunks out of them at the cutting edge

Failing? Have you looked at Intel's 12th Gen CPUs? This trope was valid until the 10th Gen 14nm++++ era from 2019, but you might have slept through the last couple of years. Intel has improved massively since then, starting with 11th Gen and Xe graphics.

Intel's 12th gen big-little tech really shook up the market and even AMD now is feeling the pressure.


Not to mention, they intend to compete with Apple on transistor density before 2024. Time will tell how successful they are, but I do get a laugh out of the people who are counting Intel out of the game right now. Apple doesn't sell CPUs, they sell Macs. They aren't even competing in the same market segment.

11th gen Intel chips were still 14nm, the top chip had fewer threads than the 10th gen because of the thermals, and Intel Xe was, iirc, only offered with the 11900/11900K (i.e. the top of the stack). Intel has had a stranglehold on integrated transcoders for a while but AMD’s integrated Vega cores (soon to be RDNA2) still wipe the floor with current integrated Xe offerings gaming-wise…

>11th gen Intel chips were still 14nm

Nope, 11th Gen was 10nm. You might be confusing it with 10th Gen which was a mix of 14 and 10 nm.


Maybe the mobile products were? Desktop was 14nm:

https://ark.intel.com/content/www/us/en/ark/products/212047/...


Intel adding big.LITTLE ten years after it appeared in Arm is an interesting development.

One could argue it took ten years for Intel to have enough competition from ARM to actually wake up and do something again.

I don't care, I got a 12th gen i7 with integrated graphics (in the weird time window and edge case where Intel was ahead of AMD again for a bit) which is super fast and was way better priced than Intel used to be.

Competition is good for consumers.


Agreed - I think it’s indicative of a less insular attitude which can only be positive.

Well, desktop and laptop PCs didn't have the extreme power constraints that mobile devices had.

So why are Intel using it now?

The goal posts moved since then.

Which goal posts? Competition from Apple?

No, consumer demand. It takes years to design, test, and prepare a new CPU architecture for manufacturing, so Intel had their big-little in the pipeline long before Apple came out with the M1, the same way it took Apple over 10 years of iterations to get to the M1.

The real question is what is AMD gonna respond with?


It’s a strange argument that customers didn’t want better battery life from their laptops until now.

All credit to them now but lagging 10 years behind Arm in having this is not impressive.


I think you're mixing up some things. I couldn't just buy an ARM chip and plug it into a desktop PC or laptop, and besides, the ARM chips, big-little or not, were terribly underpowered 10 years ago compared even to an Intel Celeron.

So calling it 10 year lagging because of a feature that had no relevance in the PC space back then is a misrepresentation.

Big-little made it to the PC market now since modern CPU cores are powerful enough that even low-performance ones can still run a desktop environment and browsers well enough without stutters. That was not always the case 10 years ago, so consumer demand was always optimized around maximizing raw performance.

So, the fact that ARM had this feature 10 years ago is largely irrelevant for this argument.

It's ARM's performance improvement at the top end in the last 10 years that changed the landscape for the PC industry to a degree, not big-little.


> Apple did this before when they switched to Intel from IBM/Motorola,

Apple + (Intel) "Core" (geddit)?


> M1 iPad Air is $499

$599, actually.


That's a lot of money to pay for a device that locks you into the App Store.

A lot of money for whom?

To be fair, the iPhone does as well and is way more expensive than that

No it's not. Even the multi-core era failed to do that with laptops. If you're going to redefine the industry you need a radical new idea, not just 'the same, but slightly faster, slimmer, and lighter.'

Whatever will redefine the industry will probably be laughed at and only adopted by nerds for a while. Like OS X back in the day. The only people interested in the first couple of versions were Unix nerds. The software everybody else needed was still on OS 9.


Yeah I was one of them.

Unfortunately the Unix part seems to be very underappreciated by Apple recently so I've already moved on again. I was an early adopter of macOS and have 'converted' many more mainstream users that still use it. But for me it's become too locked down.


Which OS/distro did you move onto?

FreeBSD with KDE

Nixpkgs on Fedora w/ i3wm

Windows + WSL for open source stuff and PopOS for work.

Isn't WSL2 still IPv4-only? That's why I uninstalled it years ago.

Yeah, but you can use a custom kernel and WireGuard to get around that. https://withinboredom.info/blog/2022/04/02/finally-getting-i...

Amen to that. At work one of my responsibilities includes maintaining bootstrapping scripts for MacOS so we can reliably develop on the platform. Getting things to "just work" the way they do on our deploy servers is an actual nightmare, especially once you toss Apple Silicon into the mix. Not only are we running different kernels, but different architectures; it's simply impossible to guarantee that something that runs locally will work fine on production, or vice-versa. I definitely do my development on Linux where possible.

What is it too locked down to accomplish? There are many knobs to unlock it.

Changing the sshd_config to only accept key authentication for example. Since the recent locking down of significant parts of the OS this keeps getting reverted to default.

But there are many more issues; I've gone into them before (I used to be a Mac admin) but I don't want to bring it all up again.


You can sort of sidestep the issue by supplying your own launchd plist for OpenSSH and disabling Apple's one, but it's a thorn in the side anyway — the fact that you even need to bother to sidestep the issue in the first place, while there are systems which go to great lengths to respect your changes to the configuration.

System level environment variables would be nice. It’s a pita to use a yubikey (arguably more secure than a plaintext key in .ssh) for ssh key storage. I remember having to start certain UI programs from a shell just to run them because they needed SSH abilities.

There has been a very nice trend for a number of projects to support a <config_file>.d directory to which local modifications can be added.

Current macOS (and Debian >=11) has a non-standard sshd_config modification that does "Include /etc/ssh/sshd_config.d/*". Although placement early in the config file means some things cannot be overridden.

Current "sudo" on macOS also supports this via "#includedir /private/etc/sudoers.d". (the # has been swapped for @ in upstream sudo).

This neatly sidesteps the need to diff / re-apply changes on a SW update.
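
As a concrete illustration of the drop-in approach (a sketch only — the file name here is invented, it has to run as root, and on macOS you are still relying on Apple's stock sshd_config keeping that Include line):

    # Write key-only-auth overrides into an sshd_config.d drop-in instead of
    # editing /etc/ssh/sshd_config directly, so an OS update that rewrites the
    # main file doesn't clobber the change. Restart sshd afterwards.
    import os

    DROP_IN = "/etc/ssh/sshd_config.d/100-local-overrides.conf"  # hypothetical name

    OVERRIDES = "\n".join([
        "# Local hardening: accept public-key authentication only",
        "PasswordAuthentication no",
        "ChallengeResponseAuthentication no",
        "PubkeyAuthentication yes",
    ]) + "\n"

    os.makedirs(os.path.dirname(DROP_IN), exist_ok=True)
    with open(DROP_IN, "w") as f:  # needs root
        f.write(OVERRIDES)

Since sshd takes the first value it sees for most keywords, a drop-in pulled in by an early Include wins over defaults further down and survives updates that rewrite the main file.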


This is not always an option, like you say it depends on the cooperation of the base config files (having the include and in the right place) and the tools used.

It won't work for all cases either. I just want to have the ability to make modifications and sign or bless them somehow with a system admin key. Root is not enough, for understandable reasons. What is possible is to modify offline (through recovery) and then 'bless' my changes. But this reverts after every reboot.

There should be a toolchain where I can make legitimate modifications in a secure manner to system files. Like every other OS has. There should be some kind of user key to sign modifications with. Apple has just ignored this whole toolchain and replaced it with a "just trust us" blanket.


There's also a lot of commercial service overhead, like Apple Music starting up on boot (and even being installed) and asking if you'd like to subscribe or whatever.

15 years ago Apple had people on stage bragging how there was only one version of OS X while Microsoft had countless versions of Windows (Vista Home, Vista Pro, Server, etc). I wonder if there should be a standard MacOS and a MacOS Pro that would be a relatively stripped-down Unix environment without all of the bloat that's been added on to MacOS recently…


Absolutely not. There should just be easy switches.

Out of curiosity, what do you mean by “too locked down”? What could you do on, say OS X 10.8 that you cannot do now?

I am still running and compiling the same open source software as I did 10 years ago and more besides. There have been a couple of rough transitions with the new security things, SIP, and whatnot. I disabled it for a couple of releases but now that’s not really a problem.


You have an increased number of hoops to jump through if you want your computer to be programmable.

At first, it was Gatekeeper. Yeah, appeared in 10.8. Then notarization. Now, on M1 you need to sign your binaries ad-hoc or they won't run. Custom kernel extensions? No way.

It's like slowly boiling the water in which you sit. Little things, but one by one they accumulate into quite a lump of red tape, and Apple seems to drive home the point that a developer is a different caste than the user is, and there's some sacred knowledge that you should be in possession of, and update it every year for $99, so that you can program your computer. All the while the user is supposed to be clueless and in awe from this technological marvel the caste of engineers is bestowing unto them.

Oh, and Apple wants to be a bottleneck in each and every process connected to programming their computers. They also remind you that the machines you pay mad bucks for, you don't really own.

I like the pure Unix approach more, when the line between using the computer and programming it doesn't really exist, and where the system is basically your IDE too, and where you're going from being a user to being a programmer and back without really noticing. Mind you, it doesn't mean you have to go from one to the other, but when you want/need to, it's damn frictionless, and the system is as malleable as you want it to be.


Everything you list makes the Mac more secure and more stable.

For instance, with system integrity protection, a bad browser installer can’t wreck your entire computer.

https://arstechnica.com/information-technology/2019/09/no-it...


I like to be in control precisely over how hardened I'd like my system to be.

If I wreck it, I know how to reinstall it and restore my backups, thank you very much.


And you can do that. Just turn it off.

I honestly think the 'lock down' is so overblown.

Yes there's 'more hoops' - but you go through the hoop once. Seriously, if you're running a dev machine turning off 3, maybe 4, things once and never touching them again is hardly the biggest hurdle.


It's near-impossible to brick an M1 machine. You can always reinstall from a second machine using Apple Configurator.

This is actually an experimental advantage over PCs, which you can brick if eg you erase the BIOS and backup BIOS.


The problem is, not everyone has a second machine. Not being able to install from USB or internet is really annoying from a support point of view.


Try looking at it as a solution, not a problem. In dire straits you can actually recover the machine, vs having no other option like GP noted.

Normally, and for normal users, the recovery mode is just a startup key combo away to re-image the machine.


You know, all it takes to recover a hosed system on x86 is a flash drive. Because the bootloader on those machines doesn’t have to be a specially made macOS partition with a slimmed down macOS on it (hah! Those people are calling Grub2 bloated!) which must live on the internal storage.

Moreover, on x86, even if the internal storage is hosed completely, I can boot the machine off USB/Thunderbolt and have it live for years more. Try that with a dead SSD in your new MacBook. Talk about the e-waste problem and how „heroically” Apple is fighting it, too.


You can't boot it if you restored the BIOS wrong - it's usually on a writable NVRAM. The M1's initial bootloader is on immutable storage.

I muck with the OS and above all the time, and I somehow can't remember ever needing to restore my BIOS. No. Never. Really.

But M1 Macs need the internal storage to work and be intact to boot even from external media. If the internal SSD on my Intel Mac or Dell XPS tablet (the soldered one, yep) dies, I boot from USB 3.1 and keep on keeping on. The M1 Mac is a brick after that, except the new Mac Studio where the SSDs are somewhat replaceable.


Understood, but I can't wrap my head around why they removed the internet recovery option. Until very recently I managed a large fleet of Macs and it already happened twice that a user managed to break their system so badly the builtin recovery wouldn't work. Neither had another system to hand to do the DFU thing either. Internet recovery as it existed on the Intel Macs would have saved them a trip to the office.

Different people have different annoyance levels with security restrictions. Personally, I'm all right if Apple's security model makes things I rarely do -- e.g., install privileged system extensions like Rogue Amoeba's Audio Capture Engine -- difficult but still possible. I understand why other people might make different choices.

Having said that, I do roll my eyes whenever I come across the phrase "walled garden" when applied to the Mac in particular, especially when people stridently insist that the Mac is just a year or two away from being locked down like iOS. (I've been hearing that prediction for over a decade, and find it less likely than ever in the era of Apple Silicon Macs.)


They have been warning about Apple requiring all Mac apps to come from the app store since 2011.

If apple wants to ban me, specifically, from running software on my M1 computer now, they can do so. If China or the US government says so, apple will probably comply. You are completely dependent on a network connection to apple to be able to run an M1 now.

If I want to make an app on my iPhone that I don't want to publish, I have to reinstall it every week, and can only install apps with network connections to apple, as apple gives my phone another 1 week permission slip to run code that I have written.

There are no more offline updates, no more offline app installs.

Also apple cares about privacy, except for privacy from apple. They transmit a shit ton of info all the time from their devices to the mothership and know effectively when and where you have been running apps on their computers constantly. They also do so unencrypted in some cases so anyone spying on the network can know too.

You are not the owner of an apple computer anymore, it's apple.

Ultimately in the end, if they really cared about giving their users ultimate ownership of their devices, they would. It would show up in the form of corporate MDM servers which make the ultimate certificate authority the corporate MDM server owner, and in personal cases you could launch and run your own or use Apple's.

Apple hasn't. They are game console computers and macOS is effectively legacy at this point compared to iOS.


How would Apple ban you from running apps on a Mac?

If the Mac were legacy, why are they spending so much effort to bring the whole lineup over to their own processors, specifically designed for them?


You should thank those people. They made enough noise to prevent what was and is surely apple's long term plan.

Yes, that long-term plan hasn't happened in the decade since people started predicting it with the introduction of the Mac App Store in 2011.

Any day now…


Still, it kinda is. You really have to go out of your way now to have full access to modify system files and even then you're not able to do just anything you want. Think of installing another OS on the SSD on Intel Macs with T2 chip, or choosing which iOS apps you want to run on M1.

So you have to be purposeful and know what you’re doing to potentially corrupt your Mac…the horror.

Have you ever thought that by making it hard for you to corrupt your Mac, it also makes it hard for malware?

Apple fully supports installing Windows on x86 Macs and there are plenty of guides on installing Linux on x86 and Mx Macs.


That logic is a bit circular, though, and not very convincing. Apple is known for being opinionated and stubborn about their long-term goals. If they really wanted to lock down MacOS, they’d just have done it, developers be damned.

Or, that's just part of their sales pitch. You know, like how politicians don't like to be seen as wishy-washy, it's very likely Apple responds to public opinion just as much as anyone else.

Apple made no further moves to “lock down” iOS to force people to use the App Store after 2006. If Apple listened to public opinion, the iOS App Store wouldn’t be the shit show it is today.

A lot of people look wistfully back on the good old days of futzing around with drama in their PC.

Time moves on. If you want computing to be an adventure, that’s what Linux is for.


I hazard to ask: have you tried Linux recently?

Apple wants to produce customer appliances, not entirely but mostly locked down in the name of security and smooth customer experience, and there seems to be a large market for these.


That was exactly my point above. They've moved on to a new market and the Unix power user market I'm part of is no longer in focus with Apple.

As somebody who writes code most days, and is constantly downloading and compiling others' code, your comment doesn't sound like we are on the same platform, even though I also use macOS. I wonder why our experiences are so different.

For example, I don't have a $99 developer certificate, and am not sure why I would want or need one.


Try and distribute binaries (not source) that other people can run (not compile) and you'll quickly find out why.

There's no overhead to ad-hoc signing. The linker does it by default and it's "ad-hoc" - it's literally a checksum, not a secure signature.

You might complain about MAP_JIT but that's pretty important for security.
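
And if you ever do need to (re)apply the ad-hoc signature yourself — say after patching a binary, or in a bootstrap/CI script — it's just a call to codesign; a sketch, with ./mytool standing in for whatever you built:

    # Replace any existing signature (e.g. the linker's) with a fresh ad-hoc one,
    # then print the signing info to confirm it took.
    import subprocess

    binary = "./mytool"  # placeholder path
    subprocess.run(["codesign", "--force", "--sign", "-", binary], check=True)   # "-" = ad-hoc identity
    subprocess.run(["codesign", "--display", "--verbose=2", binary], check=True)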


The developer/user caste split is not unique to Apple. That's just how every computer has been used ever since we stopped booting (micro)computers straight into ROM BASIC. The moment you have people using software they did not develop, you create the opportunity for malware; and once you have malware you have OS vendors trying to lock it away from you.

FWIW the biggest hoop an ordinary user would ever have to jump through on macOS is right-clicking an unsigned app bundle on first use to authorize it; which I've had to do on plenty of FOSS apps. This is not a problem, IMO - it is reasonable for the system to say "hey wait a second this app is new or different" and give you the opportunity to check twice before you jump. Code signing and notarization are things you only ever need to worry about if you want to distribute apps and not make your users right-click. Windows has similar problems with SmartScreen, except you can't even buy a code signing certificate to reliably make the problem go away, and Microsoft does an arguably better job at hiding the "no seriously this is just a FOSS app that hasn't jumped through all your hoops yet" option than macOS does.


The problem with how Apple is treating FOSS apps is that it's using scary messages like „this could be dangerous malware!” every time, all the time, if you want to distribute your software without paying Apple for what amounts to a protection racket.

Which leads to two outcomes: either you learn to right click everything and one day it’s going to be malware, too, or you „spctl --master-disable” and make the system turn a blind eye, and a drive-by zero-click kind of malware finds its way in.

Or, if you are not really familiar with intricacies of software distribution in Apple’s ecosystem, there’s a third outcome: you believe Apple and for you, all FOSS is dangerous malware.

Mind you, UAC nagging on Windows has the same problem, and sudo conditions you into just giving your computer password away left and right. I don’t know if a good solution exists, except maybe that you have to learn to not let your guard down in certain situations.


> You have an increased number of hoops to jump through if you want your computer to be programmable.

> At first, it was Gatekeeper. Yeah, appeared in 10.8. Then notarization. Now, on M1 you need to sign your binaries ad-hoc or they won't run.

You are confusing the local software development with the global software distribution.

Anyone is free to install development tools, compilers and toolchains, compile and run anything locally on an OS X, just like on any other UNIX system, including Linux. Gatekeeper and the notarisation enforcement, which are a default but can be neutered, address a common use case of «my mom/little sister/grandfather/etc has downloaded and installed a seemingly useful app from a dodgy website that has excreted a ransomware that now demands a payment to give their data back, or leaked all of their contacts/personal data, or slowed their computing device 100 fold down». OS X does give lay users a way to turn checks off, including the system integrity protection, and shoot themselves in the foot if that is what they want.

In fact, I would like to be able to selectively apply an even more restrictive security profile to any app, including the ones available through Apple's own app store. Who can tell what potential spyware / trackware a 500+ MB WhatsApp download from the official Apple app store contains within? The WhatsApp desktop client is a mediocre (feature-wise) messenger that is not even as feature-rich as the 85 MB Telegram desktop download is. And I, for one, don't have the capacity to reverse engineer and manually inspect every single WhatsApp download/update for my own peace of mind and sanity, and I would also presume that very few people even from around here can. Anything coming out of Facebook nowadays should be considered trackware and needs to be airgapped with the utmost diligence.

Even the business/enterprise software is ridden with all sorts of trackers. Citrix Workspace client, for example, installs three "telemetry" helper processes that phone home non-stop.

> Custom kernel extensions? No way.

Yes way. They just have to be signed for the same reasons as third-party apps. See https://support.apple.com/en-au/guide/security/sec8e454101b/... for details. I was recently scrubbing the file system of some legacy components and noticed, with a lot of bewilderment, what extra 'useful' USB drivers a 4G MiFi device I had purchased over a decade ago installed at the first connection attempt. I, for one, don't need those 'niceties' to sneakily creep into any of my file systems or lurk around.

In fact, OS X has become more GNU Hurd-like since 10.15 by pushing kernel extensions into user space instead, which reduces the chances of the kernel crashing due to a bug in a random extension. More kernel stability and security? I say yes to both, albeit your mileage may vary.


The thing is, you cannot harden it even more. If Apple Almighty hasn’t thought about your use case, it may as well not exist. There are two modes: either Apple nannying your systems like the overly attached yiddishe mame, or a mad ex breaking up with you, but also ripping out the locks from your doors before they go.

Which is my freaking point exactly: you are being constantly reminded that despite having paid mad bucks for the hardware, Apple still owns it, snooping over your shoulder, giving you slaps on the hand if you dare „misuse” what they gave you.


> The thing is, you cannot harden it even more.

Yes, you can. You seem to have never used Trusted Solaris, which implemented the B1 TCSEC level and ran on certified hardware where all system objects are labelled and MAC is enforced at the kernel level. That is an example of a much more locked down system, which is what you are conjuring OS X up to be, but OS X is nowhere near being that locked down or controlled.

The real trouble with system hardening, however, arises at the network level where the traffic is fully encrypted, and it is impossible to tell whether a network connection that, say, WhatsApp has established to 'external.fbsb1-1.fna.fbcdn.net' or to 'scontent.fams1-2.fna.fbcdn.net' is a legit one, or whether it is used for tracking the user. It is impossible to harden that further, and that is an unfortunate side effect of the use of encryption.

App sandbox profiles can still be hardened further by anyone, not just by Apple at a whim; see https://reverse.put.as/wp-content/uploads/2011/09/Apple-Sand... and https://paolozaino.wordpress.com/2015/10/20/maximum-security... for details.

Otherwise, since OS X is a UNIX implementation, it can be scrubbed of undesired features, if need be:

- SIP can be disabled.

- Gatekeeper can be deactivated peacefully (or via brute force by setting appropriate host entries to 0.0.0.0 in /etc/hosts).

- Services deemed illegitimate or a nuisance can be scrubbed off via /System/Library/Frameworks/CoreServices.framework/Frameworks/LaunchServices.framework/Support/lsregister.

- The rest can be unloaded and disabled via the routine application of launchctl.

- Security policies can be amended via fiddling with spctl.

Yes, all of the above is tedious, and it will break all vertically integrated Apple services and official app store updates, but it will give one a sufficiently 'clean' OS X install. If that is not enough, one can install Linux, NetBSD or OpenBSD to run whatever they please.

> If Apple Almighty hasn’t thought about your use case, it may as well not exist. There are two modes: either Apple nannying your systems like the overly attached yiddishe mame, or a mad ex breaking up with you, but also ripping out the locks from your doors before they go.

> […] Apple still owns it, snooping over your shoulder, giving you slaps on the hand if you dare „misuse” what they gave you.

You are presenting an unsubstantiated strong personal opinion, or vague, generic hand-waving, as a collection of facts, when none of that actually exists. Apple is certainly not:

- Babysitting my own or my colleagues' OS installs; nor

- Does it slap me on the wrist for running anything I want out of my own moderately wild zoo; nor

- Does it care about the network traffic that passes through.

And Apple certainly does not own the laptop I have purchased from them; you might have a valid case with iPhones, iPads or Apple TV's but personal computing devices produced by Apple are not an example of such an instance.

Lastly, what might be a bug or a misfeature for you is a feature or a conscious compromise (that one has willingly accepted) for some.


This is the point, yes. You can turn most of it off, but then even regular software upgrades break before you even do any modifications. Running an outdated system is not viable in this day and age.

Part of the reasons for locking down is 'security' (though a lot of it is so easy to bypass through Apple's own backdoors that it seems to be more form than function). Which makes some sense from the user's point of view. Part of the reasons is DRM for Apple's services. Apple TV, Music, App store, iOS apps. I don't care about that but there is not much I can do about it on their hardware. Soon we'll have a new category Apple will want to lock down the system for: the CSAM detection. Of course Apple will be hell-bent on preventing people trying to tamper with that. It will come with its own host of protections, which I bet will even be active when SIP is off.

Right now there are already a bunch of folders which the root user can't even read. Try running a "find / 1>/dev/null" as root and you'll get a nice list of folders Apple considers too sensitive for the user even to look at.

The problem is not security. The problem is Apple enforces security without the user being able to override their choices. There is no way to make modifications to system files and "bless" them so it will still boot after the next update. You have to do it every time. There's no UI for such features at all. Apple forgets that I'm the owner and admin of my system and doesn't give me the keys to manage it.

It's fine for their new targeted audience which is mainstream users with a large disposable income. It's not great for power users with a unix background like myself. I lament that macOS has moved away from supporting my usecase because I did enjoy it for many years.

But now I'm happier with FreeBSD with KDE. I really enjoy being able to trace a problem right down to its cause again. I love all the customisation options again.


> At first, it was Gatekeeper. Yeah, appeared in 10.8. Then notarization. Now, on M1 you need to sign your binaries ad-hoc or they won't run. Custom kernel extensions? No way.

I don’t know, from my experience just building and running works fine. Ad-hoc signing is something the toolchain does. I don’t write kernel extensions so I cannot really comment on those (though I have a couple of them installed as a user and there was no significant hassle). The worst annoyance I’ve seen is having to authenticate every now and then to attach a debugger to a process.

> there's some sacred knowledge that you should be in possession of, and update it every year for $99, so that you can program your computer.

Programming your own computer has no red tape; the difficulties start when you want to distribute binaries, if you don’t want your users to have to right-click once. You can get compilers and things from Xcode, Homebrew or MacPorts and they work without you having to pay anything.

> I like the pure Unix approach more, when the line between using the computer and programming it doesn't really exist, and where the system is basically your IDE too, and where you're going from being a user to being a programmer and back without really noticing. Mind you, it doesn't mean you have to go from one to the other, but when you want/need to, it's damn frictionless, and the system is as malleable as you want it to be.

Yes, it’s nice. But my use of macOS and Linux is not very different in that regard (true, I spent quite a bit of time customising xfce, which was fun). Also, to be a bit realistic, it does not really work with end users in general. For a mass-market OS, the fewer footguns the better.


For NeXT the UNIX compatibility was a means to compete in the workstation market, nothing more.

"Why We Have to Make UNIX Invisible."

https://www.usenix.org/blog/vault-steve-jobs-keynotes-1987-u...


I stopped using Apple products when they started spying on everything I did. Do I really want Apple recording the name and hash of every program I run? Why ask permission and click "agree" to write code for my own use? Checked right out of that ecosystem and I won't be going back.

Windows 11 with WSL2. Sometimes I forget I'm not in KDE.

You probably thought the original iPhone was pretty stupid, too, I would guess.

Didn’t everyone recognize the first iPhone as a revolution? I mean, the entire Android team took a day off, knowing they had failed.

No, there was widespread derision. The idea that people would accept a phone without a physical keyboard was nothing short of heretical in business circles.

Steve Ballmer famously went on one of the popular morning TV news shows and laughed at the iPhone. The fact that he still had a job when he got back to Redmond explains a lot about Microsoft's stagnation under his leadership, and its subsequent return to a successful path once he was gone.


My memory is that MS laughed because they did not believe the hype. The laughing stopped when they got the first iPhones in house and were able to see how much space Apple was able to dedicate to the battery.

Not really. I had a fairly recent Treo at the time. I certainly didn't buy an iPhone when it first came out. Come the 3GS I was definitely ready to go with Apple but it wasn't an instant switch.

Of course, I was also not an Apple customer at the time except for an iPod sometime around that time.


I was also a die hard Treo user at the time but as soon as I saw the iPhone it was obvious this was the future and I got in line on release day to get one and never looked back.

When the original iPhone came out, it couldn’t run apps and didn’t have GPS. Capabilities that my Blackberry and even feature phones had.

The iPhone wasn’t really good until the iPhone 4.


My first iPhone was the 3GS and I was wowed by it, having used internet-connected pagers and then phones since ‘97 (if you count SMTP with my various procmail filters as internet-connected). I was totally blown away. My memory of watching a Lego pirates YouTube movie in my living room with nary a cord or keyboard still has a bit of awe attached to it.

Don’t get me wrong. I bought an iPod Touch the first gen as soon as iOS 2 was released in 2008. I also bought an iPod Touch 4th gen when I was trying to avoid AT&T. I finally bought an iPhone 4 on Verizon.

The original iPhone /was/ stupid until they relented and decided to allow 3rd party apps with the iOS 2 update.

You say that, but wasn't the original vision the one that was adopted by Google since? Everything in the browser?

I didn't say it wasn't. I'm pointing out the parallel that some people had the vision to see that the iphone would _someday_ be game changing. Others felt smugly superior because they were blind to a future that was obviously very possible in retrospect.

The radical bit is they deliver holistic software and hardware that actually works.

No other vendor comes close.


I find the Google Pixel line fits that description pretty well, but very differently. I prefer the software over iOS, and hardware is good enough, although not nearly on the same level as an iPhone.

> Whatever will redefine the industry will probably be laughed at and only adopted by nerds for a while.

I suspect that virtual reality and augmented reality are in this classic position. Laughed at, but totally adored by a group of nerds. As someone in that group of nerds, the feeling is so much like the early days of the web -- it really is incredibly exciting. It amazes me that anyone can be pessimistic about it, especially given the astounding progress and investment being made. In my view, it's deeply obvious that augmented reality is the next paradigm shift we're all waiting for.


Even if the GP claim was true, it does not follow that every thing that is being "laughed at and only adopted by nerds" is the next big thing to redefine the industry.

Augmented reality has some nice properties in a number of contexts. In others, it's totally irrelevant. Whether someone will come out with a product that will make it irreplaceable is a different matter.

It's like touch keyboards. Sure, they are better than phone keypad-style keyboards for entering text, but actual physical input still performs better. They are a different paradigm that allows device manufacturers to provide a more often desired feature (larger screens), but while they are winning over in places where they do not fit (like in-car infotainment systems), almost everybody prefers physical knobs for a number of features (like climate control). This will likely lead to a reversal of the trend.


I didn't say that every thing being laughed at is the next big thing to redefine the industry. I only predicted that augmented reality really is one of those things. Further, the person I was replying to claimed that the next big thing to redefine the industry is something that will be laughed at. Neither party claimed that everything being laughed at will redefine the industry.

The statement that augmented reality is "like touch keyboards" is a statement that you haven't given justification for. Augmented reality is not a technology centered around replacing physical input with non-physical input, and so the analogy is not self-evident. You might potentially have a good point, but you'd have to explain how this analogy is relevant to the technology of placing graphics and information at any position in a person's 3D visual space. Remember, most augmented reality objects don't need to be controlled at all to be useful, and you can still control an augmented reality screen with a mouse and keyboard.

I'm interested to hear more detail to your thoughts on why you think augmented reality won't be a paradigm shift. I'm also happy to elaborate on why I think it will, if you like.


The biggest issue with AR imo (or rather, why it doesn't appeal to me as a universal paradigm) is that it requires a device in your field of view to "augment" the reality.

Sure, a HUD in a car with AR is nice. A pair of binoculars or a telescope with AR is nice. Even a camera works. As long as the device is already "there", AR is a nice improvement (look through a telescope and it identifies stars or constellations for you) — provided it can be turned off for when you want to enjoy simple reality.

But otherwise, it's an unnecessary gimmick that you won't bother to use, simply because it's not universal (you won't get a pair of AR sunglasses for the day, and see-through glasses for the night, esp if you are not wearing glasses otherwise).

With a lack of universality, I don't see it as a "paradigm shift".

And there are orthogonal concerns like privacy. As the processing speed improves, privacy concerns will diminish (as you can have in-device processing), but just like with high resolution screens, this requires a lot of processing (visual data takes a lot of bandwidth, and while it's quadratic growth for 2D data, it's even more for stereoscopic or 3D data), that we are progressing towards very slowly.
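
Some rough numbers on the bandwidth point — the resolutions, frame rates and 3 bytes per pixel below are just assumptions for illustration:

    # Back-of-the-envelope: uncompressed frame bandwidth grows with the square of
    # linear resolution, and stereoscopic rendering roughly doubles it again.
    def gbytes_per_second(width, height, fps, eyes=1, bytes_per_pixel=3):
        return width * height * bytes_per_pixel * fps * eyes / 1e9

    print("1080p  @ 60 fps, flat  :", round(gbytes_per_second(1920, 1080, 60), 2), "GB/s")
    print("4K     @ 60 fps, flat  :", round(gbytes_per_second(3840, 2160, 60), 2), "GB/s")
    print("2K/eye @ 90 fps, stereo:", round(gbytes_per_second(2048, 2048, 90, eyes=2), 2), "GB/s")

Compression helps a lot in practice, but every one of those pixels still has to be rendered and moved, which is why on-device processing for AR at high resolution is such a slow climb.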


The M1 iPad Air is actually $599 for the entry-level 64 GB, which nowadays isn’t that much storage. If you go for the 256 GB (there’s no 128 GB, for profit-maximization reasons) and add the Pencil plus maybe the missing charger, then you’re already at over $900.

What is the average person actually storing on their iPad, though? Photos taken with the iPad, maybe?

I think for most people it's just an expensive, nicer Chromebook. Everything they want to consume needs an internet connection anyway.

If you're a creative or just a nerd, then sure, you'll need to spend more money to get the specs you need.


Or photos/videos taken with the iPhone. The iPhone now starts at 128 GB, so it’s surprising they still start the iPad Air at 64 GB.

Downloaded videos for long flights! You have to get a storage bump to actually get a variety of content downloaded.

Also local copies of cloud storage are very valuable to keep on iPad.


One thing I've really appreciated on flights is the addition of streaming videos to your own device. If it saves fuel and maintenance costs, I'm perfectly fine with them ripping all the personal TVs out of the plane as long as they can keep a selection of movies available in case I forget to download my own and don't feel like reading.

I think you missed the “average person” bit.

The average user doesn't know what local copies are.


Why wouldn’t they? Most of the popular streaming apps have download functionality.

Because they are busy talking to their mice.

Might want to check your dosage on the "omg no bundled charger omg omg the world is ending" memedrugs.

The iPad Air ships with a USB-C charger. A nice one.


Thanks for the heads up, can’t edit the comment anymore.

Is it less hot when writing?

I have the original pro and I won't buy another iPad again until it doesn't get so hot when taking notes.


I’ve had multiple iPads (though, admittedly, not the original Pro) and none of them has had a perceptible change in temperature when writing.

They've pulled off major hardware transitions twice before without hiccups. But this time they make the whole platform which is quite impressive imo.

Gosh, I had to think back a moment to remember 68k to PPC. I wonder if that transition could be considered “botched” in that it happened at all instead of going directly to x86. Outside hindsight, I recall it was considered a questionable choice at the time.

It wasn't really though. Motorola was going nowhere. As to why not x86, that's another story but Intel has gone down the wrong path several times. Like with the Pentium 4.

Not really, PPC was a reasonable choice for a high-performance architecture back then, and arguably a better fit for former 68k coders than x86. And a move was necessary because the 68k was becoming a dead platform by then.

My recollection is that the PPC at launch was much faster than x86. Jobs talked about the road map a bit, and there was a lot of press about it too, but the road map didn’t pan out, and their partners dropped the ball. And many other companies made the transition to x86 (Data General was one I worked with) and subsequently died.

The biggest problem was that Motorola/Freescale saw their future in embedded, not the (relatively) high performance required for personal computing. So the chip provider for Apple's mobile lineup was no longer producing performant chips. Unfortunately, IBM's implementation of PPC leading a dual life as a server/workstation and desktop chip meant that getting its power consumption, heat profile, and performance optimized for a mobile device was an extremely difficult proposition.

It would be interesting to see where we'd be if IBM had assured Apple they could deliver a G5 laptop and had done so at a price and spec competitive with, or superior to, Intel.


> The biggest problem was Motorola/Freescale saw that their future was in embedded, not the (relative) high performance required for personal computing.

Ironically that's also the niche where PowerPC ended up when Apple dumped them :)


Yeah. The 601 was on par with the P54s and the 604/604e were a bit ahead of the Pentium Pro and Pentium II of the era. The G4 vs P3 and P4 is when the problems started for Motorola, and IBM eventually ran out of steam with the G5.

The striking thing about this time is that it is essentially transparent to most people. Probably more third-party programs have been broken by security changes that Apple has made over the past five years than have been broken by the M1 transition. Yes, there are performance implications, but the M1 is sufficiently fast and the most important performance-sensitive programs are being ported quickly enough that it doesn't matter that much. (And, for most people, ultimate performance is mostly not a big deal on laptops these days.)

Transitive's tech, combined with processors that can afford some inefficiency, is pretty much magic to anyone who remembers what ISA transitions used to look like. (As a hardware product manager, I lived through a couple of them including a Motorola 88K to x86 one on Unix.)


I feel like more software died during the 32->64 bit transition than any other time in recent history. Lifting the rug is also par for the course for Apple, even artificially so with their latest crackdown on apps that don’t run well.

That mostly seemed to be video games (at least their developers complained the most), but I just had to wonder why they couldn't make their games 64-bit-ready with 15 years of warning.

Btw, 32-bit Windows binaries still run through WINE, just not 32-bit Mac binaries.


Games just don’t have a long enough tail to make this kind of effort worth it. They aren’t like business software where you try to keep people on the hook for years.

There’s little reason to keep updating a game past its first few post release patches. So you can assume most of their code bases have been untouched for years, were probably written by people who are no longer around, and haven’t been compilable by modern toolchains for a long time.

The tech debt is just insurmountable most of the time.


> but I just had to wonder why they couldn't make their games 64-bit-ready with 15 years of warning.

First off it wasn't just games. You can't run the old Microsoft Word anymore, for example. Why is this important? The old Word did everything I wanted. The new Word has fewer features than the old one. Also, I already owned the old one. Now I have to buy it again even though the old one was just fine.

For the games, I want to play my 20 year old games. Now I can't unless I fire up an old computer that is not upgraded and not on the internet (because it's horribly insecure). The companies that made those games don't exist anymore. And even if they did, it wouldn't be worth it for them to compile them again.

And on the iPhone I had a ton of awesome kids games. Those companies are also out of business, so again, I can't have the kids play those games unless I use an old insecure phone.

This is one area where Windows shines. For the most part you can still run 40 year old software on Windows.


The 68K to PPC transition was pretty good. But the operating system was running emulated 68K code for five years.

That was more or less intentional. The 68K interpreter was quite fast and 68K asm is smaller in memory than PPC asm, so converting all the cold code over would've made the system slower.

I'd like to see the same idea used today. One I can think of is C++ exception unwinding which uses a DWARF interpreter instead of generating tons of code to do it.
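As a toy sketch of that table-driven idea (this is not how DWARF CFI or Apple's 68K emulator actually worked, and all the names here are made up): the rarely-taken path is described by a few bytes of data that get interpreted on demand, instead of being expanded into native code that occupies memory whether or not it ever runs.

    import io, threading

    # Compact per-"function" cleanup tables, interpreted only when the rare
    # path is taken -- a few bytes of data instead of generated code.
    CLOSE_FILE, FREE_BUFFER, RELEASE_LOCK = range(3)

    CLEANUP_TABLES = {
        "parse_config": [CLOSE_FILE, FREE_BUFFER],
        "update_index": [RELEASE_LOCK, FREE_BUFFER],
    }

    def run_cleanup(func_name, resources):
        # Walk the table and perform each action.
        for action in CLEANUP_TABLES[func_name]:
            if action == CLOSE_FILE:
                resources["file"].close()
            elif action == FREE_BUFFER:
                resources["buffer"] = None
            elif action == RELEASE_LOCK:
                resources["lock"].release()

    resources = {"file": io.StringIO("x"), "buffer": bytearray(16),
                 "lock": threading.Lock()}
    resources["lock"].acquire()
    run_cleanup("update_index", resources)  # releases the lock, drops the buffer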


You’re looking through rose colored glasses. My PowerMac 6100/60 ran 68K apps slower than my LCII with a 68030-40Mhz accelerator. The original 1st gen 68K emulator was really bad.

Connectix became famous for shipping SpeedDoubler, a much better emulator. But my 6100/60 could still barely keep up with the 68030-40Mhz.

The emulator performed even worse on the PPC 603.


That could be; I think it was explained to me by a former engineer on Mac OS 8 or so. But there was still 68K asm in there up until Carbon.

It’s always interesting to me how Apple gets so much praise for taking things away, almost always needlessly with some made up excuse meant to sell more of something, only to later bring them back as if they’re some oracle of utility. Big surprise Apple, non-arbitrarily thin computers, usable keyboards, and ports are useful.

The M1 chips are nice. But Apple also continually throws developers under the bus forcing them along their deprecation strategies.


I used to feel like you do but nowadays I applaud that (while still not really using Apple products much due to the walled garden ecosystem) because it's really hard to take something away when you know it's good but could be better. For all the things that they've "taken away" over the last 10 years, it feels like the newer laptops are significantly better than they could have been if they just kept on adding and making incremental improvements like most other manufacturers.

No kidding. I bought a reasonably high end Thinkpad for myself and the entry level M1 MacBook Air for my son to do his schoolwork on. The performance, battery life and screen are much better on the M1 despite being significantly less expensive.

Hard disagree. At least from a business perspective. The transition to Apple silicon was likely more about long-term cost savings and their ruthless vertical integration. Apple appeals to customers (most customers, HN users are outliers) as a luxury brand and their market dominance is based on their brand more than their products. The products need to look distinct with high quality finish. Like Louis Vuitton bags or Polo shirts. Otherwise they'd be back to occupying the niche of the laptop for graphic designers.

One thing that is particularly under-appreciated is that Apple have pulled off a textbook disruption play against Intel; it's supposed to be a small scrappy new company that disrupts the slow behemoth incumbent, but in this case one of the biggest companies in the world has just executed it perfectly.

In this case of course having multiple products to graduate their own silicon through, and enough cash to design and build a chip, require you to be a huge company. But it shows strategic excellence and a long-term planning horizon to pull this off.

(Note I’m using the original strict definition of Disruption from Christensen, not the hand-wavey TechCrunch slogan.)


was intel really "disrupted"? The M1 designs aren't available as general CPUs for alternative uses other than on a mac.

They've taken a hit to their revenue by losing those sales. There's also the reputation damage of losing a prestige customer like Apple.

I work for a Fortune 500 company. About 70% of the employees have a MacBook Pro. Until recently, all of them had an Intel chip inside. Going forward they will have an M1. We are on a 3 year refresh cycle. So within 3 years the majority of company computers will be running M1. About 90% of company phones are iOS. If Apple starts using M1 in those…

iPhones have been running Apple silicon ARM for a decade. The M1 chip is quite similar to the chip used in the 2020 iPad Pro. It's done.

I also work for a Fortune 500 company. Unless the employees are doing anything related to the Apple ecosystem, they will be carrying Thinkpads instead, or using classical PC desktops, with external contractors having virtual workstations via the various cloud offerings.

Overall, Apple's desktop market share across the world is still around 12%.


Yeah. At tech companies and tech conferences, you get a pretty distorted view of the sorts of computers most people are using. Especially taking into account the fact that you probably also see a disproportionate number of people running Linux on Thinkpads in a lot of places, one might assume that Windows is barely used by looking at laptops at the typical event I attend.

I don’t work for a tech company. I work for a major sporting goods company.

I'm sure there are exceptions. Nonetheless, something like 80% of the PC market overall is Windows.

It's not really Windows market share that matters - it's the instruction set. The x86-compatible instruction set is dominant today.

If Apple really wants to disrupt Intel (and I guess, by collateral damage, AMD), they will have to release the M1 CPU as a commodity. But they will also have to figure out how to get Microsoft to play ball (which I am not sure they will).


Apple is basically already using the M1 in iOS devices. The A14 is to an M1 what an M1 is to an M1 Pro.

I remember hearing that Steve Jobs had originally asked Intel to develop a CPU for the iPhone (Apple had a close relationship with Intel back in the days when they switched from PowerPC). I remember Intel being on stage at Apple keynotes, and Apple also got "first dibs" on a lot of new Intel CPUs back then. But Intel didn't believe it would be profitable enough for them to pursue.

Apple had been screwed before back when IBM wouldn't make them a power efficient G5 chip for laptops. Then intel wouldn't make them a power efficient phone CPU. So here we are today, Apple does it themselves. Had intel made a phone CPU for the iPhone, the M1 might not have ever existed.


Yeah, Intel had the opportunity to avoid this outcome, and they fumbled on the mobile sector. Ben Thompson gives a deep analysis of the history here: https://stratechery.com/2021/intel-problems/

With the absurd efficiency gains Apple got, I imagine most premium Windows/Linux laptops will run ARM in 5 years.

For that, they'd first have to sell some. Outside of Macbooks, I'm not aware of a single premium Linux/Windows Laptop that runs on ARM. Until HP, Dell and Lenovo offer those, there won't be any uptick in ARM laptops outside of Macbooks. And most companies won't buy the first model running an ARM processor, they'll first want to see how driver support and virtualization of x86 apps (of which there are a lot in larger companies) work in reality.

The vast bulk of Windows laptops that go out the door go to companies that want them to be as boring as possible. This is probably the primary reason Linux desktops never made mainstream headway. Pretty much anything that increases support costs is largely a non-starter.

> was intel really "disrupted"?

Yes, Intel was disrupted. It was just disrupted by ARM. Originally the chips are too slow for normal use, but find a niche in low power devices. Over time the rough edges are smoothed out, and the underdog can make inroads in the high end market while the incumbent isn't structured to compete in the low end very well, and their attempts fail.

> M1 designs aren't available as general CPUs for alternative uses other than on a mac.

M1 is just the capstone of that long process. This is sort of a wrinkle, but Apple's strategy means they can build a high margin product ("We don't strive to make the most popular product, we strive to make the best.") and not have to hand those margins over to suppliers. Given the high margins M1 chips command when placed in Macs, it doesn't seem likely to pressure Intel.

But make no mistake, Intel's margins are strongest on servers. Prices on Xeons are like 10x those for Cores. This is where the disruption is happening. Running macOS on M1 in AWS is neat but is probably for running build and test farms; Graviton is presumably the ARM chipset AWS customers might ditch Intel for. I've met teams that saved substantial money by switching, and that has to eat into demand for Xeons at some point.

The typical way a firm might survive an "attack from the bottom" is to chase after the high value compute. In 2022 that's AI / Tensorflow, where Nvidia rules and frankly, Intel underperforms. Hopefully Nvidia pays Mark Harris well because they probably owe him a few billion.


> Intel's margins are strongest on servers

Not anymore.

Intel CCG is at 9.3b revenue/2.8b profit this quarter (30% operational margin). Intel DCG is at 6b revenue/1.7b profit this quarter (28% operational margin).


Intel was "disrupted" by their own doing when they missed the bus on the mobile revolution and the GPU revolution. And the M1 is very much a niche thing. It's an SoC, not a CPU, which is what Intel mainly produces, so comparisons of M1 and Intel processors are bound to be somewhat flawed.

Definitely not the case that the full lifecycle of disruption is complete, since that entails the full replacement of the incumbent. But with the M1, Apple silicon surpassed Intel in performance after many years of inferior performance that was improving at a greater rate, which I think is one of the key milestones in a disruption story.

I’m not sure if there are any case studies of one behemoth fully disrupting another? So who knows what the end state will look like.

One confounder for predicting the endgame is that due to their vertical integration with their OS, Apple won’t replace Intel where Windows is a hard requirement, and so they probably won’t completely decimate Intel. I suppose in this aspect it’s not a full disruption.

I’m not really clear how much inertia there is behind Windows these days, now that MS is committed to Office on Mac. Definitely substantial, but if the price-per-performance was 2x or better for macs in a couple generations, what % of corporate buyers would switch over?


Apple has pulled off several big hardware transitions: 68k -> PPC -> x86/x64 -> ARM. One reason those transitions are considered successful is that they did a masterful job of managing expectations. Apple showed that the consumer hardware market really doesn't care enough about backward compatibility to affect the bottom line. And thanks to Moore's law, supposing you can legally acquire a ROM, your M1 Mac will happily run 68k and PPC code in addition to the officially supported x86/x64 emulation.

The lack of hiccups is mostly by training the user base to assume that things will just stop working eventually. It's part of the deal with apple; if you're a legacy software enthusiast Apple probably alienated you long before the M1.

> They pulled off a major hardware transition without big hiccups

This is easy to do when you eschew any and all commitment to backwards compatibility. Every major OS update has introduced breaking changes and I gotta be honest, everyone except the most diehard of Mac fans is getting pretty sick and tired of routinely re-purchasing software, not to mention Apple breaking their own applications. Example: obsoleting iTunes by force and coding hooks so it refuses to run without patching the app. (For those wondering, Apple coded a version check into the Mac version of iTunes so that it refuses to launch if the OS is "too new.") Ignore the fact that Apple Music is a shitty replacement that didn't replace all the features of iTunes (like removing the XML library sync, which a ton of third-party apps relied on, breaking them all in one move), but don't let that stand in the way of progress.

Show me one person that has never been affected by Apple breaking apps, and I'll show you ten people with Avid or Pro Tools rigs who "never update the OS" as a general policy. It's 2022 and I'm still waiting for official EOL/support dates published by Apple. Everyone from Microsoft to FreeBSD does this. Saying "well they traditionally support N-2" or whatever doesn't cut it. People need hard dates so they can plan adequately. Apple's commitment in this area is like depending on your stoner friend to follow through on a promise.


I rarely see this sentiment since the entire Tech class has moved to Spotify subscription streaming as their music platform of choice (slight hyperbole, only slight). But as one of the 12 remaining iTunes app users out there, I am shocked at how terrible the software has become.

The 2005 version of iTunes that ran on my PowerPC macbook was strictly better than the app that I have installed today on my Windows 10 gaming PC. The 2008 iPhone iTunes app was better than the "Music" app on my iPhone today -- when I open my phone in airplane mode on a flight to listen to music, it bricks itself for 10 seconds trying to get wifi connection and prompt me to pay them for their crappy music streaming service. There is no way to disable that self-bricking advertisement.

I suspect that the average Apple employee uses Spotify to listen to music and doesn't have a large personal collection of songs not available for streaming on common platforms. The lack of dogfooding shows through.


Funny you mention that. I recently accepted their offer for a free 3 months, mainly out of exasperation of constantly seeing the interstitial every time I launched the app.

The UX of Apple Music is downright horrible. It proceeded to completely hose my existing music library. I could no longer access songs I had purchased in the iCloud Library, and it threw up a dialog stating I need to sync every Apple device I own with Apple Music to be able to stream music from their service. I was on a trip at the time, so good luck with that. Tons of embarrassing UI glitches, like widgets overlaid on top of others rendering them unclickable. Did Apple fire all their QA staff?


As another point of anecdata: I still use iTunes 10.6.3 on Mac OS 10.13 for this reason.

It's also modded with coloured icons. I still use an iPhone 4S with USB sync, and iPod 5.5G video.

The laptop (2014 Retina 15") has a 4 TB Sabrent SSD upgrade inside, using an M.2 adaptor. The iPod has a 128GB Kingspec SSD.

It's an intentional choice to lag behind, which will probably happen until the Digital Hub model makes a comeback. Privacy is easy to control when it syncs over USB.

I actually downgraded the library from iTunes 10.7 to 10.6.3 so that I might be able to use it on an OQO Model 2, Mac OS 10.5 Hackintosh, or PowerPC. For now though, I still just keep it going on the Retina: the beautiful screen, weight/ports balance, and repairability still make that the best model of Mac IMHO.

When Apple brings replaceable SSDs to the M1 though, I may well consider leaping forward. Ideally with a 3.5" phone again too.


Lack of dogfooding is part of it, but I have to wonder how many product managers they've brought in to convert their player to a cloud music service?

Changes like requiring you to sync all your devices with Apple Music are part of many changes to try and make the transition. It's a big one because a player is very different from a service (historically something Apple have been bad at), but they've had a lot of time. Apple have been able to get this far because going "all in" is accepted by Apple customers more than those on other platforms.


> Yes, I feel like this is somehow still massively underappreciated.

No, it is overhyped (not by Apple), and it worries me. Apple's platform jumps are impressive (and this last seamless platform switch is more about macOS, good decisions NeXT made a very long time ago, developer empowerment decisions), but let's not confuse that with how bonkers every user of the M1 and family is going. We have a bit of anecdotal data coming in that no one can believe, and I quote 5000 new M1 owners, "how snappy" the M1 is.

But look at the benchmarks. Each M1 model is a typical and negligible increase in performance over whatever the most recent previous Intel model was. We are talking about 1.05x performance increases! For example, take the 2018 6-core Intel Mini with the 3.2GHz processors and compare performance to the 2020 M1 mini, and it is immediately apparent that the 2020 M1 mini is really only a little bit more performant. And this is not bad news, it just means everyone is out of their minds, but it is typical of new Apple releases. The new models are always just a little more performant than the last revision (except 2010-2012, when models doubled in performance 3 years in a row and did not double again until 2018).

So the hype on M1 is overwhelming, and the M1 and family are not at all under appreciated. People seem to think M1 is miraculous, and I admit I think it is pretty neat... even after comparing benchmarks with previous models and realizing this is not a quadrupling or even doubling of performance; the increase in performance with Apple switching from Intel to ARM is... merely incremental. This was a lateral move, and not a giant leap forward, not yet. Go look for yourself. But don't be sad about it, again, this is entirely typical of Apple new releases... the new models are always just a little better than the last revision. Of course, performance is not the only metric.

So the hype says M1 walks on water, but misses the truth, that M1 does what x86 does with less power. Again, M1 isn't what a lot of people are saying... it doesn't blow Intel out of the water, it merely keeps pace with Intel, and that is impressive enough... but add the efficiency gains, and what we have are chips as good as Intel's (more or less), but use less power. Anything evangelized beyond this is crazy hype.


MacBooks in particular went through a period with some notable downs--through some combination of design, engineering, and manufacturing missteps. Even my 2015 MacBook Pro had to get its display replaced (after Apple extended the warranty) to deal with a defect. But there was basically no MacBook between then and now that really tempted me to upgrade. (And the 14" M1 Pro is pretty much perfect for me.)

> the M1 era of Apple is the more exciting than things have been in years.

Abso-frickin-lutely. The 2020 MacBook Air M1 is the best laptop hardware device you can buy on the market right now. The battery life is amazing and this makes the laptop almost invisible since I am hardly ever struggling to search for power. The sound is great as well. Price per pound you cannot beat it.

My one small gripe is the black glass bezel, which turns into a distracting fingerprint magnet.

They do need to up their game on SSD storage, but I am sure the MBAs at Apple do not care because this drives people to buy iCloud storage. And if that is the case, they really need to work on iCloud because the syncing sucks.

I would certainly buy an iPad M1 if they let me run apps like LibreOffice, so they need to get their act together on software. Yeah, I have a lot of issues with their software. Software, IMHO, is where they really need to innovate.

Once Asahi Linux is stable I will probably abandon MacOS again.


Yet it still does not have a touchscreen, and I personally would prefer 2-in-1.

Sounds like you might prefer an iPad Pro with a keyboard.

I would instantly go for an iPad Pro, if it would run normal macOS. Or things like vscode and docker, and games. I just can’t justify to myself the expense compared to an m1 laptop, just for the form factor.

Unfortunately the iPad OS is the limitation on the Pro. I tried to make it work but it's back to its previous position as media consumption, music production box (which is annoying to deal with due to lack of audio outs), and occasional text editor.

Sure the iPad has its downsides. But it also does complicated things dummy easy. Example: tossing a pixelated/blurred box onto a video. That's ridiculously complicated on Windows and requires a hell of a steep learning curve. No problem if you have the time. But a blocker if time and your video editing skills are short.

Your example is one enabled by the application, not by the platform. You can find easier video editors on windows or on android as well.

Sticking a keyboard on an iPad does not a laptop make. Being limited to mobile app versions of web browsers is itself a big enough quality-of-life downgrade to make the setup much less convenient than a laptop for leisure-time media consumption, not to mention professional work.

Sticking a touchscreen on a laptop does not a tablet make.

GP wants a laptop with a touchscreen, not a tablet. My laptop has a touchscreen. It's not the primary input method, but it is quite handy sometimes.

I'm not really interested in a touchscreen on a laptop, but I'd buy a Pencil immediately if it worked on my Mac screen.

In what way?

Indeed, lack of a touch screen would be a deal breaker for me. The iPad is attractive, my kids have them, and I might convince one of them to let me use it for a week this summer to see if it handles basic things like Jupyter notebooks and talking to homemade hardware gadgets.

I tried. The limited software and lack of desktop OS made it painful. I wouldn't try it again, personally. It felt like an exercise in compromise after compromise after compromise.

If you’re asking to use macOS with your fingers, you have not realized how terrible that would be. I do not mean in the sense that a desktop OS is terrible for touch input. I mean that macOS specifically is not built for fingers and would require so much of the work that Windows has been doing for a decade at this point.

It's not like Windows 10 is touch ready in any real sense, either. Windows 11 fixes some of the basic problems, but the gold standard for a desktop OS that's productively usable in tablet-only mode might ironically be GNOME on Linux.

What I do right now is typing with the keyboard and extensive use of the GUI with my fingers. So I'd still want at least a detachable keyboard, but also use it in full tablet mode. Among its uses, I'd read sheet music from it at band practice.

I have a Dell with a touchscreen. I never use it. The 16x9 ratio is the wrong ratio in either portrait or landscape.

I also have a Dell with a touchscreen, and at 6yo it desperately needs a CPU upgrade. I use touch quite often.

How does a touchscreen work on a desktop? Given that nobody but Apple has made a decent touchpad in... 15 years, I'm first assuming that hardware-wise it'd suck. But OK, let's assume that works. Doesn't a ton of desktop interfaces rely on hover, scroll, etc.? For what purposes is a touchscreen superior, assuming you have a mouse/touchpad at hand?

I have a Lenovo Yoga 720 back from 2018. There's a bunch of input methods -- touchpad, touchscreen, pen on screen. The touchpad is better than a Mac (just as responsive, but gestures are customizable). The touchscreen is fantastic, prob more responsive than most Android phones. The pen works really well, though it is worse than the Apple pencil. Scrolling is done the same way on a phone. For hover -- most websites are built with mobile in mind, so hover is very rare.

The yoga line does 2-in-1s right. Try to check one out, you'll be pleasantly surprised.


That's great to hear, that's much better than I thought. Beating apple's touchpad i find hard to believe, but I'll make sure to try one out if I get the chance!

I did like the move towards fewer ports, although it was inconvenient at times. I do wonder if things could have gone better had Apple incentivized the ecosystem to move more to USB-C. If there were lots of monitors and TVs actually supporting USB-C/Thunderbolt it would be nice; it's a thinner, nicer cable, and also has more bandwidth.

Let’s say it this way in another domain - no matter the rationalization, Mikey And Bob with Jerry is completely different than with John.

That’s all this is. Not that big of a deal, but without a doubt very different.


> Look at what happened to the MacBook Pro, losing most of its ports and the thinness causing them to put a much worse keyboard in it that caused massive problems. Sacrificing a bit of thinness and going back on those changes with the newest iteration has been much better.

Adding a useless touch bar and losing F-keys also doesn't do much to win over fans, and it should be stressed that the infamous MacBook pro keyboards were a constant source of problems.


I love the Touch Bar!

I don’t love the Touch Bar entirely, but I do really like the slider for brightness and volume.

I’d rather have fkeys for everything else. Maybe they could give us a mini touchbar just wide enough for volume and brightness.


I thought I loved the slider for volume/brightness, and was concerned about losing them, but then realized how little I cared when I went back to no slider

I would love a full width Touch Bar right above the fn key row they just brought back. I don’t see why it has to be one or the other.

I’m surprised no one made an app that turns Touch Bar into fn (without having to press fn) and a button to switch to the apps choices.

Love Spotify with touchbar, debugging with vs code etc - shame it was hated.

A mini Touch Bar with fn keys above would be lovely!


> I’m surprised no one made an app that turns Touch Bar into fn (without having to press fn) and a button to switch to the apps choices.

That wasn't the issue. My main gripe with the touch bar is a lack of tactility; I don't need to look down at my hands to pinpoint the location of "F5" for debugging in VS Code, nor do I need to make sure my finger is hovering over the escape key before I press it. On top of that, capacitive touchscreens just don't make good buttons, my fingers frequently bump against the screen and trigger mutes and screenshots that simply wouldn't happen with a button. It's something of a usability nightmare.


> I’m surprised no one made an app that turns Touch Bar into fn (without having to press fn) and a button to switch to the apps choices.

This has been a setting in Preferences since a year or two after Touch Bar was introduced.


I think the touchpad would be a better location for the Touch "bar" than an additional row.

Brightness and volume are actually my two biggest touchbar annoyances… w/ older mbps I could simply feel my way to where I knew the keys were via muscle memory and adjust them with a few quick taps (or one long press) without looking or even having my eyes open. Near impossible with the touch bar.

Always good to remember other people can have different experiences, of course, so ymmv.


Volume slider is useless to me. I adjust volume with the scroll gesture on my mouse pad.

Coworkers always talk about how there are people out there who like them but I’ve never met one. Glad that you like it, drives me nuts.

Oh, I would love it too, it's a really cool idea—but not at the expense of the F-keys. It's like adding backup cameras in cars; it's a great idea—but not if it replaces all the old physical radio and HVAC controls.

Ah, but the DJ demo at WWDC using the touch bar.... /s

It is not uncommon for media to create controversy to sell the author's book...

You are not wrong. But in this case there was a significant amount of contemporaneous reporting when Ive left, predating this book.

It doesn't mean the investigation is completely false either, you just have to pinpoint the truth in between.

The funny thing is apple probably transitioned to their in house arm architecture in part because the intel chips ran too hot and throttled in the ultra thin ive products.

Agreed. I have looked at Apple products for years but couldn't make up my mind to switch, until my company gave me an M1 laptop and OMG it is so good. Last time I had this feeling was when the iPhone 3 or 4 came out.

My previous company also gave me a Macbook pro but that was 2017 and I found enough quirks not to buy one for myself.


Apple's hierarchical organization based on technical expertise is key to its innovation: https://hbr.org/2020/11/how-apple-is-organized-for-innovatio...

Everything else is a complement, but they don't drive leadership internally or externally to stay ahead of the industry.


Thinness matters when everything else is an absolute chonk.

But there’s a point at which it becomes basically worthless to get thinner.


Small nitpick: the butterfly keyboard was problematic because of the high failure rate. Many, me included, actually liked the feel and the smaller key travel better.

That was one of many reasons.

I would say the majority of people disliked the small key travel, that’s why apple “fixed” it in the latest iteration, specifically calling it out as a benefit.

What do you like more about it compared to the newer one?


In particular, my main mechanical keyboard is a 35g electro-capacitive one, so I've been pretty adjusted to lower key weights. The new MacBook keyboards are just too stiff and have too much travel for me to type fast/not get fatigued.

I think even if one is accustomed to heavier keys, all things being equal, if they learned to type on lighter/shorter travel they can get faster at typing.


What keyboard? I've been using a low profile keychron with mt3 caps recently. Great feel with very short travel. Everything else feels so sloppy now.

Niz plum EC keyboard. The combination of 35g + the dome switches are a great combo of light activation but still letting you rest your fingers on the keys without triggering them because the activation is at the top rather than linear.

On the other hand, I use pretty light switches on a desktop keyboard and still was thumping all the way down on the butterfly keys.

I didn't hate how they felt in the act of typing, but I hated that they hurt my hands after a few hours. And (as you note) if you breathed near the laptop, they'd jam.


It's a big deal because a) they never fixed it and b) you have to replace the whole top half including the touchbar, etc. Mine died twice under 3 years of Apple Care and is going out again now. I don't like the short key travel, but that's such a tiny detail to me. The 4 USB-C ports are fine with me except the external video compatibility is such a flaky mess.

I'm stuck using this "cool design" as a desktop now because it costs ridiculous money to repair. And I get to flip my monitor on and off twice to get it to come back from sleep.

This is my 4th macbook pro. Previously, I had 1 battery problem in my first 2008 model and it was replaced at the Apple Store same day. My old macs ran until the batteries finally swelled years later. They weren't just sleek and cool, but super high quality and amazingly sturdy.

The other thing that stinks is that the issue wasn't something accountants did to save some bucks, but a design feature that cost me more.

I'm honestly only buying an M1 because I know that they've left the sexy-at-the-expense-of-the-customer approach.

I think Ive sans Jobs got too focused on design and not customer experience. Apple made excellent hardware before Ive, and likely will after. Just maybe not as many fashion shows.


I agree the reliability was a massive problem. I had to replace the keyboard (and thus logic board) for every one of those macbooks I owned until I started using a keyboard cover to prevent the issue (which Apple actually recommends against). No doubt after the 1 year warranty the keyboard would have failed again had I not upgraded on an annual basis.

With my machine of that era I just gave up and started using an old Bluetooth keyboard I had around. Even though it was covered under AppleCare, by the third time my F key was showing the signs I just couldn't stomach the hassle.

No, it was problematic because most people hated the lack of depth, even if you didn't.

I didn't say most because I couldn't really prove "most". Do you have a source showing that most hated it because of the lack of depth? Most articles I saw were just talking about the reliability issues.

> Honestly to me the M1 era of Apple is the more exciting than things have been in years

The average person doesn't know or care about M1. If you are on HN, you are an enthusiast ("Pro" in Apple parlance). To everyone else, Apple just made their already pretty quiet and fast laptops, quieter and faster.

I think the article is right that the world is waiting to see if Apple's new bottom-up design org can deliver a new category-owning product. So far, they've proven that they can improve the legacy suite of products in meaningful ways and aren't afraid to roll back poor past decisions. I think the author is probably right that Apple's services org is getting much more attention than in the past.

When everything flows downwards, you get a singular vision, blind spots included. I think we saw that with Ive. This was true with Jobs' Macintosh too, before Ive joined. Today, we have fewer blind spots, but we haven't seen evidence that there are leaders willing to take big swings into new categories. Time will tell...


50% of customers purchasing a Mac in Q2 2022 were new Mac users: https://www.macrumors.com/2022/04/28/mac-users-q2-2022-new-t...

Which is actually incredible when you think about it. They might not know or care about what "M1" is (although I doubt that), but it's clearly a commercial success.


Great stat. I might be underselling the achievement... I bet some of that growth is driven by the insane battery life, which we know people care a lot about. To be clear, I own and love 2 M1-based machines :-)

I stand by my overall comment, though: They are better Macs. Not a new category or a new product for Apple.


They are powerful ARM laptops. I'm considering getting one purely because of that + Asahi Linux. My Pinebook Pro just doesn't cut it as a main machine, and I'm sick of x86.

X86 is bloated. Linux on Mac is a dark forest, though. You might be better off sticking with MacOS despite that it’s gone downhill lately.

I stick to free software whenever possible, so I wouldn't seriously consider using macOS. I even run Guix System (one of the FSF-approved distros) on my current main machine (ThinkPad T440p).

This figure is useless without context. What was this number for previous generations? I suspect it's always super high because a huge demographic for Macs is college students buying their first laptop; obviously it's gonna be their first Mac. Same with software devs. Tons of Macs are used by software devs getting their work computer.

All you have to do is tell someone - 20 hours of battery life and it doesn’t sound like a rocket ship under load.

As a tech professional and coder I can still say that barely 5% of my work is constrained by CPU. I'd trade all their hardware improvements for physical left and right click buttons instead of those awful trackpad gestures.

Have you used their trackpad, or are you coming at it from the outside looking in?

One finger left click, two finger right click is sooooo useful and easy to get used to, it's borderline natural.


I've been using Macs for dev for the last 12 years at least. It's mandatory if you need to support iOS or Safari users, due to the aforementioned vertical integration. Because building iOS apps requires an iOS device and a macOS device, and you still have to pay Apple a yearly fee. I had a one year hiatus where I had a Dell with physical mouse buttons and a touchscreen and it was marvelous.

yea I don't know what he's talking about. the haptic feedback is so good, I'd never guess there wasn't a physical switch. I had to power it down just to be sure. apple knows how to deliver a solid tactile experience.

I have never heard anyone complain about a MacBook trackpad before.

I have also never used a Windows Laptop the comes anywhere near the level of perfection of a MacBook trackpad.


They just released a brand new computer, the Mac Studio. This is not a legacy product.

Global PC shipments down 4% in Q1. Mac shipments up 8%. The M1 Macs are the most exciting thing Apple’s done since the Watch.

https://www.counterpointresearch.com/global-pc-shipments-q1-...


People as talented as Ive still need a champion and a leader to balance their artistic sense against business constraints. Jobs was that champion. Without the counterbalance and support from Jobs, Ive became less effective.

That could just be a cycle thing. In an age where computers were at best boring, Ive and Jobs were what was needed to create the next great products. But maybe we hit a technological wall, and now we need strict hardware improvements.

We all complain about the thinness, but it's not really that is it? It's the sacrifice they made to achieve it that pisses people off. Because we still need the things they threw away. If it all worked as imagined we'd be all over the slightly thinner machines. In some ways, the M1 is going to enable the thinness again.

When we hit the next tech wall, you may need another Ive and Jobs to dream up things.


Ive is just the latest vehicle for the oft retold “apple is doomed” story. The media frequently plays apple as some kind of small company where genius individuals are developing blockbuster products in isolation, and not the hard-slog reality of their product development methodology.

I can definitely see that argument, but I feel the issue is that Ive lacked a product-focussed and customer-focussed CEO to rein in his "wilder" impulses (the Edition watches are another example of that).

Jobs may have managed Ive's drive in a way that would not have seemed bean-countery to him but customer-focussed.

In other words, Ive was only a liability insofar as he had no actual peers at the company to rein him in. And it's debatable whether that's really his fault.


I would have liked to see what Ive could do with the M1 and its thermals.

I think that your comment is insightful in that while Ive's time at the top of Apple product development was far from perfect, his success was undeniable.

Jony might not always have been right but he was always wrong for the right reasons. Principled in his vision for design and human interface. I think his worst successes (lol) were those he had over software UI: removing information from the UI in the name of aesthetics, or the Touch Bar in the name of changing how we see keyboards.

His understanding of how to make a computer something you wouldn't be embarrassed to be seen with but also something that you can take with you everywhere (thin and light are good when the tech can match) has certainly made his devices more a part of our lives.

Jobs could match him on this stuff but I would imagine that it would go over a lot of business and engineer types' heads. Takes all sorts.


This is the problem with designing and re-designing where function follows form.

You have to keep changing and the only way to go, seemingly, is away from functional.


I agree. I’m really happy to be in the M1 era. Healthy companies have a good balance between engineering, design, and marketing. The M1 is the direct result of some really talented engineers not the design team.

Ive is a brilliant designer, but Steve seemed to be able to rein in his worst instincts. From everything I have heard about Ive, he reminds me of a few designers I have worked with who were wonderful designers, but struggled to understand the importance of practical constraints such as engineering, cost, user feedback, etc.

The Apple Watch is a great example. It is a wonderful device, but the whole launch seemed confused. Remember the 24k Gold watch? That was the least Steve thing Apple has ever done. Steve believed in well designed and built products, but was almost allergic to excess.


That makes him a pretty poor designer, particularly in the industrial sense.

If you can’t balance those constraints you are not a good designer.


Ive reminds me of a pretty good designer elevated at the right place at the right time

If the article is to be believed, Ive had already stepped away, at least mentally, by late 2014. Which does mean that the much-maligned 2016 MacBook Pro might not have been fully his fault. Tbh that seems a lot more likely than the usual HN "Jobs kept Ive in line" argument. Ive had been leading design at Apple for a long time. It'd be a little odd if he didn't learn any sense of pragmatism from working with Jobs or from being the design leader.

Instead it makes a lot more sense if the 2016 MBP was a result of the other designers being let loose with no supervision from Ive. Lacking leadership, they went in on no ports and a gimmicky Touch Bar.


>Honestly to me the M1 era of Apple is the more exciting than things have been in years.

Not if you look at the style. It's another boring grey slab. Jony Ive's job was making everything fashionable.

While computer geeks tend to under appreciate fashion, it's incredibly important. Fashion is the main difference between a cringe bluetooth headset and Airpods. Right now Apple's best offerings seem permanently stuck in 2010s fashion, and it's getting really tired looking.

Frankly, I'd rather be seen sporting a tablet thin Samsung galaxy laptop with punchy colors, stylus, and 360 hinge than a Macbook now. I don't care if the M1 has 50 hours of battery life. It looks boring. I don't want to look boring. I don't want to look like I'm permanently stuck in 2013.


> It looks boring. I don't want to look boring. I don't want to look like I'm permanently stuck in 2013.

Your consumer gadget choices should not define you or your "look".

Nobody else cares what computer you use. It's fine.


For you they don’t, but for many if not most they do. To most consumers the product's aesthetics are the only deciding factor between competing products.

The M1 arrived 13 years after the iPhone. A company of the size of Apple not putting out new products in keeping with its size is basically a lack of innovation. Some of my friends, who are Apple employees (and basically fanboys) can't stand to hear this, because they think that Apple is the greatest company on the planet (it may be, in terms of market cap, but debatable on all other fronts).

But, it's pretty difficult to follow up the iPhone and iPad with something that's as revolutionary. There are people who think the watch is in the same category, but I doubt it. It does sound like Ive was burned out, which the article points out. Once you are burned out, you are really rationalizing, ipso facto, your decision to leave, which has already crystallized in your mind. The article does point out that Ive was burned out and could not manage the large group effectively because he was overwhelmed. In the end the Gods were cut down to size. It sounds entirely plausible that Ive did not produce much (aside from the campus, whose aesthetic and functional value is questionable) and made poor calls about the marketing of his first "independent" product, the watch (which was reoriented from fashion toward fitness by Cook), because he was burned out and bean-counters were running the show, sometimes rightly questioning extravagant designs in things such as the Apple campus.

