ARMed mac: not again or for real this time? (mondaynote.com)
65 points by miles | 2020-03-09 00:29:38+00:00 | 178 comments




The problem seems much more software based than hardware based.

The Mac has lost so much software through Catalina's 64-bit transition, and that was telegraphed a decade ago. Between that, Apple's collapsing documentation quality, seemingly having two competing development frameworks due to internal infighting, the prevalence of Electron, the loss of similarity to software for popular Linux x86 distributions, and Metal being extremely uninviting to a dev wanting to do any cross-platform work, can Apple really assume the software will come for an ARM-based Mac?

I am very worried it won't. Last time the Mac underwent a transition, it ultimately made things easier for developers and there was a fairly thriving (if niche) Mac software base. There simply isn't one any more. Mac gaming is going backwards at a rate of knots, and dual booting to Windows is more important than ever. Large companies seem to just be abandoning Mac drivers entirely (glares at Sony dropping support for their camera software for Macs rather than transitioning it to 64-bit).


>> The problem seems much more software based than hardware based.

- During the transition from Motorola's 68k series of chips to IBM/Motorola PowerPC processors, Apple produced classic Mac OS versions that could run on either platform.

- During the transition from PowerPC to Intel, Apple produced Mac OS X versions that could run on either platform.

- In 2005, Apple shipped special Intel build machines for devs only.

I think there is a method to this madness and despite all the flaws we currently observe, which are orthogonal to the transition, I trust them to get it right, if and when it happens.


The earlier transitions involved going from a less powerful cpu to a more powerful cpu so emulation was an option. Now, at best you are going to an equivalent cpu and at worst a less powerful cpu.

Depends how seamlessly they can get devs to re-compile. ARM having to emulate x86 would definitely be slower.

There is no reason to believe native ARM code won't outperform equivalent x86 chips on per-watt performance, i.e. an A(x)-based MacBook Air replacement could easily outperform the Intel variant at the same power envelope, based on the observed performance of the iPad Pro my wife uses.


Apple has some pretty decent IC design engineers... And ARM has fewer legacy corner cases to support and a denser, simpler-to-decode instruction set. Apple also has high margins, so it can afford big silicon dies with lots of logic and cache.

Given all that, I think it's totally possible Apple could release a CPU with more raw performance and performance per watt than Intel/AMD.

That extra performance could possibly allow emulation of x86. The emulation world is much more advanced than it was 20 years ago - dynamic recompilation could easily lead to a 50% slowdown rather than the 5000% slowdown that pure interpretation might impose.
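To make the interpretation-vs-recompilation gap concrete, here is a purely illustrative Java sketch (the toy guest "ISA", opcodes, and class name are all made up, and it "compiles" to Java closures rather than real host machine code): an interpreter pays the decode cost on every execution, while a dynamic translator pays it once per block and caches the result.

    import java.util.HashMap;
    import java.util.Map;
    import java.util.function.IntUnaryOperator;

    public class TranslationCacheDemo {
        // Toy guest "program": pairs of (opcode, operand).
        // Opcode 0 = add immediate, opcode 1 = multiply immediate (both invented for this sketch).
        static final int[][] GUEST_BLOCK = { {0, 3}, {1, 2}, {0, 1} };

        // Cache of already-translated blocks, keyed by a guest "program counter".
        static final Map<Integer, IntUnaryOperator> CACHE = new HashMap<>();

        // Interpreter: decodes every guest instruction on every single run.
        static int interpret(int acc) {
            for (int[] insn : GUEST_BLOCK) {
                switch (insn[0]) {
                    case 0: acc += insn[1]; break;
                    case 1: acc *= insn[1]; break;
                }
            }
            return acc;
        }

        // "Translator": decodes the block once, builds a host closure, and caches it,
        // so subsequent executions skip decoding entirely.
        static IntUnaryOperator translate(int guestPc) {
            return CACHE.computeIfAbsent(guestPc, pc -> {
                IntUnaryOperator compiled = x -> x;
                for (int[] insn : GUEST_BLOCK) {
                    final int operand = insn[1];
                    IntUnaryOperator op =
                        (insn[0] == 0) ? (x -> x + operand) : (x -> x * operand);
                    compiled = compiled.andThen(op);
                }
                return compiled;
            });
        }

        public static void main(String[] args) {
            System.out.println("interpreted: " + interpret(5));               // 17, decode cost paid on every call
            System.out.println("translated:  " + translate(0).applyAsInt(5)); // 17, decode cost paid once, then cached
        }
    }

Real dynamic binary translators emit host machine code instead of closures, but the caching structure is the reason re-executed hot code approaches native speed while a pure interpreter never does.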


Do you have a source for that? The thesis of this article is that it is a more powerful CPU, in terms of performance-per-watt, which is the same metric that spawned the PowerPC-to-Intel transition. The author pointed to specific ARM chips that exist today which "consume less power, about 210 watts, than a competing Xeon CPU needing as much as 400 watts, for about the same amount of computing".

The point about documentation quality is so true. Conference videos are not docs!

For some of the lower level stuff, all the good documentation is marked deprecated, hidden as comments in the framework headers (curiously stripped in the online API references), in some mailing list dating back 20 years, or in a forum that requires login to read and isn't indexed by search engines.

It's like if you want to write a pro media app on MacOS today you need to work for Apple or at a company that's deployed a pro media app for decades.


>can Apple really assume the software will come for an ARM based Mac?

This is what Catalyst is for, no? Mac software is now iPad software. Macs are now somewhat more capable iPads. The golden age of computing is over.


I think even the most delusional Catalyst fan would admit Catalyst is nowhere near ready for prime time and will not be in the timescale for these supposed ARM Macs to launch, if ever.

Adding Catalyst didn’t remove the ability to use any other UI toolkit, or to compile and run whatever you want in any language there is. Everything that ran before continues to do so, and it’s almost oxymoronically wrong to consider a system that is a strict superset of another less free.

Right, and we could find ourselves in a time when any meaningful non-developer app is only available as a Catalyst app.

It’s kind of like how Microsoft’s “embrace and extend” made us less free, because it’s actually “embrace, extend, and (later) extinguish”.


I'm worried that a lot of app devs might just move all the app logic to an Electron app or entirely online as a webapp...

The days of desktop apps are numbered already, and this might accelerate that transition.


Unfortunately I have heard these rumors way too often. I won't believe anything anymore until Apple itself releases a product.

We kept hearing stories about x86 for years too. I'm sure ARM based MacBooks already exist as testing devices.

Did successive stories about the x86 change provide additional information before the actual release was imminent? The ARM stories don't.

They never did, just new sources of rumors.

Microsoft has already released Windows-on-ARM and there are several vendors making ARM laptops. So far adoption seems slow. I think the initial ball drop was running an emulated x86 Chrome binary. I know time to market is important, but porting Edge and not porting Chrome was a big mistake. Now that Windows Chrome (and Firefox) is built native, I'd bet the biggest challenge is getting critical third-party vendors like VPNs.

Apple has fewer third party software vendors AFAIK so they would probably be able to move to ARM with fewer challenges. That said, they have traditionally had a lot of customers in creative industries with specialized software from Adobe. Getting the vendors to port, moving quickly, and trying to keep business plans secret will be tricky.

I have an i5 thinkpad for work that gets atrocious battery life. At home, I have an XPS13 (also i5-based) that gets ~6+ hours which is good. I should be able to go the entire work day on a single charge. All of my build/compile/test work is done remotely on desktops and servers, so my laptop has relatively light duty.


ARM Macs will be the end of the line for many "prosumer" customers of Apple.

It will probably sell quite well though. And it will bring back fanless designs.


> ARM Macs will be the end of the line for many "prosumer" customers of Apple.

Why do you think that? Do you just think ARM can't match Intel performance? Or do you think the prosumers need Intel-specific functionality like counters?

What I'm worried about is implementing top-quality ARM backends for software like the JVM. We need someone doing this rapidly as I think an ARM MacBook Pro will just turn up one day, unannounced.


There have been HotSpot backends for ARM and AArch64 in OpenJDK for several versions now.

Depending on who you talk to, you'll get the opinion that they're much lower quality than the AMD64 backends. I think that may be particularly an issue in Graal. Some people will tell you that the way Graal is designed is problematic for ARM and that's a long-term problem. I don't know if that's entirely true or not, but now we need it solved quickly if it is!

I used to think that, but apparently nowadays the ARM JVM is pretty good. Have no idea about Graal though (and I have no idea why anyone would need Graal at all).

> I have no idea why anyone would need Graal at all

It has new optimisations like partial escape analysis that C2 doesn't have. In production this saves Twitter 13% on compute resources in their data centre. That's MASSIVE. I presume it's tens of thousands of dollars a month.

People also use Graal to compile Java applications to standalone native executables, which isn't something C2 can do.

And finally Graal is a system that you can use to run languages other than Java. For example JavaScript on Graal, which runs on a JVM but is about as fast as V8.
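For what it's worth, here's a minimal sketch of that polyglot use case, assuming you're running on GraalVM (or have the org.graalvm.polyglot SDK and its JavaScript language on the classpath); error handling and context options are omitted.

    import org.graalvm.polyglot.Context;
    import org.graalvm.polyglot.Value;

    // Minimal sketch: evaluate JavaScript on the JVM via the GraalVM polyglot API.
    public class PolyglotSketch {
        public static void main(String[] args) {
            try (Context context = Context.create("js")) {
                Value result = context.eval("js",
                        "[1, 2, 3].map(x => x * x).reduce((a, b) => a + b)");
                System.out.println("sum of squares from JS: " + result.asInt()); // 14
            }
        }
    }

The same Context API is how Graal hosts Ruby, Python, R, etc., which is what people mean by Graal being more than just another JIT.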


> I presume it's tens of thousands of dollars a month.

You can probably add a zero, if not several more, assuming the 13% number is accurate.


People are constantly predicting the death of Macs. Every time a rumor comes out, it's “well, Macs aren't for real professionals anymore”, and every time a new Mac is released, “well, Apple is finished with real professionals”. Never fails.

I don't even know what people really mean by 'professionals'. I think it's 'people with my exact workflow'.

I mean a lawyer is a professional and they can probably do fine with a 2014 Air.


Somehow I think they always mean GNU/Linux devs using macOS as pretty UNIX.

Oddly enough, when I first had money for a computer I picked whatever was closest to a UNIX. The Mac won because the ThinkPad came out of the box with Windows and they never officially supported any Linux flavour. Today I would compare the Mac with the Dell XPS, which is Linux certified. Different times.

Fair enough, but that is a niche use case, and not what developers registered on ADC care about, nor the users of their applications.

I think it's obvious most of what's said here is one's personal opinion. I mentioned 'many' prosumers.

I believe there are 'many' Java developers that use Macs but in the grand scheme of things I imagine it's a tiny sliver of the overall professional user base.

So, yeah, I don't believe Apple is committing suicide with any market move they make. They have the data, they have the marketing, they will do well.

But Macs don't seem like a good future developer workstation to me. Not anymore.


Agreed. By the way, as far as Java is concerned, ARM support is doing just fine.

Obviously they mean professionals whose work depends on computers performing specialist tasks. Software developers, 3D movie animators, etc.

A lawyer with a camera is a "professional" "photographer" but not necessarily a "professional photographer"


I think that's the definition most people use, even if they don't admit it. :) "No one can get real work done on an iPad" is essentially the same statement: what they usually mean is "I can't (or don't know how to) get all of my work done on an iPad."

> I mean a lawyer is a professional and they can probably do fine with a 2014 Air.

Perhaps we should replace "professional" by the more clear term "power users" to avoid this nitpicking.

Clearly, a lawyer is a professional in the sense of the job he is doing, but is also clearly not a power user in the sense of computer usage.


The JavaScript engines have been doing quite well on ARM for a while now, so I guess someone will just need to put in some love to bring the JVM up to par as well. I doubt it will be Apple, but perhaps we'll see one of the other major companies do it once ARM servers become mainstream.

Does much important software still use Java on MacOS? Performance-sensitive cross-platform software would more likely be using Qt, and server Java runs on Linux.

> Does much important software still use Java on MacOS?

Android development.


> Performance-sensitive cross-platform software would more likely be using Qt

Qt is a graphical user interface library - Java is a programming language and runtime. They aren't really the same thing at all and it doesn't really make sense to compare them.

> Performance-sensitive cross-platform software would more likely be using Qt, and server Java runs on Linux.

Guess where most people develop that Java server software? They're on macOS!


Actually Windows; macOS in Java shops isn't that common in Europe, and company IT tends to provide Windows images to developers as well, in line with the desktop market.

>What I'm worried about is implementing top-quality ARM backends for software like the JVM. We need someone doing this rapidly as I think an ARM MacBook Pro will just turn up one day, unannounced.

Nope. Never gonna happen. There are too many moving parts in a processor transition (and Apple has done two so far: 680x0 to PowerPC to Intel, not counting the 64-bit transition). There's always more to it than a simple recompile (if that's even possible, as some of the big players may be using their own proprietary toolchains for building, although I imagine this is less common now than it was during the last two processor transitions).


All rumors point to an ARM Mac, and some very credible ones point to one rolling out within a year. Apple has been gradually abandoning cruft that they probably didn't want to bother porting across architectures.

I have no idea whether a transition is imminent, but Apple is famous for abandoning cruft, so I wouldn’t treat that as indication of anything.

I think the transition will happen, but I don't think the post above worrying that "an ARM MacBook Pro will just turn up one day, unannounced" is going to be correct. It's not impossible, but the PowerPC-to-Intel transition didn't happen that way -- they shipped developer-only machines in mid-2005 and didn't start shipping consumer Intel machines until January 2006, and took nearly all of 2006 to make that transition. I'd expect an ARM transition to be similar, and if anything, slower.

The "not going to happen" isn't about the transition itself but about it happening suddenly.

macOS is getting more locked down over time, to the point of being iOS with a keyboard. This makes a lot of developer tools hard to use, e.g. debuggers don't work anymore: https://jdk.java.net/macos/

So, software changes alone require quite some upkeep on your existing tooling otherwise you run into problems.

Now imagine a whole new CPU architecture which means none of your tools work unless they are a) recompiled or b) virtualised.

I expect the pain of virtualisation and the problems with that to make the whole experience miserable.

And many tools will either never be recompiled or take so much time that you are forever stuck in the Pleistocene.

I remember, ahem, fondly when Apple dropped Java and we had to wait forever to be able to work with modern versions on OS X. At some point Landon Fuller stepped in and hand-made a Java 6 build.

Catalina is also quite famous in my circles as something to avoid.

And the notarization requirements for macOS tools that are not on the Mac App Store are also something that has been wasting some of my time (and the $99 for Apple, of course).

So, all in all, I don't see how ARM Macs combined with the Catalina successor and even more stringent Apple rules are going to make a good developer workstation.


You may download a version of Java lacking notarization if you'd like to debug it.

I'm doing http://www.openbeans.org which is an IDE supporting Java among other programming languages. It's made in Java too.

So, I need to distribute a notarized build that also provides a JDK for users. My users would naturally want to debug their own code using the provided JDK.

So either we just forget about notarization on developer tools and we just tell people to disable everything or Apple has to figure out some better way. I already see they apply the notarization rules differently so perhaps if you get your tool on the "internal Apple tools whitelist" it's easier.

I have programmed on Macs since the G4 days, but I will not be upgrading the current software or hardware given the current trend. I'll just switch to a nice ThinkPad running Linux (or a BSD if I feel adventurous).


You can debug Java code without running into this issue; the problem is when you try to debug the JVM itself.

I'm not quite certain what this means.

So, if my JVM forks another JVM that will debug just fine?

Or even that won't work as in I won't be able to add a breakpoint inside some standard Java class?


You won't be able to set a breakpoint inside native code: the JVM itself, and any JNI libraries it loads (assuming they too are signed the same way), are off limits. But setting a breakpoint in a Java class should be fine, because that happens inside the virtual machine and doesn't require special entitlements.
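To make the "forked JVM" case concrete, here's a hedged sketch: the child JVM is started with the standard JDWP agent, so an IDE can attach on port 5005 and set Java-level breakpoints without any native debugger ever touching the hardened/notarized binary. The class name com.example.ChildMain is hypothetical; the -agentlib:jdwp options are stock JVM flags.

    import java.io.IOException;

    // Sketch: a parent process launching a debuggable child JVM via the JDWP agent.
    public class LaunchDebuggableChild {
        public static void main(String[] args) throws IOException, InterruptedException {
            ProcessBuilder pb = new ProcessBuilder(
                    System.getProperty("java.home") + "/bin/java",
                    "-agentlib:jdwp=transport=dt_socket,server=y,suspend=n,address=5005",
                    "-cp", System.getProperty("java.class.path"),
                    "com.example.ChildMain");   // hypothetical entry point
            pb.inheritIO();                      // forward the child's stdout/stderr
            Process child = pb.start();
            System.exit(child.waitFor());        // propagate the child's exit code
        }
    }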

Thank you. This was quite illuminating.

> So either we just forget about notarization on developer tools and we just tell people to disable everything or Apple has to figure out some better way.

What would be a better way?

A computer either is or is not allowed to execute non-Apple approved code. If you want your computer to execute non-approved code, then let it do that by disabling Gatekeeper.


> This makes a lot of developer tools hard to use, eg. debuggers don't work anymore https://jdk.java.net/macos/

They don’t work in notarized apps, and only notarized apps are allowed through Gatekeeper. But if you’re a developer who needs advanced debugging capabilities, you can let the non-notarized app through Gatekeeper and continue on your way.

But maybe you're a hardware developer and need to debug the kernel, or replace the kernel entirely with a self-compiled version of Darwin. Macs let you do all of that too, just turn off SIP and the system is yours to command.

What Apple has done with macOS is lock down the experience by default so your average technophobe doesn’t ruin their system. If you’re a programmer or power user who wants or needs a less restricted environment, macOS can still be that platform.

I think it's a healthy balance.


> your average Grandmother

Please don't do this. Use the perfectly adequate and more correct term "layperson". My grandsons are equally capable of ruining their system with trojans and misapplied hacks.


Thanks, you're completely right! I've edited.

> I expect the pain of virtualisation and the problems with that to make the whole experience miserable.

They did it twice for the Mac OS - 68k->PPC->x86 - and came out alright. They also transitioned two generations of ARM (current CPUs no longer support 32-bit).


And let's not forget that OPENSTEP happily ran apps that were compiled to work on 68k, 486, PA-RISC and SPARC.

> Why do you think that? Do you just think ARM can't match Intel performance?

Can it match single-core performance? Audio production plugins are often running on single cores. The A13 maxes out at 2.65 GHz, and the 80-core Altra maxes out at 3.0 GHz (turbo boosted). Both of those are too low for me to use in audio production. I intentionally went with fewer cores and a higher clock speed on my last computer because of this.


I'm not sure you can compare processors with wildly different ISAs on clock speed alone like that.

Clock speeds aren't comparable between different architectures.

You're also looking at a CPU designed for a phone, and a CPU designed for a server farm. If Apple were to design a CPU for professional multimedia workstations, it would not have the same parameters as either of these.


> Both of those are too low for me to use in audio production.

What do you consider the minimum to be usable in audio production?


Apple's chips have some of the best single core performance in the industry.

> It will probably sell quite well though. And it will bring back fanless designs.

Fanless? Wishful thinking. Check out the Raspberry Pi 4's overheating problems.

If they make an ARM Mac, it won't be fanless.


A laptop with heat sinks will have a larger area to dissipate heat. Where the perfect balance is remains to be seen as always.

Thank you.

ARM and x86 are both capable of very high performance. ARM isn't magically capable of putting out 10x the FLOPS in the same power envelope. Modern x86 parts are very close to ARM in terms of work completed per unit energy consumed.

The difference is that it's rare to see an ARM part with a heatsink/fan and it's rare to see an x86 part without one. RPi and MacBook 12 are notable exceptions to this.

All of this presumes that CPU utilization is what's eating the power. It's not. For an idle machine (e.g. ARM iPad vs. x86 MacBook) most of the energy goes into the display backlight. The SoCs idle down to near zero energy consumption.

It's even difficult to assert that CPU usage is a big power culprit on these machines. My MacBook 12 can max out CPU forever and hold 1.8GHz with no fan. The 4 core post-Haswell MacBooks can max out their CPUs and have the fan running very slowly. But run the GPU hard on either of those machines? You'll max out the fans and power consumption immediately and things will start throttling.

TL;DR: CPU arch makes no difference. ARM Macs will have similar cooling solutions to the current ones.


I still think that the AppleTV is where this will start. They now support multiple accounts, have an app store, etc. iOS supports keyboards and mice, and they have a "conference room display" mode that brings them into a business setting.

Release the iWork suite and some sort of thin client/remote desktop app and then see how people use them. Learn, iterate, repeat. They can build an ARM-based Mac bit by bit, out in the open, without anyone freaking out and killing their x86 sales.


Apple TV is already ARM (at least the new ones are…), and it runs what's essentially a skin of iOS. I doubt they will switch that out with macOS anytime soon.

The original Apple TV (that looks sorta like a half-height Mac Mini) was based on a stripped down version of OSX, but you are correct, everything from the Apple TV 2 and on is ARM and running a derivative of iOS.

Isn't iOS a version of Mac OS? :-)

Do they share any new code nowadays?


They share most of their code.

Now, sure, but not back then to the same extent.

The Apple TV v1 was released/shipped before the iPhone was. It was based on OS X 10.4.7 with a modified version of Front Row (remember that old app?) as the primary UI. You can read a bunch more about it here[0]. There was nothing remotely iOS in it until the Apple TV v2, which didn't come out until a few years later.

Yes, iOS and MacOS are very similar now, but they weren’t that similar in 2007. Yes, both were Darwin based, but from there up they originally were pretty distinct.

[0] https://www.macworld.com/article/1057029/atv_hacks.html




Do you mean Apple TVs will be more like an 'Apple Mac teeny tiny', where they work more as a low-powered desktop than a media player? It would be interesting to see.

That's exactly what I imagine. They currently sell two versions of the AppleTV - take the higher one and make it slot below the Mac Mini.

Any rumors if iOS will ever support multiple accounts in the future? It's infuriating how bad support is for people who travel and/or live abroad. I need to be on the USA App Store for various USA only apps and I need to be on the Japanese App Store for Japan only apps (examples including banking apps) but iOS doesn't allow multiple stores or multiple accounts and switching stores cancels all your subscriptions.

It's hard to believe no one at Apple has this issue but maybe they all just use 2 devices or something.


I think that your use case is not what multiple "accounts" would be designed for. Have you tried keeping multiple backups, one for Japan and one for USA? That seems like it would work, but be extraordinarily terrible!

As for multiple accounts, they exist on iOS. You just have to be managing the devices for education:

https://help.apple.com/deployment/education/#/cad7e2e0cf56

When I mentioned multiple accounts, I meant on AppleTV. AppleTV can switch accounts in "consumer mode" so each person gets their "personalized experience"

Honestly, it's a mess. But they can improve by working on it and paying attention to real-world situations.


"The iPhone and iPad have run on Apple-designed ARM chips since birth, and while early entries into the Axx line were underpowered, the latest chips created by Apple’s silicon design teams have yielded laptop- or even desktop-class performance"

This is incorrect. The iPhone's SoC before the iPhone 4 was a 32-bit ARM Samsung SoC [1]. From the iPhone 4 onwards they were designed by Apple. The first iPad also had a SoC not designed by Apple [2].

I fail to see how the details of this are relevant though.

[1] https://en.wikipedia.org/wiki/IPhone_(1st_generation) / https://en.wikipedia.org/wiki/IPhone_4

[2] https://en.wikipedia.org/wiki/IPad_(1st_generation)


Those SoCs were designed exclusively for Apple products by Apple and Samsung. They were branded Samsung because back then Apple had failed to notice the importance of the brand power of application processors.

> Those SoCs were designed exclusively for Apple products by Apple and Samsung.

Seems true [1].

[1] https://en.wikipedia.org/wiki/Apple-designed_processors#Earl...


Pretty sure they're no longer manufactured by Samsung. It's been TSMC since the A8 [0].

[0] https://en.m.wikipedia.org/wiki/Apple_A8


A9 was sourced from both Samsung and TSMC.

Apple knew this... it just didn't have in-house SoC designers.

But then they acquired this company: https://en.wikipedia.org/wiki/P.A._Semi and the team started developing ARM processors.


I think the hypothesis at the end of the article ("Could the iPad’s rising revenue (6.5% of total) help cover the hit once its user interface (and keyboard with trackpad) makes it more laptop-like?") is most likely.

I think Apple will continue to add features to its tablets to make them take over the niche of its OS X driven computers. Already I have student friends who use an iPad + keyboard + stand as their primary device for productivity work.

If they just keep doing this, at a certain point the segment will be large enough to just consider this type of configuration a "Macintosh".


> Already I have student friends who use an iPad + keyboard + stand as their primary device for productivity work.

I use my ipad for everything except programming (and all work-related tasks are currently done on a work laptop due to being in a highly regulated industry, for now) and gaming (besides a few games that work exceptionally well on tablet, I never got into mobile gaming, I have a PS4 anyway).

All of my web browsing, youtube watching, document editing, spreadsheets etc I do on ipad. I even occasionally SSH’d into a server to fix some stuff and used a git client to push some minor edits. I also quite like ipad for animation, photo editing and 3d modeling (although I’d like to learn blender since the UI updates...), and Miro for digital whiteboarding.

I’m pretty happy with it, but though I’ve tried various setups, tools and editors, I’ve not yet been happy with actually programming on ipad.

Note that outside of the ipad, my preferred setup is actually a very keyboard-centric tiled WM Linux setup (i3 on manjaro currently), but I’ll use OSX happily enough (and do for work). I was never really into Apple products much, but I love my three year old 9.whatever inch ipad pro.


Congratulations. I'd love to only carry an iPad as it's < 1/2 the weight of a Macbook Air and it appears to be faster (runs Shadertoy shaders faster).

Unfortunately even simple things I find infuriating like trying to select text with my fingers.

Trying to multitask while planning a vacation/business trip and trying to flip between note taking, 4 hotel sites, 3 airline sites, airbnb, and multiple instances of google maps (oh right, only allowed one map view on iOS and one AirBnB view) ...

Shopping online I find similar; it's so much easier to compare items on a desktop ATM with 12+ tabs open and some split into separate windows.

The majority of tasks I do require multi-tasking and often seeing 2-3 apps at once.

Even trying to write replies on HN is almost always infuriating, especially if I want to grab some links to paste into the reply, but also if I see a typo and try to use my fingers to move the cursor to the place to edit.

Maybe it's a learned skill or maybe I just use computers differently, but I'm always fascinated and somewhat jealous of people like you that somehow manage to avoid all that frustration on an iPad.


> and multiple instances of google maps (oh right, only allowed one map view on iOS and one AirBnB view) ...

I'm not too experienced with this but I'm 99% sure this is in fact something you can do in the latest version of iPadOS. You can have three apps on screen at once too.

The problem is that the gesture system for doing so requires too much effort versus just using a laptop with multiple windows.


Yes, you can have two apps side by side and a third floating app.

Do you feel limited in multitasking?

Rarely. Sometimes I wish the side by side window splitting would be more flexible or quicker to access, but overall, I don’t find it holds me back. Even on desktop, while I use a tiled window manager, I tend to prefer a single fullscreen application and just switching between fullscreen applications (although my editors tend to have splits and such, so it would be similar to multiple tiled windows).

Also when I’m not using a keyboard, I find touch-based multitasking quite pleasant.


Exactly, they would be pretty stupid to alienate devs that develop for the x86 server world while they can easily extend the iPad Pro line more and more without touching the MacBook line.

It's still super weird to me that Apple has a tablet that costs more than a laptop. And I think in its struggle to try to make the world make sense again, my brain has latched onto the notion of Macbooks and iPads sharing a processor line.

It seems like design and manufacturing of their beefiest ARM processors would bear the brunt of a lot of R&D expenses. That's the iPad Pro right now. If the regular iPad, iPad mini, and iPhone processors are essentially refinements of an earlier iPad Pro core (and/or manufacturing processes), then of course they'll be cheaper; the biggest costs were recouped on the Pro.

If their flagship processor is developed for a Macbook, or the Macbook and the iPad Pro, it doesn't seem like that big of a stretch to expect that this should lower the price of iPads at least, if not the whole catalog.


> The Pro is a monstrously powered machine that costs tens of thousands of dollars and is designed for a (relatively) small audience of content creator professionals and other high-end technical users running fluidics dilutions and the demanding calculations involved in machine-learning applications.

I really don't get that. You can have an $8000 Mac Pro that is decently configured as a general-purpose workstation that will last you 5-10 years if it's well built. Yet everyone only talks about the $50000 or whatever the high-end config is.


Not sure how decent a configuration of the Mac Pro you can get for just $8000, but that is a very steep price, considering that the Mac Pro just doubled the entry price compared to its predecessors. Judging from the build quality, this is certainly one of the nicest machines one could get, but the pricing is a new pinnacle even by Apple's standards.

Sorry, $7000 at US prices. 48 GB RAM and 2 TB storage, rest default.

Not everyone needs more than 8C/16T or uses GPU compute. Which was my point. The price may be inflated (not sure by how much, a similar Dell is what, 5K instead of 7?), but there are configurations that cost less than a good car.


There’s one thing I don’t think Gassée, or many of the people predicting an ARM transition for the Mac, are considering: user patience and goodwill, basically political capital with the user base.

Apple blew through an incredible amount of it on vanity projects in the 2010s, from the abandoned trashcan Mac Pro to the Touch Bar and butterfly keyboard. They’ve clearly changed course with the 16” MacBook Pro and the 2019 Mac Pro, but they’ve asked a lot of users over the past decade, and this doesn’t quite seem like the right time to ask the entire user base to please be patient once again.

In other words, while there’s little question that a switch to ARM is worth the headache for Apple, the critical question is whether Apple can make the case that the switch is worth it for users.


You left out one important taker-of-goodwill: loss of 32bit on Catalina.

But, the fact this happened just now, in 2019, makes me think Apple doesn't recognize this is a problem at all, or thinks it's too small of a problem to matter.


I thought Mac OS had been 64 bit only for years and am amazed they kept 32 bit support for that long.

When they decide on something they usually do it for real and I appreciate that, because it moves the ecosystem forward.


> thinks it's too small of a problem to matter.

They know. The loss of 32-bit apps means little to them, because they know it means little to users, and that's why they went through with it.


What are you basing this assumption off of?

For me, it's an enormous loss. I'm certainly not all users, but neither are you.

I think it's pretty clear at this point that the Mac user base has not reacted to Catalina very well. Upgrade rates are abnormally slow. A lot of that is just due to general bugginess (also a problem!), but I think 32-bit is some of it as well.


The key is they wouldn't have done it if a large enough number of users relied on 32-bit apps. Pretty obvious.

> A lot of that is just due to general bugginess (also a problem!)

This is the only thing you're right about. People don't care about 32-bit apps.


Most non-tech people I know who use Macs complained about apps not working anymore (old printers, tablets, etc.), and some even paid me to fix it.

And getting rid of Dashboard, which there wasn't even a technical reason to kill.

Daily I find myself cursing that I can't quickly bring up a calendar or unit converter with the dashboard shortcut.


Apart from sentiment, does anyone, including Apple, still care about OSX? Seems that the potential move to ARM will merely be the final nail in its coffin. Apple has removed a lot of backwards compat over the last releases, large app developers seem to not be embracing new systems/APIs, software largely isn't being ported, iOS now has the iPad pro, soon perhaps a laptop.

What is the reason for macOS at this point?


To run Xcode to build iOS apps.

(Even my friends who were long-time Mac users/developers are moving off; for many of them, it was the slowly decaying UNIX component of Mac OS X, which they originally touted to me.)


Pixar does, along with many other video/movie producers. There are also a lot of professional musicians and recording engineers using Macs as well. Not to mention all the developers using Macs for iOS app development. None of these activities are realistically viable on an iPad Pro. I think the real issue is OSs aren't very interesting at this point - they're mature and there isn't a lot of sexy sizzle to be added to them.

Doesn't Pixar mostly run Linux? [0][1] And if I remember correctly, before Linux they were an SGI shop.

[0] https://www.youtube.com/watch?v=x9ikzGQW0ys

[1] https://www.youtube.com/watch?v=hnFSVx7NhmM


There's a QEMU implementation that now runs on iOS, and with services like AWS Cloud9 you can get away with developing on an iPad-only setup, at least for most webdevs. Sooner or later someone is going to be able to run Android on this: https://github.com/utmapp/UTM

And then the gates will open to ARM heaven; once developers find a way to do their jobs on an iPad, the rest of the industries will follow.


QEMU for iOS, Bochs, has been a thing for a long time. I remember running Debian on my iPod touch 3G.

edit: Contrary to my memory it's its own project and not QEMU based.


I still care about macOS because it is still the most pleasant desktop environment to use, in my opinion. I love macOS's pervasive support for PDFs and macOS's font rendering. I also love the iLife suite of applications. With that being said, I've been trying out alternatives to macOS since I've been disappointed with Apple's stewardship of the Mac under Tim Cook, but I keep going back to macOS, though I haven't upgraded to Catalina yet since I still rely on some proprietary 32-bit software packages that I don't feel like repurchasing until I make a final decision about how long I'll stay on the Mac.

I tried Windows 10 and I sometimes use it for work; while I can get by, the experience is just not as pleasant for me. While Windows 10 does have some nice features, including the wonderful WSL layer, I don't like the ads and the notifications to use MS services, I find the UI ugly (what's with the huge title bars in Windows 8 and Windows 10?), and the mandatory updates are annoying. I miss the days of Windows 2000, and I'm awaiting the day I can use ReactOS as a daily-driver OS.

As far as Linux desktops go, I feel most productive with Cinnamon. However, I find myself missing certain macOS features and certain proprietary software tools (such as Keynote and Microsoft Office), and I also find myself missing the fit-and-finish of macOS.

Nevertheless, barring a dramatic leadership change at Apple, I see the writing on the wall regarding the future of the Mac. I don't think the Mac will disappear, but I definitely see it becoming more locked-down and more iOS like, which is a trend I don't like. I'm going to stick to macOS Mojave on my 2013 MacBook Air and my 2013 Mac Pro, which are my daily drivers. But I can see myself switching sometime around 2022 or 2023.


I was kind of in your position, and I ended up taking the totally ridiculous route of downgrading to Mavericks.

I didn't feel like modern macOS was a platform I wanted to use anymore, but between Windows, Linux, and old macOS, old macOS clearly won out. It's snappy, stable, and performant; it's visually beautiful; and it has a plethora of well-designed software I enjoy using.

So I built a Hackintosh out of the fastest 2013-era hardware I could get. I'm hoping to stay here for the next decade. I'm not necessarily recommending this route to anyone else, but it's working out well for me.

The next step is to decide on a Mavericks-compatible portable device, because going back to High Sierra on my too-new laptop feels yucky. I kind of want to Hackintosh a tablet...


I completely understand you. Last summer I was learning GNUstep development on FreeBSD, and I decided to fire up an instance of Mac OS X Tiger running inside of QEMU on my Mac Pro to compile my code using XCode to see if it worked there. I was blown away by how polished Tiger is, even though it had been 14 years since Tiger was released. While for web browsing and security reasons I wouldn't use Tiger as my daily driver today, I found myself more impressed with Tiger than with any non-Mac desktop environment.

I admit, I have a soft spot for the classic Mac OS, stability issues aside (though I think projects like A/UX and Rhapsody were interesting examples of bringing the classic Mac UI to Unix). But Mac OS X through Snow Leopard was the pinnacle of the Unix desktop. Not only was it beautiful to look at, but it was a joy to use, and it ran on some of the best consumer computer hardware ever made. I still use macOS, but Apple as of lately doesn't spark joy for me anymore; it's just the best option out there.


What about browser support? How long will the latest Chrome/Firefox run on Mavericks?

I couldn't live without the latest Adobe CC as well. I feel like sticking to old OSs is less and less viable these days.


Chrome doesn't run on Mavericks, but Firefox does. I can't say how long that will last, but I can look at what's available on even older versions of OS X as a guide:

Mountain Lion and Lion can run Firefox Legacy, which is just a lightly edited branch of mainline Firefox: https://parrotgeek.com/fxlegacy.html

Snow Leopard can run ArcticFox: https://github.com/wicknix/Arctic-Fox

And Tiger and Leopard can run TenFourFox: http://www.floodgap.com/software/tenfourfox/

ArcticFox and TenFourFox don't quite have 100% compatibility, so they wouldn't be ideal, but they're still very usable.

So while browser support is definitely my biggest concern, I suspect there will continue to be options. It's also worth mentioning that the reason Firefox can't run on Snow Leopard or below is that Rust doesn't work below 10.7. Yet another possibility is running Google Chrome in Wineskin.

For Adobe, I'm stuck on CS6 anyway because I'm never paying for Creative Cloud. And I'm increasingly eying the Serif suite, which supports Mavericks.


> Apart from sentiment, does anyone, including Apple, still care about OSX?

Going to hazard a guess that the users of the approximately one hundred million Macs currently in use are among those who still care.

https://techcrunch.com/2018/10/30/there-are-now-100-million-...


> large app developers seem to not be embracing new systems/APIs, software largely isn't being ported

What makes you think that? Microsoft, Adobe, heck, even Valve have ports of their stuff on macOS. They seem to use new systems/APIs when appropriate, but maybe I'm not sure what you're talking about there. Care to elaborate on what you mean?

As for:

> What is the reason for macOS at this point?

Well, it's for people who want something more usable than Linux, but don't want to deal with Windows. Pretty much the same as it's been for decades. It also works very nicely with the iOS devices that many people love, so that helps, too.


The other thing people tend to overlook is that Apple has a history of creating new categories of devices. The iPad carved out a niche for itself over time. So did the iPod back in the day.

An iPad Pro with the fancy keyboard cover is basically a laptop in all but name, and I know a few non-technical people who insist that's all they need. The next obvious move is to bridge the gap between the two software ecosystems, which is something they started doing a few years ago by allowing iOS applications to run on macOS. Blurring the line further, an iPad with the keyboard attached a bit more firmly and the ability to run macOS software (similar to how Chrome OS can run Linux packages) would not be that much of a stretch.

So, an ARM MacBook with a touch screen wouldn't necessarily be that different from an iPad with a keyboard. The hard part is basically all software. Imagine an iPad where you can install your favorite Adobe tools, Homebrew, and an IDE. Probably a lot easier to sell than a MacBook that doesn't run most of the software you are used to, even though technically they aren't that different in terms of hardware and software.

Also, thinking beyond their current device categories: Apple has not done a lot with VR/AR and gaming-centric experiences so far. They've done a little bit via iOS and Apple TV, but they've arguably held back a little bit here.


+1, think you have it just right.

Apple is looking for new product lines. The Apple Watch is amazing both in design and manufacture, and also in how very useful it is.

I tend to use my iPad Pro for just about everything but writing code and it can plug into my large USB-C interfaced monitor. Get XCode on it and a data plan with phone service, and it could be a single device to do everything if it is very small and lightweight and can run everything.

Apple has a lot of talent and a ton of money, and I expect them to surprise us, in a good way.


Comments like these always confuse me, because it makes the incorrect assumption that people who love Macs are willing to put up with Windows or Linux, or that Mac users will bother going anywhere else, en masse.

Millions of people are still walking around with millions of 2016-2019 TouchBar MBPs. Every person or business who genuinely cares about the Mac Pro is still going to buy one if they can afford it. None of the things you're talking about actually matter to the average person.

If they go with ARM, Apple will ensure apps that people use will work. You think they're going to scare off Adobe from making apps? Kill off Logic Pro? Stop the web from working? What would Apple possibly do that bothers a normal person enough that they would check out of Apple's ecosystem?


The problem comes in when your apps break and the only way to fix them is to fork out even more money for software upgrades. For music production I have been badly burnt, and have moved to hardware synths that will last decades and a Mac with Logic that will never see another upgrade ever.

> user patience and goodwill, basically political capital with the user base.

Or they can go down the Trojan Horse route and bolt one of their A-series chips as a co-processor onto a board with an x86 CPU, begin the process of rewriting the OS and other assets in their control to run on ARM, and provide some encouragement for devs to make the switch over the next 4-5 years.

By having an ARM co-processor lightening the load running system assets they could balance the parts cost by using a previous generation x86 CPU.

They've already offloaded the System Management Controller, image signal processor, audio controller, and SSD controller to the T2 chip (64-bit ARMv8 A10 derivative: https://support.apple.com/en-us/HT208862) - the next step would be integrating T2 into a full blown CPU/SOC. Maybe T2 was the beginning of the Trojan Horse?


If Apple moves to a full ARM line-up, they will need to copy AMD's chiplet strategy.

As Apple understands economies of scale, they will need an architecture that will enable them to cheaply mass-produce millions of chiplets to combine them into low, medium and high-end cpus to suit their different use-cases.

Just like AMD is doing now.

There is no way Apple can create a monolithic Mac Pro-level cpu at low volume for a profit.

It might be easier to just go AMD with the whole Mac line-up. Requires only a little adaptation on OS level and will give users less nuisance trying to run their old apps.

In addition, they will gain performant GPUs as a bonus.


I thought most MacOS applications compile to universal binaries that are something like bytecode, not direct x64 code? Assuming I understand this correctly, most applications should run correctly on ARM without any modification?

Anyway, one of the big appeals of Apple computers and MacOS is that I can run whatever I want in a VM. That little one-off Windows utility that I use once a year that never made it to Mac? Runs fine under VMware. I also hear of plenty of people who dual boot.

What happens when Apple switches to ARM? Are we going to lose dual booting and rich virtualization? At least for me, as much as I love MacOS, virtualization has always been what made it work for me.


You have fat binaries, and a modified version of LLVM bitcode that is more stable than the official LLVM bitcode.

However, so far watchOS is the major consumer of bitcode.


> I thought most MacOS applications compile to universal binaries that are something like bytecode, not direct x64 code?

No, "Universal Binaries" are still native code: https://en.wikipedia.org/wiki/Universal_binary


> Are we going to lose dual booting and rich virtualization?

Virtualization would depend on Apple's chips getting hardware support for this, although you'd only be able to run ARM OSes.


Eh, if Apple releases an ARM-based Mac, they will start with notebooks, something similar to the MacBook Air. People should appreciate a laptop with better performance and longer battery life compared to the Intel versions. Most of the software should compile just fine, and if not, maybe a version from iPadOS will be an ok stopgap. I'm assuming of course that most Mac apps don't have any Intel assembly code in them. I'm guessing that may be the case for something like Photoshop.

I think you're right and will go a step farther to say that I wouldn't be the least bit surprised if everything without a 'Pro' in the name went to ARM over a relatively short period.

Switching the Pro lines over is a tougher row to hoe, and has much higher expectations from users. I do know that Apple was playing with ARM coprocessing chips to allow for certain types of computations to be performed while in sleep mode. Perhaps the Pros will end up with both.


I find these rumors continually terrifying. Boot Camp and VMWare Fusion is vital to my workflow. If Apple drops x64, they're dropping me as a customer.

Windows runs on ARM, as does Linux.

I'm sure if apple switched entirely to ARM, lots of other software would switch too, and we might even eventually see the end of the x86 era.


It's not Windows itself that people want to run (need to run, actually), it's Windows-based x64 software.

Windows is just one of many pieces of software people would need to run on an ARM-based Mac. Seems pretty clear that this can only work if the ARM-based machines have a pretty decent way of executing x64 binaries.


It seems like they only have 32-bit app support right now based on articles like https://www.laptopmag.com/articles/surface-pro-x-arm-app-com.... I'm curious what the blockers are for x64 software which would obviously be desirable in many ways.

Most likely Intel's patents on different ISA extensions. E.g. every x64 CPU is required to support at least SSE1 and SSE2.

Windows might, but I doubt the majority of Windows software does.


What's your workflow that you need those, but can't get a Windows PC?

A preference for doing a majority of my work in macOS, but some Windows applications that I need to use as well that cannot / will not ever be ported off of Wintel.

Back to the good old days of SoftPC, with a reliance on emulation.

I need Windows in a VM to test browsers. This should be possible on an Arm Mac with Arm Windows inside QEMU?

What makes you think Boot Camp won't be able to recompile its software on an ARM machine? If an ESXi-equivalent extension is brought into the chip, it is entirely possible to do.

ARM is on the rise within the industry. It's not a trend only for Apple, but Apple might try to move more quickly than other players. My guess is that they'll try to evolve iPad, as an ARM-based portable device, before they look to bring MacBooks over to ARM. This will be a really strong signal that it's time to seriously start porting.

Windows is also pushing more ARM support. As a result, some of the infrastructure e.g. Chromium and Electron is starting to get ARM support. I expect other tools, platforms, and frameworks to start getting ARM support, but it won't be quick, or uniform. This will speed up efforts across the industry. I think they really need to break with the past to get the huge battery life gains that are possible, and Intel/AMD don't have much to offer there.

I don't think any of this will happen overnight, but Apple will look to move things quickly. I'm not sure it's the same situation that Apple found themselves in when they moved off PowerPC. That was an existential issue for their hardware. Now they are on the incumbent platform, so there is time to make a more considered transition. They have iPad as a place to experiment as long as they maintain the general usability and experience of that platform.


Moving personal computing to ARM is going to enslave us, the end users. The temptation for the OEMs to lock everything down and integrate all components into non-extensible hardware is too strong.

Since we're on the subject, how far are we from a truly open source machine?

There already are some: https://puri.sm/products/

Definitely more than a decade. There is no true open GPU core equivalent; there are some open CPU cores, but these are basic compared to what ARM has on offer.

The IBM PC compatible's "openness" is a complete fluke, and the conditions that caused it no longer exist. Ultimately we're all going to get dragged along for the ride in whatever direction the masses decide they can put up with.

It had been coming since before ARM's rise [1]. But you are right that it is coming. Signed bootloaders are simultaneously fantastic and horrible. I think open hardware initiatives may be a potential way out.

But will we see a ban on anonymity in the coming decades? Only licensed devices could connect to the Internet? It would be similarly fantastic and horrible IMO.

[1] https://boingboing.net/2012/01/10/lockdown.html


Well, there are obvious reasons - price, for one. An A13 chip might cost Apple $25 to produce, which is far cheaper than an Intel Core i9-9900K, which would cost us $479; even for OEMs at bulk prices it would cost south of $420. Even if Apple produced a chip four times as dense as the A13, which is already quite dense at 8.5 billion transistors, it would not cost them what Intel is charging. And Intel has been overcharging for a decade, as is evident from AMD's processor pricing. The A13 is also an energy-efficient product at 7nm compared to any desktop processor at this time.

I never felt enslaved on the 16 bit single vendor home computers.

By the way, PC "openness" only happened due to IBM's mistake and Compaq's cleverness in how they reverse engineered the PC.


One other question that I haven't seen addressed is, what about GPUs?

Apple makes a GPU for the iPads that seems to compare very favorably to AMD's laptop offerings. I imagine anything written in Metal should "just work," but what about OpenGL? It's now deprecated in MacOS but still sort of on life support, and it seems a little premature to just cut the cord on that.

Or could AMD GPUs work on ARM? That also seems like kind of a heavy lift…


The current AMD GPU in the 16" is the first one in a while (except maybe the Vega series) which has 'acceptable' performance, and there's all kinds of stink about how much extra power/heat the GPU generates when it is in use, even though it's been that way with the MacBook Pro since Apple went to the dual graphics models.

"ARM started as a branch of Acorn Computer in Cambridge, England, with the formation of a joint venture between Acorn, Apple and VLSI Technology. A team of twelve employees produced the design of the first ARM microprocessor between 1983 and 1985."

https://en.wikipedia.org/wiki/ARM_architecture



Now merged hither.

I have a pessimistic view on Apple and Macs based on how this product line has been neglected for years (and continues to be neglected: one case in point is the iMac Pro which is three years old with no updates). It seems like the focus is on "iOSifying" the entire ecosystem (overall, Apple hasn't done much, or what was expected/possible, on the Mac over the past several years anyway).

Due to that pessimism, I'd expect the first ARM based Macs that are released publicly to run inferior Apple software (this is already evident with the Apple apps in the last few years). Many things will be further crippled and lobotomized with even more sandboxing restrictions, and likely no longer resemble what a "Mac app" has meant for a very long time. There have been signs of the "non-nativeness" with the Catalyst apps. Perhaps the Finder would be thrown out in favor of a Files app ported from iOS/iPadOS. What about browser rendering engines? Would third party ones be disallowed, like they are on iOS/iPadOS/tvOS? Safari on Mac has already removed support for WebExtensions last year, leaving users who have more than casual browsing requirements to switch to Firefox (or Brave or Chrome or other browsers). With battery life, thinness and lightweight laptops considered very important by Apple, there could be many things getting axed on the new ARM Macs. Perhaps, the very first ARM based Mac would actually be a new iPad with an integrated keyboard and trackpad.

The article talks about splitting the Mac line, which certainly seems to be a practical possibility. Would we then have macOS that runs on the ARM MacBook, Mac mini and iMac lines and another macOS Pro that would continue to run on the x64 Mac Pro for a much longer time? The stage is already being set to encourage developers to build iOS/iPadOS apps and have them sold as a single bundle on Mac. It's just a matter of pushing them further with the "promise" of a single build running on all platforms (thanks, Catalyst).

Just a crazy thought on emulation of x64 apps: would Apple embrace a "streaming app" emulation, similar to game streaming services where any necessary heavy lifting is done on the server side, with the local ARM processors just handling the minimal (pre-computed) rendering and I/O?


>I have a pessimistic view on Apple and Macs based on how this product line has been neglected for years

Which is unfortunately true. They still have this idea that every update needs to be "meaningful". That would have been true if the Mac were still targeting consumers. But most consumers have moved on to Tablet and Phones. The Mac market is only left with "Prosumers" and Professionals.

It took them 6 years for a Mac Pro replacement and 3 years to admit their keyboard mistake. And the new Magic Keyboard may be more reliable, but honestly its shallow typing feel still sucks.

The growth rate of AirPods, Apple Watch and Services would suggest Apple no longer gives a damn about the Mac.

While it is all very bitter, they will continue to milk the market; after all, there are over 20M iOS developers, which is nearly 20% of active Mac users. I am not sure if that number counts developer accounts or individuals, but it is clear that if you want the 1.2B-device iOS market you have to develop on the Mac. And you will have to continue to pay the premium for it.


Apple does not make Macs for developers; they only care about "creative professionals" - which technically should include developers, but their definition only covers people who produce or edit video and pictures.

When was the last time they had a video of developers talking about how great macOS was at WWDC or one of those media events? It's almost always some celebrity with a British accent or something.


Do note that every Apple engineer works on a Mac.

The iMac Pro was released in December 2017. No real comment on the rest of your thesis, just wanted to correct that little bit.

Thanks for pointing that out. I can't edit and correct my comment now. My intent was to point out the long gaps between hardware updates (depending on which line one follows).

> one case in point is the iMac Pro which is three years old with no updates

I think they updated the GPU at some point. I'm not sure they have great options for upgrading the CPU; the Mac Pro uses broadly the same ones!

> Perhaps the Finder would be thrown out in favor of a Files app ported from iOS/iPadOS.

... wait, why? What does that have to do with ARM?


>> Perhaps the Finder would be thrown out in favor of a Files app ported from iOS/iPadOS.

> ... wait, why? What does that have to do with ARM?

Catalyst is already bringing (or aiming to bring) iOS/iPadOS apps to macOS, and Apple's own apps over the last two years have used it (take News as one example). Apple also demonstrates a tendency to throw things out and rewrite them without the replacements being good enough (we've seen that time and again: iPhoto replaced by Photos, iTunes split into a few other apps, etc.).
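
For anyone who hasn't touched Catalyst: the mechanism is essentially the iPad codebase recompiled for the Mac, with Mac-specific behaviour branched at compile time. A tiny Swift sketch of how that branching looks (a hypothetical view controller, not from any real Apple app):

    import UIKit

    // The same UIKit controller ships on iPad and, via Catalyst, on the Mac.
    class DocumentListViewController: UIViewController {
        override func viewDidLoad() {
            super.viewDidLoad()
            #if targetEnvironment(macCatalyst)
            // Running as a "Mac app" built from the iPad code:
            // tweak for keyboard/pointer-first use.
            title = "Documents"
            #else
            // Plain iOS/iPadOS behaviour.
            title = "Files"
            #endif
        }
    }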


> Many things will be further crippled and lobotomized with even more sandboxing restrictions

Agree on the first half, strongly disagree on the second. We need way more sandboxing restrictions on all operating systems. I have yet to see someone run into sandboxing problems on macOS (in production) who wasn't doing some evil shit that should be difficult.

As a developer it's a bit annoying. As a user it's why the OS experience sucks less than Windows and Linux for user applications. The .app and permissions design is fantastic - the experience of developing for it not so much, but that could be improved.

I really don't want your app spawning off god knows how many processes doing god knows what with unfettered access to my filesystem and network.
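
To make the "permissions design" point concrete: a sandboxed Mac app opts in with the com.apple.security.app-sandbox entitlement, and anything outside its own container has to come in through an explicit user action. A rough Swift sketch of the usual pattern (assuming an AppKit app; the function name is made up):

    import AppKit

    // Minimal sketch: in a sandboxed app the open panel runs out of
    // process, and the sandbox grants access to exactly the URLs the
    // user picked -- the rest of the filesystem stays off limits.
    func pickAndReadFile() {
        let panel = NSOpenPanel()
        panel.canChooseFiles = true
        panel.canChooseDirectories = false
        panel.allowsMultipleSelection = false

        if panel.runModal() == .OK, let url = panel.url {
            // Only this user-chosen file is readable here.
            if let text = try? String(contentsOf: url, encoding: .utf8) {
                print(text)
            }
        }
    }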


Couldn’t agree more!

> I really don't want your app spawning off god knows how many processes doing god knows what with unfettered access to my filesystem and network.

Look no further than apps like Zoom if you think this isn’t rampant. There are lots more examples of course!


(GP here) I agree, and would like to have more restrictions on what data apps can access. The problem I see is how to make it effective while also keeping it easy to use. I don't see the point in an app asking the user (through the system) for access to the Documents or Desktop folder and the user granting it. I'd prefer something more granular... something that fits the "sandbox" model with precise access. I realize this may be complex, but there's hardly any point arguing about restrictions if granting access means opening up a really big chunk of data (an entire folder) to an app.
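
For what it's worth, the sandbox already has a primitive for exactly this kind of per-item grant: security-scoped bookmarks, which let an app persist access to the one file or folder the user picked and nothing else. A rough Swift sketch (helper names are made up; the .withSecurityScope options are macOS-only):

    import Foundation

    // Persist a user's one-time grant so the app can reopen just that item later.
    func saveGrant(for url: URL) throws -> Data {
        return try url.bookmarkData(options: [.withSecurityScope],
                                    includingResourceValuesForKeys: nil,
                                    relativeTo: nil)
    }

    // Resolve the stored grant and open a scoped window onto that single item.
    func reopenGrant(_ bookmark: Data) throws -> URL? {
        var isStale = false
        let url = try URL(resolvingBookmarkData: bookmark,
                          options: [.withSecurityScope],
                          relativeTo: nil,
                          bookmarkDataIsStale: &isStale)
        guard url.startAccessingSecurityScopedResource() else { return nil }
        defer { url.stopAccessingSecurityScopedResource() }
        // ... read/write only this granted item here ...
        return url
    }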

> I have yet to see someone run into sandboxing problems on MacOS (in production) that wasn't doing some evil shit that should be difficult.

What do you mean here? Photoshop, Lightroom Classic, Illustrator, Visual Studio Code, Adobe Premiere, Logic Studio, Final Cut, Unity, Steam, Chrome, Terminal, iTerm2, Blender, and Cinema 4D are some examples of applications that are not sandboxed. Are those apps all doing evil stuff?


> Macs based on how this product line has been neglected for years (and continues to be neglected

This honestly seems like something out of a script handed to everyone out to rag on Apple at this point. Kind of like "PCs get viruses all the time!" - something that may have been true at one point but is no longer applicable, yet is still used as a rallying cry by out-of-touch pitchforkers.

The 16" MacBook Pro (which is fucking amazing and addresses almost all concerns). The Mac Pro. The Pro Display XDR. All of their product lines are going strong, all having received substantial updates, but no, let's pick that one straw.

Most of that comment ("perhaps the Finder would be thrown out in favor of a Files app ported from iOS") reads like rambling scaremongering.


My wish for Apple is for them to spend at least one release cycle fixing their backlog of bugs and polishing their existing features instead of cramming in new ones. Their hardware is fine, but post-WWDC 2019 has been a very annoying time to be an Apple user, software-wise.

I can't help it if you choose to see this as a script handed to people. Look at the history of Mac hardware releases over the last six years and see for yourself how much neglect there has been. The 16" MacBook Pro does zilch for all the other MacBook* lines with the terrible keyboard that Apple still continues to sell at the same old prices. You'd have a better argument if/when Apple publicly acknowledges the keyboard issues and recalls/replaces those machines with the newly designed keyboard for all the people still suffering even after multiple keyboard replacements. The 16" MacBook Pro does not address the concerns of owners of the previous MacBook* machines, and the presence of the terrible butterfly keyboard in hardware that's currently on sale shows how much Apple cares or is willing to do for its customers.

The Mac Pro is a niche product meant for a select (few/several) thousands of people. It's not really something that will make a meaningful impact for most people who have used a Mac and are looking to buy a new Mac.

As for the Finder and Files part, we'll get to know if it's scaremongering or not. Those who have seen and experienced the Catalyst apps from Apple on the Mac know what the future may look like. It is up to Apple to prove them wrong with a much better UX that doesn't throw out what a Mac has meant to users and user experience for the last couple of decades (speaking only from the time of OS X).


What I see happening is the MacBook Air using ARM processors. At least that's where I see things going.

> Why invest in the development of such a high-end chip for Mac Pro’s low volume?

- Control of the whole stack (Apple's stated M.O.)

- Consistent architecture across all their product lines

- If it's roughly twice as energy efficient, as claimed, the floor-standing heat sink known as the Mac Pro could potentially be smaller and lighter again (or they could keep the case the same and increase internal expansion, as they did when they switched the cheesegrater from G5 to Xeon, or they could double the core count with the same cooling)

- Depending on how much cheaper it would be, you could put the same CPU into less expensive Macs (high-core-count-Xeon-level performance in an iMac or Mini would be quite compelling)

How much difference is there in architecture between a Xeon-level chip and an i3-level chip? I assumed most of the added cost of a Xeon was market segmentation, so if you had another way to do that (e.g., by only selling CPUs in complete systems), you'd be able to increase performance on the low end without losing money.


I'd love to see an ARM-based MacBook, especially if it can last 20+ hours on battery, is highly portable, and has great thermals/low heat output. I believe history can repeat itself: over the past four decades we've seen Macs transition from Motorola 680x0 to PowerPC, from PowerPC to Intel x86, and from 32-bit Intel x86 to 64-bit-only x86-64. This brings back memories of fat binaries and PPC emulation.
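
For those who missed the earlier transitions: a fat (universal) binary is one Mach-O file containing a slice per architecture, and the loader picks the matching slice at launch. A toy Swift sketch, purely illustrative, assuming a hypothetical x86_64 + arm64 universal build:

    // Each architecture gets its own compiled slice; lipo (or the build
    // system) glues the slices together into one fat binary.
    #if arch(x86_64)
    let slice = "Intel (x86_64)"
    #elseif arch(arm64)
    let slice = "ARM (arm64)"
    #else
    let slice = "some other architecture"
    #endif

    print("This process is running the \(slice) slice of the binary.")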

>Nuvia [1] is developing a chip to power cloud servers. Williams, who spent nearly a decade at Apple, says he raised the possibility of developing such technology years ago, but the idea was rejected by then-Chief Executive Officer Steve Jobs, who died in 2011, and by Johny Srouji, who’s now Apple’s head of hardware technology, because they thought it would detract from the company’s work on consumer facing technology. Apple continues to hold that position today, according to Williams.

>"In 2010 [2], Williams and [co-worker Jim] Keller raised this idea with Mike Culbert, their former supervisor at Apple. Culbert suggested that they put together a presentation for Steve Jobs, pitching the idea of Apple building a server chip. Williams and Keller did so, and Culbert presented that opportunity to Jobs," the court filing reads.

"Following the meeting, Culbert reported to Williams that Apple would not be pursuing the server chip project because Jobs was only interested in pursuing Apple’s development of consumer-based products."

I don't know why anyone would think Apple is making a desktop-class CPU for the Mac. As a matter of fact, even Jim Keller tried to pitch the idea of an ARM server chip to Steve Jobs [2]. Apple was very late to its own datacenter play. I am not sure the economies of scale would have worked in Apple's favour if the server and the Mac Pro were using the same chip.

Basically, I still don't think the pros are so clear. It is far easier to switch to AMD as a cost-saving measure, and even that Apple couldn't be bothered to do.

[1] https://finance.yahoo.com/news/nuvia-exec-sued-apple-says-18...

[2] https://www.theregister.co.uk/2020/02/14/nuvia_apple_server/

