That's about the only reason to care as far as I can tell.
I was pretty surprised to see that relatively recent ports like Bioshock Infinite won't be getting a 64-bit update. We had 64-bit games back in 2005 starting with the original Far Cry, so I'm not sure how this could even be a problem nearly 15 years later.
If you have no use for 32-bit then you shouldn't care...
For modern software in most fields, 32-bit isn't critical.
However, some fields still depend on 32-bit products (or frameworks) which would break on Catalina.
Why would someone use 32bit?
There are closed-source products that were never released as 64-bit. For software developers that's quite rare, but in some fields there are no real alternatives to those products, so users have to keep older OS versions or run a VM.
As long as the hardware, the operating systems, and the software ecosystem around these products kept supporting 32-bit, the products never had any critical incentive to transition.
Perhaps now they will, or other pieces of software will come along and fill the void.
I'm still pretty bummed about what this will mean for WINE, though: a lot more work for those devs, or a serious reduction in the amount of software that will work with it.
> I'm still pretty bummed about what this will mean for WINE, though
Why? All current Intel chips support the 32-bit ISA. You can still use VMs to run 32-bit software. Also, IIRC, Wine uses X11, so you can even attach to it over SSH X forwarding from a simple Docker container.
> Wine uses X11, so you can even attach to it over SSH X forwarding
Absolutely true. I once had to set up a client-server Windows-based CRM in a small company. The server had a native Linux version; the client app was Windows-only. Client machines ran Windows, and users also wanted to work from home. For some reason (some kind of connection glitches or whatever, I can't remember anymore) I chose to install XFree86 on the Ubuntu server (actually one of several command-line-managed VirtualBox VMs on another Ubuntu server), set up the app for a direct client-server connection, let users connect to the Linux box over OpenVPN, and run the Win32 app with Wine over SSH on their Windows machines :-)
I fell in love with X once I had managed to set this up (quite easily, BTW). I certainly don't want Linux to lose such a beautiful ability as it switches to Wayland. (I don't worry much about Wine, though: all the distributions but Ubuntu keep shipping multiarch libs and it works just fine. But I won't update my Mac, as it has just a 128 GiB SSD and I have no disk space to spare for a Windows VM.)
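For the curious, that kind of setup boils down to a couple of commands. A rough sketch, with hostnames and paths made up for illustration:

```shell
# On the Windows client, with a local X server (e.g. VcXsrv) running
# and an SSH client that supports X11 forwarding:
ssh -X user@crm-server.example.com

# Then, inside the SSH session on the Linux box, DISPLAY has already
# been set by sshd, so Wine draws its windows on the client's X server:
wine "$HOME/.wine/drive_c/CRM/client.exe"
```

The app runs entirely on the server; only the drawing commands travel over the (VPN-protected) SSH connection.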
This idea that vendors should seek to actively break working software simply because it's "old" is repellent to me. I use several 32-bit apps and components regularly because they continue to function better than the alternatives. Many are unlikely to ever be rewritten or replaced because the developers, ecosystems, and business models that made them possible are today either gone or moribund.
And this is how we end up with long lifetimes for software like Internet Explorer, which holds back most people's experience of the web, especially on mobile: a lot of extra scripts and code has to be shipped to maintain compatibility (polyfills, newer features that can't be used and must be compiled around, etc.), which directly cuts into the value that can be delivered to end users, either by costing more or by leaving less time to deliver it.
More people usually have to suffer if software ends up long-lived without being pushed to upgrade when newer technologies would improve the experience. You're asking others to pay a heavy price a lot of the time, when choices have to be made between compatibility and improvements, and that price often gets discounted unreasonably.
> More people usually have to suffer if software ends up long-lived without being pushed to upgrade when newer technologies would improve the experience. You're asking others to pay a heavy price a lot of the time, when choices have to be made between compatibility and improvements, and that price often gets discounted unreasonably.
I look at all of the Mac software I've lost and will lose to upgrades over the years, including the offline-only parts of my workflow that held back only myself, and weigh them against this bright new future of low-effort blown-up tablet apps on my desktop that they were sacrificed for. A heavy price certainly has been paid here.
Just think: much of the software infrastructure you rely on is probably running COBOL or other very, very old tech, and you really don't want it rewritten today. The industry (think CNC controllers) also relies on tried-and-true interfaces that are simply unavailable today (USB is utter rubbish in a high-electrical-noise environment), and there is barely a replacement either. Not to mention the cost of retraining blue-collar operators...
The real world is not the web cocoon, and even there, switching front-ends every 2 years is not acceptable.
Disclaimer: I know that for a fact, as I'm in the process of replacing a CNC control system with modern parts. Trust me, it's HARD.
If only IE can render a web page, that's not a problem caused merely by IE getting old: it was always a problem of bad work on that page, and people rightly complained about it back when it happened.
It's crazy that we can't run old software on vastly more powerful hardware. Imagine if we suddenly couldn't read old books.
I don't believe that people are actively breaking software. It is simply a side-effect of how progress happens in this industry.
Computers are made up of many small parts, and the same goes for software: every component comes from a different vendor, and they all work asynchronously, adjusting to the changes around them every time they update or ship a new component.
Each individual change looks like a tiny leap-frog, but if you don't update or re-architect for a long while, catching up becomes an insurmountable task, because the whole world has moved away from under you. Not through monumental breaking changes, but through thousands of small changes that have added up.
There are closed-sourced products that never released as 64bit
Yep. I have a number of 32-bit programs that were never and will never be ported to 64 bits because the author has died, the company is out of business, or other reasons.
Now I will have to buy replacements for these programs just so I can run an OS that gets the latest security updates.
It can also affect hardware.
For example, I have two Apple AirMac routers that can only be configured with a version of Airport Utility that runs on old hardware. I'm lucky to have an old iLamp around that I can use when this is necessary, but when that dies, I have to throw away two perfectly good routers just because Apple chooses not to support them in the current version of Airport Utility.
In all fairness, how many non-Mac folks, or even Mac folks who don't listen to ATP, would know colloquial terms like the "MacBook One" and the "MacBook Escape"?
The G4 iMac was discontinued in 2004, over 15 years ago. I'm very much an Apple nerd, but I hadn't heard it called that before today.
The other reason I can think of is archival/history: we shouldn't disregard everything we've made in the past just because it doesn't run on the latest hardware. It's still a good part of our history that we should learn from.
I just wish Apple would include a free downloadable 32-bit image of macOS Mojave (or some specific version) that could always run on top of Apple's Hypervisor framework and would be maintained indefinitely, for running this 32-bit software when we can't find a proper 64-bit version, and so that we don't "violate their EULA" if we buy a new Mac that didn't come with Mojave.
IIRC, you could still run these apps on Snow Leopard in VirtualBox by using Rosetta. It's a PITA, but I would still say they should offer a Snow Leopard image for the purpose of Rosetta support.
I'm not saying Apple will do anything like this but it would be nice of them to do it.
Maybe a point in time image wouldn't be a bad thing, but what do you mean by "that'll be maintained forever"? That it'll be downloadable, or that Apple should be responsible for security updates, etc, on those legacy OSes?
Rosetta relied on a binary recompilation technology they licensed from Transitive. IBM bought it up and changed the licensing such that it was no longer economical for Apple to ship it, which is why it dropped really suddenly.
Free copies of Snow Leopard would likely cost them a lot in fees to IBM.
I am not going to argue with the legitimate claims of inconvenience (or worse) thanks to deprecating old APIs.
But we have a natural experiment of the opposite: Microsoft has bent over backwards to maintain back compatibility. And those old APIs have been the major sources of vulnerabilities, fragility, and maintenance cost. They do occasionally bite the bullet (e.g. old versions of Word are no longer readable) but they go out of their way not to.
I really don't like Windows for my own technical reasons but I have tremendous respect for the amount of effort MS has put in over the years on this front and I sympathize with the consequences.
Apple has a much smaller "surface" (both vulnerability attack surface and the surface area of their own code base), which makes many technical decisions much easier.
So either way means a choice; you can't say one is absolutely superior to the other.
I'm not sure this is all that accurate a description of the current state or priorities of either Windows or Mac OS. Microsoft actually did exactly what the other poster is suggesting - use virtualization to provide support for a previous OS (XP mode in Win 7). And that was 10 years ago.
Apple makes previous versions of macOS available to download, and you should be able to convert those into something your virtualization software can boot.
Games. Seriously. A lot of old (and even new) games depend on 32-bit binaries/libraries. That's also the reason for the recent "clash" between Canonical's Ubuntu and Valve's Steam[1].
Honestly I can't recall which generation dropped it, but presumably the moment the last non-64-bit device reached its EoL, they had no reason to keep paying the development cost of a 32-bit OS, the cost of maintaining support for communication between 32- and 64-bit apps, or the cost to users (doubling the size of the resident system memory, confusion over some apps running on some devices but not others, etc.).
And then there's the fact that if you drop 32-bit, you gain space on your silicon.
What irks me is that devs saw the first 64-bit devices come out and didn't go "I should recompile my code". Honestly, the fact that there are so many new apps made for PC that aren't 64-bit is mind-blowing to me.
> Honestly, the fact that there are so many new apps made for PC that aren't 64-bit is mind-blowing to me.
Assuming you mean Windows, there are still a nontrivial number of PCs in the wild that are running a 32-bit OS. It's not huge -- about 1.5% of gaming PCs according to the Steam survey, probably a bit higher for home and office computers -- but either way it's enough that building applications as 64-bit-only is problematic.
Are they 32-bit CPUs or 32-bit Windows? I recall Windows inexplicably treated (treats?) 32-bit and 64-bit as separate purchases for some reason (I assume money, given they limit the supported core count according to which edition of Windows you buy).
But more seriously: why ship 32-bit-only software? 64-bit has been supported by every CPU Intel and AMD have shipped for more than a decade now. I had a consumer 64-bit Athlon in 2005/6 that I could afford on a student budget. Given the perpetually increasing CPU demands of modern games, throwing away something on the order of 15-20% perf by not supporting 64-bit just seems insane.
Seriously: make 64-bit your primary target, and if that hurts 32-bit perf, that's those users' problem: they chose not to buy a high-perf system.
That's pretty much it. It's really annoying to have to dual-boot Windows so that I can play a game like Civ IV. It's sad that a number of somewhat recent games, as well as games I want to fire up for nostalgia purposes (e.g. StarCraft), don't work anymore.
Having spare time to play Civilization is a luxury (I could only afford this when I was a child; now I only have a humble amount of time left after working, studying, sleeping and working out, and I prefer to spend it with my loved one, usually outdoors). You'd better just enjoy it and not waste it on thoughts about annoyances.
As for StarCraft, you can run it in VirtualBox; perhaps even DOSBox with Windows 95 might be enough.
Nevertheless, I agree people should care about keeping old games playable: they are an important part of the human cultural legacy, and a fairly serious form of art, as are paintings, literature, etc.
They're already working on getting Starcraft to 64 bit [1], but Starcraft is also interesting in that this is the second time this has happened. As a PowerPC application, it also required Rosetta back in the day, and now will need further recompilation for 64-bit only Macs.
True, Blizzard in particular does a fantastic job supporting their older games. It's still a bummer for other games, such as the beloved (by me) Escape Velocity series, whose publisher is now defunct.
It's also annoying for people with some commercial software. My dad uses an older non-subscription version of Photoshop and some desktop publishing software that I believe will be incompatible with Catalina.
My Windows 10 machine runs 32-bit apps alongside 64-bit apps just fine. It's no big deal on the Intel architecture. Why is this such a hard thing for Apple to do?
For the same reason it is hard for Microsoft to run 16-bit apps alongside 64-bit apps, even though Wine shows it is perfectly possible for the vast majority of them (those that could run under 16-bit protected-mode Windows 3.x, which is pretty much all of them): it isn't really a matter of it being hard; it is more a matter of them not wanting to bother (or "allocate the resources to do so", in corporate-speak).
MacOS has supported 32-bit apps and 64-bit apps for years. I don’t know why Apple has decided to drop support for 32-bit apps.
I still use Photoshop CS6 because I never signed up to the subscription model so if I upgrade to Catalina I won’t have Photoshop anymore (unless I run it in a VM).
The linked source is directly from Adobe and directly contradicts the claim that "you need 13.1 to launch in 64-bit."
The source says:
> Photoshop CS6 and CC only install a 64-bit version on Mac OS.
Further, it says:
> Photoshop CS5 installs a version that can launch in either 32 bit or 64 bit when you install on a 64-bit version of Mac OS (Mac OS X 10.5 or later).
If CS5 install can launch in 64-bit, it wouldn't make sense for CS6 to be 32-bit only from the first version.
Even further, I know from personal experience that your claim is highly unlikely. This is a screenshot of Photoshop CS5 (12.0.4) running in 64-bit on my machine (High Sierra). Notice it says "12.0.4 x64."
I am not sure of the source of your information, but it's simply incorrect.
Edit: This is another article directly from Adobe.
It says:
> On Windows, both Adobe Photoshop CS6 and Adobe Photoshop CS6 Extended have the option to run natively in either 32-bit or 64-bit editions. On Macintosh, only a 64-bit edition is available.
I don’t know what to tell you. Every time I launch CS6 on my Mac it tells me that it won’t run on Catalina.
I have no reason to believe that’s not true.
I think it's quite possible that when Adobe says "64-bit" they're not saying exactly the same thing that Apple means. I'm no expert with Mach-O binaries and macOS library linking.
I would imagine it might actually be an indication that some ARM-based Macintosh models are on the horizon, since they dropped 32-bit on the ARM side a while back.
Apple killed 32-bit ARM in the A11 and as a result gained extra space on the chip for more cores and the Neural Engine. It's highly expected that they will eventually build ARM Macs, and that can't happen if 32-bit software is common.
I can't see how the two things might be related. You still have to recompile everything when moving from x86 to ARM, and in general, code that has issues moving from 32 bits to 64 does so because it's full of 32-bit assembly (which would be rewritten anyway in order to move to AArch64) or because it does nasty stuff like storing pointers in 32-bit integers (instead of using uintptr_t and the like).
I can't really see any extra complexity arising from doing x86→AArch64 instead of x86_64→AArch64. I think they are doing this to force people to rebuild old apps, so that:
1. They might consider providing Apple a Bitcode™ representation that they can use in the future when switching to Aarch64;
2. They get rid of all those old nasty Carbon apps;
3. They kill the viability of older third-party tools, such as Wine on macOS (which will be either forced to switch back to using X11 and providing 32-bit Darwin libraries, or to writing an insane amount of thunks), further pushing people towards either rewriting old apps with SwiftUI or Electron (both alternatives that would make an ARM switch quick and easy)
If you’ve ever followed Raymond Chen’s Microsoft Blog, you would get an idea of the hacks upon hacks upon hacks that it takes to keep backwards compatibility in Windows. It’s also well known how slow the Windows release cycle is because Microsoft won’t break compatibility.
Windows is a behemoth compared to MacOS in terms of processor requirements and memory. There is a reason that Microsoft couldn’t get Windows running well on low end and low power ARM processors.
But how far back would you like Apple to go with backwards compatibility? 68K programs? PPC programs? Keep porting Carbon APIs to newer platforms forever?
Right, imagine simplifying and shedding all of that technical debt. You can move faster, reduce the QA matrix, and cut down the number of permutations that developers need to consider and support.
Apple's disregard for backwards compatibility is really the main reason I avoid macOS and iOS nowadays. I got an iMac back in 2009 and an iPod Touch a year after it came out, and with every subsequent update more and more stuff broke, to the point where most of the software I had stopped working, including some software I paid for (while you can say the fault also lies with the developers, you cannot expect developers to update their software for eternity for free). While Windows software doesn't work 100% of the time, it works often enough to be practically "100%": unless a program digs deep into the system, chances are it will work either out of the box or with some tinkering.
My only fear for the future is that since Microsoft doesn't seem to see Windows as primarily a product to sell to people anymore, they dampen their backwards compatibility efforts. I hope that doesn't happen.
With iOS you have a real argument, in that the advantages that come from being able to move the platform along more quickly and deploy resources much more efficiently weigh directly against loss of old software that is no longer maintained (games in particular, a lot of real gems didn't make it from 32-bit).
With macOS however, and for that matter Windows or Linux or anything else on the PC, I think it's arguable that the last 10-15 years of improvements and universality of virtualization (and emulation for that matter) have changed the calculus completely. Even with Windows you say
>Windows software doesn't work 100% of the time
But in my experience it does work 100% of the time... in my Win98 or Win2000 or WinXP VMs, which are trivial to spin up at will. And I also have macOS (previously Mac OS X or OS X) VMs going back to 10.6 Server, the first version that supported virtualization. macOS is more heavily dependent on hardware acceleration, but even there, the advent of PCI passthrough means it can be solved at a negligible performance penalty. And for OSes even older than that, there is now such a delta in raw CPU power that emulation can cover it.
So I think the balance has shifted in terms of the value of maintaining BC in a PC OS. It's not useless by any means; nobody wants software breaking every single year. But compared to even 5 years ago, I now lean towards it being fine to rely more on virtualization beyond the 5-7 year mark, and certainly the 8-10 year mark, whenever there are real benefits to moving forward.
Virtualized environments are akin to using a separate and worse computer, just with the same keyboard, mouse, and monitor. You cannot, e.g., copy/paste an image from a host app to a native app and back (or if you can, it isn't as seamless).
> ...you cannot expect developers to update their software for eternity for free
You already know why Apple doesn’t do long backwards compatibility. You were talking about developers of your apps, but Apple is a developer too, and faces the same kind of issues.
These days (unlike the old days) MacOS gets regular, free updates. There’s going to be a cost to that somewhere. Macs and iPhones are probably expensive enough as it is (those hardware sales are what’s paying for OS and platform development).
I do not see how not expecting some random developer to update their software for eternity for free has anything to do with Apple's backwards compatibility.
Developers in general do not update their software for eternity just to keep up with OS breakage. At best, at some point they'll ask you to move to a newer version (which you may or may not like; it is far from rare for software to devolve in subsequent versions, especially paid software, which often feels the need to amass features). At worst (and this is very common) the software will be abandoned and you'll lose it.
Apple has the power not to break those programs; the individual developers don't. If Apple breaks some API or other functionality that 1000 programs rely on, then 1000 developers and their users will need to waste time fixing the breakage. If Apple doesn't break it, nobody needs to do anything on that front. It is all in Apple's hands, not anyone else's.
And BTW, I think my previous message was misunderstood. As I wrote in another reply, I wouldn't mind at all having paid upgrades if that guaranteed backwards compatibility; I know it is hard and time-consuming. I still consider it extremely valuable for an OS.
> You want to take advantage of apples free os and app upgrades but still run 13 year old software without paying for upgrades.
I never mentioned price; I wouldn't care if I had to buy the macOS upgrades, and in fact that is what I did when they were paid. My issue is with backwards compatibility.
I'm not sure whether by "without paying for upgrades" you mean OS or app upgrades, but I'll guess the former (which I explained in the previous paragraph), as the latter makes no sense considering the "13 year old" part.
> Why upgrade your Mac? Just keep the one that is working with the software you have for 13 years
Because I also want to run newer software that requires functionality introduced in later versions.
> You want the latest, bug fixed, secure software and OS . The Apple side is free but you will need to sort out the application side .
I'd rather pay for the Apple side and be able to use the software I want to use.
> BTW ms OS upgrades are not free. Me thinks you want too much
Methinks you misunderstood my message. I don't care if OS upgrades were paid; I care about backwards compatibility. If anything, at the very end of the message you replied to, I mention that my fear about Windows is Microsoft no longer seeing the OS as a primary source of income and hence dampening their backwards-compatibility efforts.
To be clear: I have zero issues with paying for new versions of the OS if it remains backwards compatible. OS price wasn't a factor in my message at all (apart from the fear expressed at the end, which was aimed mainly at Microsoft).
32-bit macOS uses the older, fragile ObjC runtime, meaning no macOS frameworks present on 32-bit macOS could adopt newer features. In fact, they couldn't even add new ivars, because that changes the class layout and would be ABI-breaking.
That's on top of the increased disk space and memory used by having two copies of everything.
32-bit code is stuck with the older ObjC runtime, which can't support a bunch of new language features and suffers from fragile-ABI issues that make changing the internals of classes extremely problematic.
It's not simple to fix ("just upgrade the runtime to support the new features") because the newer runtime trades on the fact that a bunch of unused space in 64-bit pointers can be exploited for tagging, pointerless classes, and a bunch of other things. And if they did fix the fragile-ivar problem in the 32-bit runtime, you'd need to recompile to use the new way of dynamically calculating struct offsets to member fields, and these apps are mostly dying with 32-bit precisely because no one is recompiling them.
It’s basically dead-end code that is difficult to maintain and which increasingly requires the rest of the OS to stagnate just to keep it running.
Also: the Carbon API set is only supported on 32-bit, and removing 32-bit means all of Carbon can go away. Carbon is a massive API set, with one or sometimes two full legacy APIs (often dating back to the 90s) for everything.
Drawing (QuickDraw), multimedia (QuickTime), open/save dialogs (NavServices), fonts/text rendering (Font Manager, ATSUI), windows/menus/dialogs (both HIToolbox and the classic Managers), file I/O, printing, and on and on. Keeping all this stuff functional (to support old apps) while evolving the system underneath is a huge burden which will be lifted from 10.15 on.
Another factor is that if they eventually move Macs to their ARM chips, they need to deal with the fact that the Ax CPUs lack 32-bit support and are 64-bit only.
'Doom 3', 'Quake 4', 'Star Wars: Jedi Academy', 'Bioshock: Infinite'...this reads like a list of my favorite MacOS games. I know I won't be upgrading, between this and the way it will kill some of my favorite legacy audio plugins.