In some ways, this is Apple pushing things forward... but in other ways, it's just them catching up. Until this update, there was no way to get a 10-bit display working with Apple products at all, even with third-party monitors that were 10-bit capable -- the OS simply wouldn't support 10-bit output. You could get it in Windows, and almost certainly in Linux with the correct voodoo.
My post guys wanted to stay on the Mac platform, so we dealt with 8-bit displays (color wasn't a huge part of our workflow -- but 10-bit would've been nice). But it was one of those head-smacking moments for a platform that was supposedly media-friendly.
Note, there's some confusion in the comments so far -- 10-bit is a professional display feature, and is extremely uncommon. Pro cameras and software can usually handle higher bit depths -- 16 for DSLRs, 10-12+ for pro video cameras -- but you're probably not seeing them unless you've carefully set up your computer. Those bits are still useful: they're the raw material for generating your 8-bit final image, so you don't get banding when adjusting color or exposure. And they're essential for containing a wider range of possible values, letting pro cameras represent a wider dynamic range than is possible in 8-bit systems.
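To make the banding point concrete, here's a toy Python illustration (not any real imaging pipeline): darken a gradient by a stop and brighten it back, once through an 8-bit intermediate and once through a 12-bit intermediate, then count how many distinct 8-bit output levels survive.

```python
# Toy illustration: why extra intermediate bits prevent banding when grading.
levels_8bit = list(range(256))  # a full 8-bit gradient, 0..255

# 8-bit intermediate: halving rounds to 8-bit steps, so doubling back
# can only land on even values -> half the levels are gone (banding).
via_8bit = sorted({(v // 2) * 2 for v in levels_8bit})

# 12-bit intermediate: each 8-bit step has 16 sub-steps, so halving
# loses nothing and the round trip recovers every original level.
via_12bit = sorted({((v * 16) // 2) * 2 // 16 for v in levels_8bit})

print(len(via_8bit))   # 128 distinct levels left: visible banding
print(len(via_12bit))  # 256 distinct levels: no banding
```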
And 8 bits is more typical -- it's literally baked into many file formats, like JPEG. Some crappy screens on consumer electronics can't even represent the full 8 bits per channel; 6 bits + dithering is sadly really common, even in screens that advertise themselves as 8/24-bit. Also, color depth can be reported in both bits per channel and total bit depth: a 10-bit-per-channel display carries 30 bits of color information per RGB pixel, 10 each for red, green, and blue.
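The per-channel vs. total arithmetic is easy to check directly:

```python
# Bits-per-channel vs. total bit depth for an RGB display.
def colors(bits_per_channel, channels=3):
    """Total distinct colors at a given per-channel bit depth."""
    return (2 ** bits_per_channel) ** channels

print(colors(8))   # 16,777,216 -> the classic "millions of colors" (24-bit)
print(colors(10))  # 1,073,741,824 -> "billions of colors" (30-bit)
print(colors(6))   # 262,144 -> what a 6-bit panel can really show
```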
(Some of the confusion is probably intentional; I had a hardware partner brag about their '15 bit display,' which sounded very weird to me... until I realized it was really 5 bits per channel, which is roughly bad-ATM-screen quality.)
Do you mean 10 bits per channel? (i.e. 30-bit RGB.) The 10" Retina iPads have routinely scored very well as 24-bit displays (not the best out there, but they seem acceptable to a lot of professional photographers).
Please note that the 6-bit image is not what will be present on an Apple screen. The difference between 8-bit and dithered 6-bit is difficult to notice for most people.
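For the curious, here's a minimal sketch (my own toy code, not how any particular panel controller works) of how ordered Bayer dithering lets a 6-bit panel approximate 8-bit: neighbouring pixels alternate between the two nearest 6-bit levels so their average matches the 8-bit value.

```python
# Ordered (Bayer) dithering sketch: 8-bit input -> 6-bit panel levels.
BAYER_2X2 = [[0, 2],
             [3, 1]]  # classic 2x2 Bayer threshold matrix

def dither_to_6bit(value_8bit, x, y):
    """Map an 8-bit value (0-255) to a 6-bit level (0-63) at pixel (x, y)."""
    # Each 6-bit step spans four 8-bit values; the low 2 bits decide how
    # often we round up across the 2x2 dither pattern.
    base, frac = divmod(value_8bit, 4)
    if frac > BAYER_2X2[y % 2][x % 2]:
        base += 1
    return min(base, 63)

# A flat 8-bit grey of 130 becomes a 2x2 patch of 6-bit levels 32/33
# whose average (32.5), scaled back by 4, is exactly 130.
patch = [[dither_to_6bit(130, x, y) for x in range(2)] for y in range(2)]
print(patch)  # [[33, 32], [32, 33]]
```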
Was just quoting the author. But it also appears that Apple disagrees with you, since they turned off subpixel AA five years ago in macOS Mojave. See [1]. Indeed, when I screen-captured and scaled up the non-western script in the article, there was no subpixel AA on my Mac.
> Actually I am using a Retina display. I’m still noticing the difference very much.
I'm curious why you think you notice a difference. Macs have never had subpixel rendering for Retina, so if I'm understanding you correctly, it's not something you could know from experience. Why do you think it would make a difference you could notice if they did?
(And of course, Retina rendering is far sharper than lo-dpi subpixel rendering.)
I suppose you might be able to take an image of subpixel-rendered text at, say, 24 pt, scale the bitmap to 50% width/height on a Retina display, put it next to the same text at 12 pt, and see if you could tell the difference. Although you'd need to be careful to get the same precise vertical baseline alignment.
One cool thing about the way Apple does drawing is that the APIs use floats for positioning instead of integers (at least on iOS, and I think on OS X too via the CG* APIs -- I'm assuming this is a PostScript/Display PostScript holdover). So a Retina pixel is just a whole-number pixel + 0.5. This was a pain before Retina screens appeared, since anything not positioned on a whole number would render blurry. But then Retina displays appeared and the reason for it clicked.
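The half-pixel effect is easy to sketch (hypothetical helper names, not the real CoreGraphics API): a logical coordinate renders crisply only when it lands on a whole device pixel after scaling.

```python
# Sketch: logical point coordinates -> device pixels at a backing scale.
def to_device_pixels(point_coord, scale):
    """Convert a logical coordinate (in points) to device pixels."""
    return point_coord * scale

def is_pixel_aligned(point_coord, scale):
    """Crisp rendering requires landing on a whole device pixel."""
    return float(to_device_pixels(point_coord, scale)).is_integer()

# At 1x, a half-point offset falls between pixels and renders blurry;
# at 2x (Retina), the same 10.5-point coordinate is exactly pixel 21.
print(is_pixel_aligned(10.5, 1))  # False -> blurry on a non-Retina screen
print(is_pixel_aligned(10.5, 2))  # True  -> crisp on a 2x Retina screen
```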
> Apple didn’t release devices with Retina displays until they could be confident that such devices were at least no worse than non-Retina devices in pretty much any regard
That's... not how it works. More pixels mean more CPU, GPU, and memory cycles to fill them, and more DAC cycles to read them out and scan them to the display. That's just fundamental.
> The 12.9-inch Liquid Retina XDR display has an IPS LCD panel supporting a resolution of 2732x2048 pixels for a total of 5.6 million pixels with 264 pixels per inch. [1]
"Liquid Retina XDR" is just a high-end LCD. Micro-LED isn't yet on anything they sell. All Apple devices phone-size and smaller currently use OLED, and everything larger uses IPS LCD.
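The quoted numbers check out, for what it's worth -- a quick sanity check of the pixel count and density:

```python
# Sanity-checking the quoted Liquid Retina XDR spec:
# 2732x2048 pixels on a 12.9-inch diagonal.
import math

w_px, h_px = 2732, 2048
diag_in = 12.9

total_px = w_px * h_px            # 5,595,136 ~= "5.6 million pixels"
diag_px = math.hypot(w_px, h_px)  # pixels along the diagonal
ppi = diag_px / diag_in           # ~264.7, which Apple rounds to 264 ppi

print(f"{total_px:,} pixels, {ppi:.1f} ppi")
```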
At minimum, it's faulty marketing on Apple's part. They shouldn't state that it supports millions of colors when it actually doesn't. Just like the CRT monitors that were 17" but only 15" viewable -- sometimes that stuff doesn't change until someone files a lawsuit.
It's all in Apple's implementation of Retina in OS X. The display runs at its native 1600p but runs applications at a "logical" 800p resolution. It's as if points and pixels suddenly had different meanings.
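The points/pixels relationship above boils down to a scale factor -- a sketch with assumed helper names (the real macOS property is the window's backing scale factor):

```python
# Sketch of HiDPI "logical resolution": the OS reports sizes in points,
# the framebuffer is addressed in pixels, related by a scale factor.
SCALE = 2  # Retina backing scale factor

def points_to_pixels(points, scale=SCALE):
    return points * scale

def pixels_to_points(pixels, scale=SCALE):
    return pixels / scale

# A display running natively at 2560x1600 pixels presents apps with a
# 1280x800-point coordinate space: laid out at "800p", drawn at "1600p".
print(points_to_pixels(800))   # 1600 pixel rows backing 800 point rows
print(pixels_to_points(2560))  # 1280.0 points across
```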