
My understanding is that iCloud reports orders of magnitude less CSAM than other cloud services at a similar scale. My guess is that Apple wanted a way to report the CSAM they are currently storing without having to decrypt/inspect every person's personal data (which necessarily opens vectors for, e.g., rogue employees doing bad things with people's personal data).

Hence this approach was presented by Apple as a privacy win: they catch the CSAM and they don't have to look at your photos stored online.




"Apple's privacy measures, such as not scanning your Cloud photos, is what helps enable CSAM."

this is what baffles me.

I'm sure a good number of iPhone customers have their photos automatically uploaded to iCloud.

For all intents and purposes, the end result is the same for most people. Apple is scanning your photos, and doing so on your phone feels much more intrusive.


That's why the CSAM scanner is on your device. It computes the hashes in place, on the still-unencrypted images, before uploading encrypted copies to iCloud.

That's why from some perspectives it is a net privacy win versus Google's and Microsoft's similar tools, which require them to hold decryption keys for their clouds in order to process these CSAM scans and other FBI/TLA/et al. warrants. Apple is saying they don't hold backdoor keys to iCloud at all, and that if they are forced to do CSAM scanning it has to happen on device, so nothing beyond the device ever has access to the unencrypted images. Only if you hit the reporting threshold (supposedly 30+ hash matches) would copies also be encrypted to a reporting database on iCloud (and again, only if you were uploading those photos to iCloud in the first place).
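
For concreteness, here is a toy sketch of that on-device flow: hash the photo locally, encrypt it, and ship only ciphertext plus an opaque voucher. This is not Apple's actual protocol (real matching used the NeuralHash perceptual hash plus private set intersection); the exact-match hashing, XOR "encryption", and names below are stand-ins for illustration only.

    # Toy sketch of the on-device flow described above: not Apple's actual
    # protocol. Real matching used a perceptual hash (NeuralHash) plus private
    # set intersection, and the server could only learn about matches past a
    # threshold; everything here is a simplified stand-in.
    import hashlib
    from dataclasses import dataclass

    @dataclass
    class UploadPackage:
        ciphertext: bytes      # the only image data that leaves the device
        safety_voucher: bytes  # opaque to the server below the reporting threshold

    def image_hash(image_bytes: bytes) -> bytes:
        # Stand-in for a perceptual hash; a real system matches visually
        # similar images rather than exact bytes.
        return hashlib.sha256(image_bytes).digest()

    def xor_encrypt(data: bytes, key: bytes) -> bytes:
        # Placeholder "encryption" so the example runs; not real cryptography.
        return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

    def prepare_upload(image_bytes: bytes, device_key: bytes,
                       known_bad_hashes: set) -> UploadPackage:
        h = image_hash(image_bytes)
        matched = h in known_bad_hashes  # the comparison happens on the device
        # The voucher wraps the result so the server cannot read it directly;
        # in the real design, threshold cryptography (not a device-only key)
        # is what lets the server learn about matches only after ~30 of them.
        voucher = xor_encrypt(h + (b"\x01" if matched else b"\x00"), device_key)
        return UploadPackage(ciphertext=xor_encrypt(image_bytes, device_key),
                             safety_voucher=voucher)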


Apple has, through side channels, let it be known that iCloud is the largest open host of CSAM among big tech. It's the only large provider hosting images that doesn't automatically scan them. The only difference is that Apple wants to do it while leaving your photos in the cloud encrypted. This isn't rational; it's an anti-Apple culture-war position.

Scanning iCloud is far less privacy preserving. That means Apple “knows” about all of your photos.

With what Apple is proposing, thanks to the cryptographic safety vouchers uploaded with each image, Apple knows nothing about any of your photos unless you reach a threshold of about 30 CSAM images.

That is, only if roughly 30 of the images can be cryptographically proven to be among the CSAM images present in multiple databases can anyone other than the user learn anything about the photos.
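
The threshold mechanism is usually described as threshold secret sharing: each matching image contributes one share of a decryption key, and with fewer than the threshold number of shares the key (and therefore anything about the photos) stays unrecoverable. Here is a minimal Shamir-style sketch of that property; the field, parameters, and API are mine for illustration, not Apple's actual scheme.

    # Minimal Shamir secret sharing over a prime field, illustrating the
    # "nothing is learnable below ~30 matches" property. Illustrative only.
    import secrets

    PRIME = 2**127 - 1  # a Mersenne prime, large enough for a short secret

    def make_shares(secret: int, threshold: int, count: int):
        # Random polynomial of degree threshold-1 with the secret as the
        # constant term; one share is one point on the polynomial.
        coeffs = [secret] + [secrets.randbelow(PRIME) for _ in range(threshold - 1)]
        def f(x):
            acc = 0
            for c in reversed(coeffs):
                acc = (acc * x + c) % PRIME
            return acc
        return [(x, f(x)) for x in range(1, count + 1)]

    def recover(shares):
        # Lagrange interpolation at x = 0 recovers the constant term.
        secret = 0
        for i, (xi, yi) in enumerate(shares):
            num, den = 1, 1
            for j, (xj, _) in enumerate(shares):
                if i != j:
                    num = (num * -xj) % PRIME
                    den = (den * (xi - xj)) % PRIME
            secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
        return secret

    if __name__ == "__main__":
        key = secrets.randbelow(PRIME)
        shares = make_shares(key, threshold=30, count=100)
        assert recover(shares[:30]) == key  # 30 shares: key recovered
        print(recover(shares[:29]) == key)  # 29 shares: almost certainly False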


Not sure why you're downvoted (too many facts in one post I guess lol), but you're right. I wanted to add that Apple has already been doing CSAM scanning on photos uploaded to iCloud for years. They moved it to the client instead (I think in preparation for making iCloud Photos E2E encrypted).

Turn off iCloud Photos and it turns off CSAM scanning.


What Apple forgot to market was the fact that it was only scanning photos that were destined for upload to iCloud Photos, which most people turn off because they don't want to pay more than $1/month for iCloud storage (or $4 if you happen to have a family).

Thankfully, they went forward with "encrypt everything in iCloud E2E"[0] despite not having a way to detect CSAM anymore, probably much to law enforcement's chagrin.

0: https://support.apple.com/en-us/108756


It's worth looking into the privacy concerns that Apple was actually trying to fix here:

1. Photos (and other documents) are currently uploaded to iCloud unencrypted

2. These photos are already scanned for CSAM after upload

3. Because the photos are not encrypted, any government can, at any point, file a court order to have those photos released.

4. The court order can require Apple not to notify the user, or the public in general.

5. [Speculation] Such orders might already exist and be somewhat common within Apple

Apple wanted to fix this and introduce end-to-end encryption for all photos uploaded to iCloud, but scanning for CSAM was non-negotiable (due to internal or external politics?). They had to keep doing it.

So they implemented this big mess of a workaround: scan for CSAM before upload and attach a cert with a decryption key only to photos that match, so that once a user had enough matches they could have a human verify and weed out false positives (which Apple acknowledged will happen) before notifying law enforcement.

Because of the direction Apple came from, and how much effort they put into designing this system to maximize privacy, they saw this solution as a large privacy win over the existing situation. It's not surprising that Apple might have been blinded to the privacy concerns of doing AI scanning of photos on user devices; they were looking at it from the wrong angle.
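
As a rough sketch of the server side of that design (the names, structures, and counter below are my guesses, not Apple's implementation): vouchers accumulate per account, and only once enough of them correspond to database matches does anything become decryptable for human review. Note that in the published design the server can't even distinguish matches from non-matches below the threshold; the plain counter here is purely illustrative.

    # Hypothetical server-side flow: nothing is reviewable below the threshold.
    from dataclasses import dataclass, field
    from typing import List, Optional, Tuple

    MATCH_THRESHOLD = 30  # the widely reported figure, treated as a constant here

    @dataclass
    class Voucher:
        # Exposing the share directly is a simplification; in the real design
        # the server cannot tell matches from non-matches below the threshold.
        key_share: Optional[Tuple[int, int]]  # present only for database matches
        encrypted_payload: bytes              # e.g. an encrypted low-res derivative

    @dataclass
    class Account:
        vouchers: List[Voucher] = field(default_factory=list)

    def ready_for_human_review(account: Account) -> bool:
        # True once enough key shares exist to reconstruct the review key
        # (e.g. via the Shamir recovery sketched earlier) and decrypt the
        # flagged payloads for manual false-positive checking.
        matches = sum(1 for v in account.vouchers if v.key_share is not None)
        return matches >= MATCH_THRESHOLD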


Hopefully so they can remove their current ability to decrypt user photos for whatever reason they want. As it stands, they can decrypt any user's photos on iCloud. Client-side scanning and this CSAM detection implementation could allow them to remove their ability to decrypt EXCEPT in very specific situations.

It's not true end-to-end encryption, since in some cases the content can be decrypted without the user's key, but it's significantly closer than what they have today.

That being said I don't know if that is their plan or not, but it is a plausible reason to make this change.


Yes, of course that is true. I use iCloud Photos and find this terribly creepy. If Apple must scan my photos, I'd rather they do it on their servers.

I could maybe understand the new implementation if Apple had announced they'd also be enabling E2E encryption of everything in iCloud, and explained it as "this is the only way we can prevent CSAM from being stored on our servers."


This entire analysis is based on a flawed assumption: that the disparity in reported CSAM numbers is a matter of proactiveness rather than of the nature of the platforms.

A social networking site/platform is one where previously created media is shared. iMessage pictures are not uploaded to iCloud Photos by default (they are part of the usually-E2E content, with the standard online-backup caveat); only photo albums or photo rolls are, and those are overwhelmingly first- or second-party, just-created content.

This means that, when comparing against known CSAM hashes, Apple is extremely likely to find orders of magnitude less content in the first place. The only thing their system can catch is first-party images that end up being distributed and registered with the various hash databases.

Regardless of whether we are talking CSAM or anything else, it is the norm for the amount of content “created” to be significantly (as in several orders of magnitude) less than the material consumed.

I don’t think scanning on the device vs scanning in the cloud is going to change any of that.


It seemed clear they were making moves in this direction back when they announced on-device hash checking for CSAM prior to iCloud Photos backup. That announcement only made sense in a world where they wanted to enable end-to-end encryption for photos. It's cool to see them do this, and to see them also extend it to Messages (surprising imo).

--

> The apple policy was likely about coming up with a way to enable encrypted photos on iCloud while still having some privacy preserving form of CSAM detection. Since it was only enabled when iCloud photos was enabled it was better for privacy on net than the status quo (unencrypted iCloud photos that are accessible to apple and scanned anyway).

https://news.ycombinator.com/item?id=30297272


Apple is scanning files locally, before they are uploaded to iCloud, in order to avoid storing unencrypted photos in iCloud while still discovering CSAM. All the other storage providers already scan all the images uploaded to their servers. I guess you can decide which is better. Here is Google's report on it:

https://transparencyreport.google.com/child-sexual-abuse-mat...


"Some metadata and usage information stored in iCloud remains under standard data protection, even when Advanced Data Protection is enabled. For example, dates and times when a file or object was modified are used to sort your information, and checksums of file and photo data are used to help Apple de-duplicate and optimize your iCloud and device storage..."

Photo checksums can't be E2E encrypted, huh? They announced today that they've abandoned their plans to do CSAM scanning on people's devices[1], and connecting the dots, it seems like they won't need to, since they can just do it in the cloud.

[1] https://www.wired.com/story/apple-photo-scanning-csam-commun...
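
To make that speculation concrete: if Apple retains plain checksums for de-duplication, matching them server-side against a list of known hashes would be as simple as the sketch below. This is just the comment's guess made explicit, not a documented Apple mechanism, and exact checksums would only catch byte-identical files (unlike perceptual hashes).

    # Hypothetical server-side checksum matching; the checksum function and
    # hash-list format are assumptions, not anything Apple has documented.
    import hashlib

    def load_known_hashes(path: str) -> set:
        with open(path) as f:
            return {line.strip() for line in f if line.strip()}

    def stored_checksum(photo_bytes: bytes) -> str:
        # Stand-in for whatever checksum iCloud keeps for de-duplication.
        return hashlib.sha256(photo_bytes).hexdigest()

    def matches_known(photo_bytes: bytes, known_hashes: set) -> bool:
        return stored_checksum(photo_bytes) in known_hashes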


Most people have misunderstood how this system actually worked. I read every paper carefully, and it was a very innovative system for protecting your privacy IF we compare it to existing CSAM solutions. Of course the best option is not to scan at all, but that is not really an option.

Only those pictures which were about to be uploaded to iCloud would have been scanned, and Apple would get information about image contents only if an image was flagged. The phone is a black box and it scans your files all the time anyway, sending metadata to Apple, e.g. for the Spotlight feature and photo albums, or simply when syncing your data to the cloud. There is massive AI behind that Spotlight feature. Nothing would have changed; just a different kind of information would have been flagged, and this time encrypted.

The major improvement was an E2EE-like system for Photos. Currently they are not E2E encrypted in iCloud; they are plaintext for Apple. The iOS 15 beta also had encryption for backups, but it never reached a public release after the CSAM plan was delayed. So we lost yet another feature which would have increased privacy. "But it happens on my device" is not really a valid argument, since most people don't understand what happens on their devices in the first place. Even AV engines send all your data to the cloud, you can't opt out in most cases, and it applies to every file.


Presumably, it’s done this way so they can say that computers other than your personal device do not scan photos and “look” at decrypted and potentially innocent photos. And technically the original image is never decrypted in iCloud by Apple: if 30 images are flagged, they are then able to decrypt the CSAM scan metadata, which contains resized thumbnails, for confirmation.

In summary, I’m guessing they tried to invent a way where their server software never has to decrypt and analyze original photos, so they stay encrypted at rest.


I agree with you in that I do not understand why anybody doing something illegal would upload related data to cloud storage.

But if nobody would import CSAM into their iCloud library, why do all the pictures need to be scanned in the first place? I would imagine anybody doing seriously illegal stuff would be informed about such measures so as not to get caught.


It's important not to conflate the new features. CSAM detection uses hashes of known photos and is only run on photos going to iCloud (turning off iCloud turns off the scanning). Photos sent to iCloud have been checked against CSAM hashes on the server for years. The change here is moving it from the server to the client (which I hope is in order to make iCloud Photos E2E encrypted).

Completely agree with your second point. All the 'what ifs' have existed forever. Either iOS users trust that Apple will only do what it has stated, or they don't. Nothing has changed.


I want to point out that people who use iPhones don't have any obligation to use Apple's iCloud services.

As the technical summary [1] explains, this scanning process only affects data that is being uploaded to iCloud – which, by the way, never offered end-to-end encryption of photos in the first place. Apple is fully within reason to prevent CSAM content from reaching their servers, and if you read the technical paper they seem to have implemented the technique in a way that goes out of its way to avoid collecting data about you.

Instead of using iCloud, you can have your photos automatically upload to any other cloud photo service including self-hosted solutions like Nextcloud Photos. At no point does iPhone ownership require that you use Apple's services beyond the App Store. You can completely back up and restore your phone and sync a variety of content entirely with local storage via a cable or WiFi just like it was an iPod. Or, you can use a wide variety of non-Apple apps for storing and accessing content.

You can enable/disable each iCloud service individually.

I know the author has some decent points here, and Apple and Google's smartphone duopoly certainly needs more customer protections, but the article as a whole feels ruined by a bunch of whining about a single customer-specific issue (something about migrating a developer account? I have no idea what he's talking about), no more useful to the rest of us than a blog post complaining about a restaurant server who forgot to bring a side of BBQ sauce.

[1] https://www.apple.com/child-safety/pdf/CSAM_Detection_Techni...


Because the idea was that the iCloud data would be encrypted so their servers couldn't scan it, with the plan being to do on-device scanning of photos that were marked for storage in iCloud.

It’s objectively better than what Google does, but I’m glad we somehow ended up with no scanning at all.

