
The author of that article is way off. First, his evidence is that an email attachment triggered a report, and we already know email is the only data stored unencrypted in iCloud. Email scanning is the most plausible explanation for the ToS update.

If Apple were scanning iCloud Photos, one would expect hundreds of thousands, if not millions, of reports to NCMEC (last year Facebook reported 20 million). Reporting is compulsory, the information is public, and Apple reported 265 last year. Do the math. Apple is not scanning Photos.
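To make that math concrete, here is a rough back-of-envelope in Python. The 25% user-base ratio is a guess (a reply below uses the same figure), so treat every input as an assumption rather than a real statistic:

    # Naive scaling: if Apple scanned iCloud Photos at Facebook-like rates,
    # roughly how many NCMEC reports would we expect? All inputs are rough
    # assumptions for illustration only.
    facebook_reports = 20_000_000  # Facebook's reports last year (from the thread)
    icloud_user_ratio = 0.25       # guess: iCloud has ~25% of Facebook's users
    apple_reports = 265            # Apple's actual reports last year

    expected = facebook_reports * icloud_user_ratio
    print(f"expected: ~{expected:,.0f}")  # ~5,000,000
    print(f"actual:   {apple_reports}")   # short by roughly four orders of magnitude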




If they weren't doing any scanning, how would they find anything to report? The data is encrypted at rest, so there would be nothing for them to find. This clearly doesn't include search requests [1].

iCloud has perhaps 25% of the users of Facebook. Of that 25%, it's not clear how many actively use the platform for backups/photos. iCloud is not a platform for sharing content the way Facebook is. So how many reports should we expect to see from Apple? It's unclear to me.

So, I'm not saying the number isn't suspiciously low. But it doesn't really clarify for me what's going on...

[1] https://www.apple.com/legal/transparency/pdf/requests-2018-H...


OP's article cites the number of NCMEC reports from Apple vs. other tech giants (200-something vs. 20-million-something at Facebook). It is all a bit confusing, and I expect most of us are learning more about iCloud than we ever planned to; Apple has been able to decrypt our iCloud photos all along, but those reporting figures make it pretty clear that they haven't been doing so en masse. This is a big shift.

This article speculates that that's because Apple is not scanning on iCloud to respect their privacy policy: https://www.hackerfactor.com/blog/index.php?/archives/929-On...

Apple's report count to the NCMEC is really low so it's probably true that they are not scanning on iCloud unless they receive a warrant.


As a reminder for context, Apple already did a server-side check for iCloud photos [1]. I think that's missing from the debate; what's changing is moving the check on-device, which many object to (and I and others have written enough about that).

To add to this, though: I think they built an extremely complex - and hopefully perfectly robust - system to make the check on-device. On top of the other issues, it's massively important that said process is not exploitable.

The content of the NCMEC database is not public, one can argue for good reason, and if that effort to move on-device led to the database being leaked (and thus bad actors being able to "mark" content that they shouldn't share), that would be incredibly counterproductive. Maybe even just knowing that the database was updated is too much information to leak about this?

I think that's an additional concern on top of the ones already expressed many times, and I sincerely wish, even more now, that Apple had kept the checks server-side, as I only see downsides to not doing so.

[1]: https://nakedsecurity.sophos.com/2020/01/09/apples-scanning-...


> Do you have any reasons to think that they're not scanning photos on their servers? AFAIK every major storage cloud does that, including Apple.

Apple already scans iCloud Mail for CSAM, but not iCloud Photos: https://9to5mac.com/2021/08/23/apple-scans-icloud-mail-for-c...


Wait, if this is really true, why are people freaking out? Wasn't Apple already scanning photos uploaded to iCloud? And now it just happens on-device?

> To help address this, new technology in iOS and iPadOS* will allow Apple to detect known CSAM images stored in iCloud Photos.

After reading OP, my understanding had been that this update would cause _all_ photos on Apple devices to be scanned. However, the above quote from Apple's statement seems to indicate that only photos uploaded to iCloud Photos are scanned (even though they are technically scanned on-device).

This doesn't invalidate any of the points people are making, but it does seem the update does not directly affect those of us who never stored our photos “in the cloud”.


> 1. Photos (and other documents) are currently uploaded to iCloud unencrypted

Incorrect. Photos and documents are encrypted both in transit and at rest. They are not E2EE, however - i.e. Apple holds the decryption keys. (A sketch of the difference is at the end of this comment.)

> 2. These photos are already scanned for CSAM after upload

Apple does no scanning in iCloud. All of your data (except email) is stored encrypted and it cannot be scanned. Apple escrows the decryption keys, which are provided only to comply with legal requests for user data.

> 3. Because the photos are not encrypted, at any point, any government can file a court order to release those photos.

The second part is true, but the first is not.

> 4. The court order can require Apple to not notify the user, or the public in general.

This is simply the law. If there is no non-disclosure order, it's Apple's policy to notify.

iCloud encryption docs: https://support.apple.com/en-us/HT202303

Legal process docs: https://www.apple.com/legal/privacy/law-enforcement-guidelin...
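To illustrate the at-rest vs. E2EE distinction, here is a minimal sketch using Python's cryptography package. It is my own illustration of the two key-ownership models, not Apple's actual implementation (which uses per-file keys, HSMs, and so on):

    from cryptography.fernet import Fernet  # pip install cryptography

    # Encrypted at rest, provider escrows the key (the iCloud Photos model):
    escrowed_key = Fernet.generate_key()      # generated and held by the provider
    blob = Fernet(escrowed_key).encrypt(b"photo bytes")
    # The provider can decrypt unilaterally, e.g. to answer a legal request:
    Fernet(escrowed_key).decrypt(blob)

    # End-to-end encrypted, key never leaves the device:
    device_key = Fernet.generate_key()        # exists only on the user's device
    e2e_blob = Fernet(device_key).encrypt(b"photo bytes")
    # The provider stores e2e_blob but has no key, so no court order served
    # on the provider can produce the plaintext.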


> The system was designed to work only with images as they were being uploaded to iCloud

Not iCloud, since that would imply it was scanning entire device backups stored in iCloud.

It was intended to scan just the photos that you uploaded to their publicly accessible Photo sharing website, iCloud Photos, in a way that even Apple wouldn't have access to the results unless you uploaded more than 100 photos that matched known kiddie porn.


>There were some reports that Apple has been scanning iCloud Photos server-side already; apparently that hasn't been the case ever.

This article from Forbes notes otherwise, citing a warrant filed in Seattle for a case.

https://www.forbes.com/sites/thomasbrewster/2020/02/11/how-a...

I get the sense the interview is saying that they haven't scanned photos - as in looking at actual contents - but they've always checked hashes. It is super confusing, though, and I'd like to see Apple clarify this further.
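For what it's worth, the hash-checking model is just a set-membership test on fingerprints. A toy sketch (my own, with an invented placeholder database; real systems use perceptual hashes like PhotoDNA rather than SHA-256, so that near-duplicates also match):

    import hashlib

    # Hypothetical database of digests of known images. The matcher holds
    # only fingerprints; it never receives the images behind them.
    known_bad_digests = {hashlib.sha256(b"stand-in for a known image").hexdigest()}

    def matches_known_image(image_bytes: bytes) -> bool:
        # Compares a fingerprint against the set. Unlike ML-based content
        # scanning, nothing here "looks at" what the photo depicts.
        return hashlib.sha256(image_bytes).hexdigest() in known_bad_digests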


It's worth looking into the privacy concerns that Apple was actually trying to fix here:

1. Photos (and other documents) are currently uploaded to iCloud unencrypted

2. These photos are already scanned for CSAM after upload

3. Because the photos are not encrypted, at any point, any government can file a court order to release those photos.

4. The court order can require Apple to not notify the user, or the public in general.

5. [Speculation] Such orders might already exist and be somewhat common within Apple

Apple wanted to fix this and introduce end-to-end encryption on all photos uploaded to iCloud, but scanning for CSAM was non-negotiable (due to internal or external politics?). They must keep doing it.

So they implemented this big mess of a workaround to scan for CSAM before upload and attach a cert with a decryption key only to photos that match, so that once a user had enough matches they could verify by hand and weed out false positives (which Apple acknowledges will happen) before notifying law enforcement.

Because of the direction that Apple came from, and how much effort they put into designing this system to maximize privacy, they saw this solution as a large privacy win over the existing situation. It's not surprising that Apple might have been blinded to the privacy concerns of doing AI scanning of photos on user devices; they were looking at it from the wrong angle.
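The "enough matches" part is essentially a threshold scheme. Here is a toy version of the idea using Shamir secret sharing - my own illustration with made-up numbers, not Apple's actual construction: each matching photo's voucher carries one share of a per-account key, and the server can only recover that key (and thus see anything at all) once it holds at least the threshold number of shares:

    import secrets

    PRIME = 2**127 - 1  # Mersenne prime; the field for this toy scheme

    def make_shares(secret, threshold, count):
        # Random polynomial of degree threshold-1 with f(0) == secret.
        coeffs = [secret] + [secrets.randbelow(PRIME) for _ in range(threshold - 1)]
        def f(x):
            return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
        return [(x, f(x)) for x in range(1, count + 1)]

    def reconstruct(shares):
        # Lagrange interpolation evaluated at x = 0 recovers the secret.
        secret = 0
        for i, (xi, yi) in enumerate(shares):
            num, den = 1, 1
            for j, (xj, _) in enumerate(shares):
                if j != i:
                    num = num * -xj % PRIME
                    den = den * (xi - xj) % PRIME
            secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
        return secret

    account_key = secrets.randbelow(PRIME)
    vouchers = make_shares(account_key, threshold=30, count=100)  # 30 is illustrative
    assert reconstruct(vouchers[:30]) == account_key  # at threshold: recoverable
    assert reconstruct(vouchers[:29]) != account_key  # below it: still opaque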


> Do they mean they haven't been doing it for iCloud Photos, but were arbitrarily doing it for other parts of iCloud?

I read elsewhere that they scanned Mail but not Photos.


> The voucher generation is actually exactly what enables us not to have to begin processing all users’ content on our servers, which we’ve never done for iCloud Photos.

I really dislike this statement. It's likely designed to be "technically true". But it's been reported elsewhere that they do scan iCloud content:

https://nakedsecurity.sophos.com/2020/01/09/apples-scanning-...

Perhaps they scan as the data is being ingested. Perhaps it's scanned on a third-party server. But it seems clear that it is being scanned.


Glad to see someone mentioning this. It's a strange about-face for them, as I thought they still scanned iCloud.

> no way for anyone to independently verify that the scope of the content being scanned has not been secretly increased.

This seems like the easiest thing out of the lot to verify.

The way that this system is designed to work is that when uploading to iCloud Photos, images have a safety voucher attached to them.

If Apple secretly expanded this to scan more than just iCloud Photos, they would have to either a) upload all the extra photos, b) add a new mechanism to upload just the vouchers, or c) upload “fake” photos to iCloud Photos with the extra vouchers attached.

None of these seem particularly easy to disguise.

Your concern is completely understandable if you are starting from the premise that Apple are scanning photos then uploading matches. I think that’s how a lot of people are assuming this works, but that’s not correct. Apple designed the system in a very different way that is integrated into the iCloud upload process, and that design makes it difficult to expand the scope beyond iCloud Photos surreptitiously.

Could Apple build a system to secretly exfiltrate information from your phone? Of course. They could have done so since the first iPhone was released in 2007. But this design that they are actually using is an awful design if that’s what they wanted to do. All of their efforts on this seem to be pointed in the exact opposite direction.
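To make the structural point concrete, here is a sketch of the coupling; all names and types are invented for illustration, and the real pipeline is obviously far more involved. The voucher exists only as a field of an iCloud Photos upload record, so data that never passes through that upload path has no vehicle to carry a voucher off the device:

    from dataclasses import dataclass

    def encrypt(data: bytes) -> bytes:
        return data  # stand-in for real encryption

    def generate_safety_voucher(photo: bytes) -> bytes:
        return b"voucher"  # stand-in for the on-device matching step

    @dataclass
    class PhotoUploadRecord:
        ciphertext: bytes
        safety_voucher: bytes  # rides along on the upload; no separate channel

    def upload_to_icloud_photos(photo: bytes) -> PhotoUploadRecord:
        # The only place a voucher is ever created is inside this upload path.
        return PhotoUploadRecord(encrypt(photo), generate_safety_voucher(photo))

    # Local-only photos and other app data never call this function, so
    # widening the scan would force one of the visible options (a)-(c) above.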


Yeah, all they need to do is follow suit with what everybody else is doing and scan images once they are uploaded to iCloud.

Scan every single file. I don't care. Because once in iCloud, files are sitting on Apple's servers and hard drives. I don't have much expectation that those files are 100% private.

They're completely missing the point.


My understanding is that iCloud reports orders of magnitude less CSAM than other cloud services at a similar scale. My guess is that Apple wanted a way to report the CSAM they are currently storing without having to decrypt/inspect every person's personal data (which necessarily opens vectors for, e.g., rogue employees doing bad things with people's personal data).

Hence why this approach was presented as a privacy win by Apple. They catch the CSAM, and they don't have to look at your photos stored online.


This is a lie. Did you try reading the actual announcement by Apple? It's right here: https://www.apple.com/child-safety/.

- The scanning will be performed only if photos are to be uploaded to iCloud.

- The database will be encrypted multiple times, in a way that it can't be read in the clear.

- There's no notification to Apple in any way in the case of a match.

- Instead, each match result is again encrypted in a way that is inaccessible to Apple and uploaded together with the photo.

- If there are a lot of positive matches, Apple eventually becomes able to decrypt them. That's when they will do a manual check, lock the account if the matches are confirmed, and notify the authorities.
