
>It's kind of a funny situation wherein Apple is being more transparent than any other implementer of this same process.

It isn't the same process; it's an on-device scanning process, and Apple is the first to implement this. Had Apple said they were scanning iCloud directly, nobody would have batted an eye (I, for one, assumed they already did).




> Honestly I would have been fine with Apple scanning all my photos after they are uploaded to iCloud

Apple has already been doing exactly this for years. Now they are going to check the photos right before they are uploaded to iCloud, which is actually more privacy friendly than what they do now. It also lets Apple turn on E2E encryption for iCloud Photos if they want.

I understand the 'what if' and slippery slope arguments, but wow, there is so much misunderstanding in this thread. Apple makes the OS and doesn't need a fancy system to scan all the files on the device if that's what they want to do.

I highly suggest reading this: https://www.apple.com/child-safety/ and the associated PDFs. Apple PR hosed this up by putting 3 distinct features on one page that people seem to be conflating.


> "The fact that you were intending to send that data to iCloud is almost irellevant to the entire discussion since apple has clearly built a system for scanning your files on your device"

This is a very dishonest take. They already have a system which runs on your device and processes all photos in the photo library, tagging categories like faces, pets, food, and so on. They could have added this new detection to that existing system, which already scans all your photos, with much less engineering and design effort. They didn't; they went to a lot of extra effort not to. They already have system services, e.g. location awareness or device monitoring and diagnostics; they could have built this new system as a similar service, one which would run 24/7, access everything, and be a simpler design. They didn't.

What they did was put it into the iCloud upload feature to be matched by additional code on the iCloud server side. Something which took them more time, more effort, more complexity, and gave the whole thing less power and less flexibility.

If this system is as dreadful as all the complaints say, there should be plenty of ways to show that without having to make stuff up. If describing it accurately isn't scary enough, maybe it's not as scary as y'all want people to think.


> I also don't understand why Apple is pushing the CSAM scanning for iCloud into the device

Because it's a pretense for further surveillance.


>The use of the detection algorithm is for iCloud only today.

Yes, which is what I was saying in my comment. If or when Apple changes this, I would agree it's a battle worth fighting, but that is not what is happening here, and that is not what I was correcting in the article itself.

>The main issue is that the wall has been breached:

The wall was breached when we opted to run proprietary operating systems. You have zero clue what is going on in that OS and whether it's reporting anything; you have to trust the vendor on some level, and Apple is being fairly transparent here. I would be far more worried if they did this without saying anything at all.


> I don’t think anybody has a problem with iCloud scanning

The large number of people here who want iCloud to be E2E have a problem with it.

> (don’t they do it already?).

Apparently not for files and photos. They may well do it for email.


>All of which has been common for a long time

No, this has never been done before.

>You were already sending your unencrypted photos to Apple

No, I wasn't. The whole point here is that Apple is not scanning server side, they've built the functionality to scan device side. You need never use iCloud in any way whatsoever in order for Apple's new scanning tech to be used against you. That is a major difference.

>Apple and all other service providers are required by US law and most other countries' laws to do searches on their servers.

No, they are not, even if that were the new thing Apple is doing, which it isn't. If a company builds server-side scanning, then it may be required to fulfill certain requirements, but companies are not required to scan in the first place, even if many choose to do so. Apple already did scan uploaded photos and voluntarily chose not to have E2EE for iCloud data in order to please law enforcement agencies, but that's a voluntary choice by Apple. This new client side scanning is a different beast. Please try to gain even the slightest fucking clue what you are talking about before spouting off on something so important.

Edit: to add, for those interested in more details on the law, the federal reporting requirements are under 18 U.S. Code § 2258A [0]. What you'll see there is a "Duty To Report", and the reason for that is to evade Constitutional protections. If the government compelled companies to scan, then quite apart from legal challenges (by very well funded actors) and public blowback, as a practical matter those companies would become State Actors for the purposes of 4th Amendment evaluation. However, so long as the scanning is "voluntary", even if heavily incentivized, courts have repeatedly ruled that 3rd parties can do searches that would be illegal for the government and turn discovered evidence over to the government, which may then use it freely. Walter v. United States (1980, [1]) is a good example, covering the [righteous and just prosecution] of someone transporting "films depicting homosexual activities" after they were mailed to the wrong address and turned over to the FBI, which I'm sure everyone here on HN would applaud and is definitely what they have in mind when they think of client side scanning in the US. Tim Cook is carrying on that tradition with Pride, no doubt.

----

0: https://www.law.cornell.edu/uscode/text/18/2258A

1: https://www.oyez.org/cases/1979/79-67


> To be fair - wasn't that scanning on-device, and only uploading metadata on things that you yourself were already uploading to their cloud?

On-device scanning like that would be pointless, though. IIRC, stuff uploaded to their cloud is already accessible to Apple for server-side scanning. The controversial thing was the on-device scans would trigger some kind of upload of un-uploaded stuff to Apple for further investigation.


> I’m not sure I understand apples logic here. Are iCloud Photos in their data centers not scanned? Isn’t everything by default for iCloud users sent there automatically to begin with? Doesn’t the same logic around slippery slope also apply to cloud scans?

I don’t see the problem with this status quo. There is a clear demarcation between my device and their server. Each serving the interests of their owner. If I have a problem with their policy, I can choose not to entrust my data to them. And luckily, the data storage space has heaps of competitive options.


> The controversial thing was the on-device scans would trigger some kind of upload of un-uploaded stuff to Apple for further investigation.

No, a load of people assumed it would do that, but it’s not possible with the proposed scheme because the device had no knowledge of any matches.
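
For the curious, here is a toy Python sketch of the blinded-matching idea. This is not Apple's actual PSI protocol (which uses elliptic curves and cuckoo tables); all names and values are invented for illustration. The point it demonstrates: every table entry looks like a random number to the device, and the device runs the exact same code whether or not a photo matches, so it learns nothing.

    import hashlib, secrets

    P = (1 << 127) - 1   # 2^127 - 1, a Mersenne prime (toy modulus)
    G = 5                # toy generator
    N = 64               # toy table size

    def h2g(data: bytes) -> int:
        """Hash bytes to a group element."""
        e = int.from_bytes(hashlib.sha256(data).digest(), "big")
        return pow(G, e % (P - 1), P)

    def slot(data: bytes) -> int:
        return int.from_bytes(hashlib.sha256(b"slot" + data).digest(), "big") % N

    # Server setup: known-bad hashes sit at their slot, blinded by a secret
    # exponent s; every other slot holds a random element. To the device,
    # ALL entries are indistinguishable random numbers.
    s = secrets.randbelow(P - 2) + 1
    table = [pow(G, secrets.randbelow(P - 2) + 1, P) for _ in range(N)]
    for h in [b"known-image-A", b"known-image-B"]:   # stand-in hashes
        table[slot(h)] = pow(h2g(h), s, P)

    def client_voucher(photo_hash: bytes):
        """On-device: the exact same code path for matches and non-matches."""
        r = secrets.randbelow(P - 2) + 1
        entry = table[slot(photo_hash)]
        key = hashlib.sha256(pow(entry, r, P).to_bytes(16, "big")).digest()
        tag = pow(h2g(photo_hash), r, P)             # sent to the server
        return tag, key                              # key seals the voucher

    def server_key(tag: int) -> bytes:
        """Server side: (H(x)^r)^s == (H(x)^s)^r only on a real match, so
        only vouchers for matching photos can ever be opened."""
        return hashlib.sha256(pow(tag, s, P).to_bytes(16, "big")).digest()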


> Since this only happens for photos on their way to iCloud, in your analogy it would be like searching the bags when you put them in the cab to go to the airport.

I think this would be more like if the TSA installed scanners in everybody's houses that scan everything as you put it in your suitcase. Yeah, it only scans things on the way to the airport, but the scanners are in my house. I don't want them there. My house is my domain.

It's the same way with my phone. I don't want the government/Apple evil detector running on my phone. I know that a false positive is unlikely, and I know that Apple and the government have made very clear they will only look for Real Evil. I still don't want the scanners on my phone.


> Apple should just scan the pictures that are in iCloud (their servers). They just assumed that if you have the iCloud option enabled on your device that it gave them the right to do the scan on your phone/computer.

The end result is the same. The difference is that now Apple has very limited access to your images. With closed systems, trust is all you have; when you step into the Apple ecosystem, you are giving a lot of it.

> I want to also point out that A/V companies never said they were going to scan for child abuse images on your computer and report you if they found any.

Why would they say so, if it is perfectly legal to do anyway? They literally scan every file, so there is no need to mention anything specific that could only lead to negative PR.


>Why would apple have released a whole white paper on the feature if the plan was

Isn't it obvious? Some guy using an exploit would install a debugger on his iPhone and find this scanning process, and Apple PR would take a hit. But Apple could do this and gain PR by presenting it as a safety feature. The guy with the debugger will see that files are scanned and hashes are sent to the server, but he can't prove that the config file won't be updated in the future, for everyone or for some users, so that the hashes are pulled from a different database, the threshold is changed, and the is-iCloud-enabled flag is ignored.
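
To make that worry concrete, here is a deliberately trivial, hypothetical sketch (none of these keys or names are Apple's; it just shows the shape of the concern): if scope lives in downloaded data, "iCloud only" is a data change, not a code change.

    # Hypothetical illustration only; not Apple's code or config schema.
    config = {
        "scan_only_if_icloud": True,                  # flip this server-side, or...
        "hash_db_url": "https://example.com/db-v1",   # ...point this elsewhere
        "match_threshold": 30,
    }

    def should_scan(icloud_enabled: bool) -> bool:
        # The shipped scanning code never changes; only downloaded data does.
        return icloud_enabled or not config["scan_only_if_icloud"]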

What makes no sense, and what Apple failed to explain and fanboys also failed to convince me of with any evidence, is why the fuck not scan the images on iCloud only, since we are pretending the images are scanned only if iCloud is enabled. (There is no credible evidence that this is part of any plan to offer some kind of super encryption that Apple holds no keys for.)


>Right, but that hardly mattered as long as it applied only to iCloud-uploaded files

There were some practical differences. E.g. some programs have a permissive default of always marking things as 'save to iCloud', and avoiding this can be nonintuitive. There were also certain difficulties with deleted images, which I am not sure how Apple wanted to resolve, but which could lead to unfortunate differences from the server-side scenario.

More importantly, the moment the client-side capability was there, legal pressure to use it in all cases was bound to come. Normalizing client-side scanning was also bound to legitimize and encourage doing the same on Android, and I can easily think of certain brands which are way less scrupulous than Apple.

All in all, I didn't see the benefit given that server-side scanning was accepted as legitimate and sufficiently effective by just about everyone, but without the risks of client-side scanning.


> What they got very very wrong was the public reaction to it.

What they got very wrong is that this fundamentally changes the nature of iPhones -- they are no longer user agents; in fact, they're actively scanning user data on behalf of another party. That change doesn't just make people feel uncomfortable, it opens the door for untold myriads of abuses.

It's one thing if iCloud servers scan user data - those servers never were assumed to be user agents. It's entirely different when user-owned devices do this.


> If it's only for iCloud uploaded data they can simply do the scanning there.

This is what Apple was trying to avoid. Scanning on iCloud also requires that Apple can see your photos.

If the scanning is done on device, Apple could encrypt the photos in the cloud so that they can't decrypt them at all. Neither could the authorities.
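
A minimal sketch of that idea, assuming a hypothetical upload_to_cloud() endpoint (a real E2E design would also need key backup and recovery): encrypt on device with a key the server never sees, and server-side scanning becomes impossible.

    # Minimal sketch of client-side (E2E) encryption before upload;
    # upload_to_cloud() is a stand-in, not a real API.
    from cryptography.fernet import Fernet   # pip install cryptography

    def upload_to_cloud(blob: bytes) -> None:
        print(f"uploading {len(blob)} opaque bytes")

    device_key = Fernet.generate_key()       # never leaves the device
    photo_bytes = b"...jpeg data..."
    ciphertext = Fernet(device_key).encrypt(photo_bytes)
    upload_to_cloud(ciphertext)              # server can store it, not read it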

> There's no reason to use customer's CPU/battery against them.

The amount of processing all phones already do on photos is crazy; adding a simple checksum calculation to the workflow does fuck-all to your battery life. Item detection alone is orders of magnitude more CPU/battery heavy.
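
A rough sanity check of that claim (with the caveat that NeuralHash is a small neural network rather than a checksum, so it costs more than this, but still far less than on-device object detection):

    import hashlib, os, time

    blob = os.urandom(3 * 1024 * 1024)       # ~3 MB stand-in "photo"
    t0 = time.perf_counter()
    hashlib.sha256(blob).hexdigest()
    print(f"hashed 3 MB in {(time.perf_counter() - t0) * 1000:.1f} ms")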


> Once again, the notion that everything on the device is scanned under Apple's system was never true.

That isn't what I said.

Also, that's not why most people are so upset. Most people are upset mainly because Apple has now proven that the capability exists, so they can now be more easily compelled by governments to scan for "extra things".

Prior to this, if a government asked Apple to scan someone's phone, Apple could respond with "we don't have that capability", and it would presumably be a tough legal battle to force a company to add a capability that doesn't exist.

This hurdle is now much lower. The effort has gone from "force Apple to design a new system for scanning phones" to "add these couple of hashes to the pre-existing database".
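
In other words, once the matching machinery ships, scope is just data. A deliberately crude, hypothetical sketch of that asymmetry:

    # The matching code is generic; only the data defines what gets flagged.
    blocklist = {"hash-of-known-csam-1", "hash-of-known-csam-2"}

    def flag(photo_hash: str) -> bool:
        return photo_hash in blocklist

    # "Adding a couple of hashes" is a one-line data change, not a new system:
    blocklist.add("hash-of-politically-sensitive-image")   # hypothetical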

Also, expanding this from just iCloud upload candidates to the entire device is a very small leap now. I mean, the bad guys could just turn off iCloud, and we must think of the children...

Then you have Apple's "reassurance" that they won't comply with government requests to scan for additional things, which is completely moot considering Apple relies on a third party database and has absolutely no control over, or idea of, what the hashes really are.


> Why they pulled the scanning

They pulled documents about the scanning, but we don't actually know they pulled the scanning itself.

> when Apple switches to not having access to the keys to decrypt iCloud photos and documents.

We don't know that they are/were going to do that either. It's just a popular 'fan theory.'


> The whole point here is that Apple is not scanning server side

False. Apple complies with the laws pertaining to customer data and provides data as legally required. See section III ("Information Available from Apple"), subsection J ("iCloud"), of Apple's law enforcement guidelines:

https://www.apple.com/legal/privacy/law-enforcement-guidelin...

> You need never use iCloud in any way whatsoever in order for Apple's new scanning tech to be used against you.

False. Phones don't download the CSAM hashes, so they can't do device-side scanning; they have nothing to compare the images to. Yes, the phone uploads a hash, but it also uploads the unencrypted images alongside it.

Thus the only thing that changes is Apple isn’t paying for the compute power to do the hashing. That and a tiny amount of extra bandwidth on uploading images.

PS: In response to your edit, perceptual hashes are a grey area. However, as long as a judge agrees, they can very much just take down production systems when it pertains to a case. That's a rather big stick to force compliance: even if it's not an explicit law, it's very much a consequence of one. Thus companies really don't push back, as some that did simply got raided.


> It's not scanning on the client side, it's hashing.

They are (their code is) looking at every item, reviewing it opaquely to us, and deciding whether they should have an employee look at tiny thumbnails to tell the cops the owner is a pedophile. It's (a) not real hashing, it's "perceptual", which is fancy math that says they're looking at thumbnails, and (b) cycling through every item in a list and running an algorithm against it is what I'd call scanning. Let's not get bogged down on the terms.
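
To make the "perceptual, not real hashing" point concrete, here is a toy average hash (aHash), a far simpler cousin of NeuralHash. With SHA-256, one changed pixel flips the whole digest; here, similar images produce similar bit patterns, which is exactly the "looking at thumbnails" property:

    from PIL import Image   # pip install Pillow

    def average_hash(path: str, size: int = 8) -> int:
        """Shrink to a tiny grayscale thumbnail and set one bit per pixel
        that is brighter than the mean. Similar images -> similar bits."""
        img = Image.open(path).convert("L").resize((size, size))
        pixels = list(img.getdata())
        mean = sum(pixels) / len(pixels)
        bits = 0
        for px in pixels:
            bits = (bits << 1) | (px > mean)
        return bits                          # 64-bit perceptual fingerprint

    def hamming(a: int, b: int) -> int:
        return bin(a ^ b).count("1")         # small distance => a "match"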

We all know the tech details they shared; people on this site all understand the difference. We know the algorithm is easy enough to produce collisions for. We know that Apple said multiple 3-letter agencies can determine what goes in the bloom filter, and that it would be downloaded from the internet and not audited. We know they are going to open it up to other countries' governments too. We know that it's on if you want to use opt-out services.
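
Since the bloom filter came up: a minimal toy sketch of one, showing why such a structure can't be audited from the outside. You can't enumerate its contents, and membership answers are only "maybe" or "definitely not", which is where false positives come from:

    import hashlib

    class Bloom:
        """Toy Bloom filter: k bit positions per item in an m-bit field."""
        def __init__(self, m: int = 1 << 16, k: int = 4):
            self.m, self.k, self.bits = m, k, 0

        def _positions(self, item: bytes):
            for i in range(self.k):
                d = hashlib.sha256(bytes([i]) + item).digest()
                yield int.from_bytes(d[:8], "big") % self.m

        def add(self, item: bytes) -> None:
            for p in self._positions(item):
                self.bits |= 1 << p

        def maybe_contains(self, item: bytes) -> bool:
            # True may be a false positive; False is always definitive.
            return all(self.bits >> p & 1 for p in self._positions(item))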

> The alternative, as many have pointed out, is them uploading your whole image to be scanned server side.

> Smart money is that this is a move in front of Apple encrypting your whole icloud backup

The whole image is being uploaded server side anyways.

My smart money says that this won't lead to encrypted backups anytime soon. Too many other things in those backups.

> social liability of being a free encrypted cloud storage for CSAM

I don't really think there is much, tbh. That said, it is clear people would rather have a non-absolutely-private cloud than a non-absolutely-private personal device. I know I would!

People criticize Apple for "not letting you own the device", and this is a much bigger step down that road. I would rather have a less-absolutely-private cloud service than an "absolutely private everything" where my local stuff is scanned.

> There's lots of reasons to criticise Apple, this isn't one.

There's lots of reasons to criticize Apple. Including this one.

