> I’m not sure I understand Apple’s logic here. Are iCloud Photos in their data centers not scanned? Isn’t everything for iCloud users sent there automatically by default to begin with? Doesn’t the same slippery-slope logic also apply to cloud scans?
I don’t see the problem with this status quo. There is a clear demarcation between my device and their server. Each serving the interests of their owner. If I have a problem with their policy, I can choose not to entrust my data to them. And luckily, the data storage space has heaps of competitive options.
> Honestly I would have been fine with Apple scanning all my photos after they are uploaded to iCloud
Apple has already been doing exactly this for years. Now they are going to check the photos right before they are uploaded to iCloud, which is actually more privacy-friendly than what they do now. It also lets Apple turn on E2E encryption for iCloud Photos if they want.
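For intuition, here's a minimal sketch of what a pre-upload check could look like, assuming a plain hash-set lookup. Apple's actual system uses NeuralHash (a perceptual hash) plus private set intersection, so every name here is illustrative, not their API:

```swift
import CryptoKit
import Foundation

// Hypothetical stand-in for the pre-upload check: hash the image and
// consult a local database of known-bad hashes before the bytes leave
// the device. Apple's real pipeline uses NeuralHash and blinded matching,
// not a plain SHA-256 set lookup.
func shouldGenerateMatchVoucher(imageData: Data, badHashes: Set<Data>) -> Bool {
    let digest = Data(SHA256.hash(data: imageData))
    return badHashes.contains(digest)
}
```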
I understand the 'what if' and slippery slope arguments, but wow, there is so much misunderstanding in this thread. Apple makes the OS and doesn't need a fancy system to scan all the files on the device if that's what they want to do.
I highly suggest reading this: https://www.apple.com/child-safety/ and the associated PDFs. Apple PR hosed this up by putting 3 distinct features on one page that people seem to be conflating.
>It's kind of a funny situation wherein Apple is being more transparent than any other implementer of this same process.
It isn't the same process; it's an on-device scanning process, and Apple is the first to implement it. Had Apple said they were scanning iCloud directly, nobody would have batted an eye (I, for one, assumed they already did).
It is as if you were asked, or required, to install some antivirus on your computer, except this is not a typical antivirus: it is not scanning to protect you but to find evidence against you and destroy your life.
A list of facts; let me know if you disagree with the reality or my conclusion.
Apple could have scanned iCloud already, so why have they not done it so far? Either there is a super low amount of CP on iCloud, which implies this feature is not needed, OR Apple was lazy, incompetent, or had some other reason, and let a lot of bad guys escape. I would conclude from this that Apple, if they really care about children, should freaking start scanning the existing iCloud images now.
From the above I am inclined to believe that this is not an action to protect the children; the reality does not fit. If there is so much CP on iCloud and Apple only just woke up, then WTF is all that PR about protecting children?
> Exactly, I think people are just getting a little too worked up over this whole thing. Apple computes a hash of each image you upload to iCloud, then checks it against a list of CP hashes.
If that is what is supposed to happen, then it makes no sense for any new code to run on the device!
> Of all the things in the world to get worked up over, this is ridiculous.
Well, it is not crazy to get worked up over Apple saying they will check uploads to iCloud by checking what's on your phone, instead of simply adding code to iCloud. That seems obvious, not ridiculous.
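For contrast, the "just add code to iCloud" alternative would be a purely server-side check after upload. A hedged sketch, with all names invented and SHA-256 standing in for whatever matching a real system would use:

```swift
import CryptoKit
import Foundation

// Hypothetical server-side variant: scan the upload on Apple's servers,
// with no scanning code shipped to the phone at all. This sketches the
// alternative being argued for, not Apple's implementation.
func scanUploadedPhoto(_ uploaded: Data, badHashes: Set<Data>) -> Bool {
    let digest = Data(SHA256.hash(data: uploaded))
    return badHashes.contains(digest) // true = queue for human review
}
```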
> Since this only happens for photos on their way to iCloud, in your analogy it would be like searching the bags when you put them in the cab to go to the airport.
I think this would be more like if the TSA installed scanners in everybody's houses that scan everything as you put it in your suitcase. Yeah, it only scans things on the way to the airport, but the scanners are in my house. I don't want them there. My house is my domain.
It's the same way with my phone. I don't want the government/Apple evil detector running on my phone. I know that a false positive is unlikely, and I know that Apple and the government have made very clear they will only look for Real Evil. I still don't want the scanners on my phone.
>This potentially means all of iCloud, not just photos, could start to use E2E encryption as well - which is fantastic.
They would have to scan things that are not photos, too. What if the bad guys just zip their photos and upload that?
There also needs to be a solution for CSAM uploaded before NCMEC had a chance to tag it, especially in cases where the bad guys uploaded their CSAM and then deleted it from their iPhone. What happens then, the bad guys get E2E and nobody can find them? There has to be a technical solution in mind for this, and everything I can think of has implications (let the iPhone store hashes of deleted images? Would it be enough to also scan during download?).
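To make that first idea concrete, here is a purely hypothetical sketch of retaining hashes of deleted photos for later re-checking when the database is updated. This is my speculation, not anything in Apple's published design, and every name is made up:

```swift
import CryptoKit
import Foundation

// Purely speculative: remember only the hashes of deleted photos so they
// can be re-matched when NCMEC tags new material later. The privacy
// implication is exactly what the comment above worries about.
var deletedPhotoHashes: [Data: Date] = [:] // hash -> deletion time

func recordDeletion(of imageData: Data) {
    deletedPhotoHashes[Data(SHA256.hash(data: imageData))] = Date()
}

func rescan(against updatedBadHashes: Set<Data>) -> [Data] {
    // Re-run matching against the updated database.
    deletedPhotoHashes.keys.filter { updatedBadHashes.contains($0) }
}
```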
IMHO, any serious attempt to find CSAM using Apple's client-side approach requires more scanning, and Apple not being forthright about that makes me trust them less. Also, the moment they expand the scanning, we should think carefully about whether there are actually any privacy benefits left.
>The system is designed as if iCloud photos is already E2EE. It's not currently,
and it will never be E2E because of US law, or if it is ever encrypted it will use backdoored NSA crypto (and Apple PR has not even tried to hint at E2E to calm the waters).
I agree the algorithm is pretty clever, but it feels like it was designed not to solve the CSAM problem but to look good on someone's CV.
Now you have the worst of both worlds: Apple has access to your photos on the server (and if they respected the law they should already be scanning those for CSAM, since they are responsible for what they store and share, I mean when you share stuff), and Apple has a scanning program inside your phone.
> Apple should just scan the pictures that are in iCloud (their servers). They just assumed that if you have the iCloud option enabled on your device, it gave them the right to do the scan on your phone/computer.
The end result is the same. The difference is that now Apple has very limited access to your images. With closed systems, trust is all you have; when you step into the Apple ecosystem, you are giving a lot of trust.
> I want to also point out that A/V companies never said they were going to scan for child abuse images on your computer and report you if they found any.
Why would they say so, if it is perfectly legal to do anyway? They literally scan every file, so there is no need to mention anything specific, which could only lead to negative PR.
> "The fact that you were intending to send that data to iCloud is almost irellevant to the entire discussion since apple has clearly built a system for scanning your files on your device"
This is a very dishonest take. They already have a system which runs on your device and processes all photos in the photo library, tagging categories like faces and pets and food and so on. They could have folded this new scanning into that one with much less engineering and design effort, ending up with a system which scans everything you have. They didn't; they went to a lot of extra effort not to. They already have system services, e.g. location awareness or device monitoring and diagnostics; they could have built this new system as something similar, running 24/7 with access to everything and a simpler design. They didn't.
What they did was put it into the iCloud upload feature, to be matched by additional code on the iCloud server side. That took them more time, more effort, and more complexity, and gave the whole thing less power and less flexibility.
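A rough sketch of that split, with invented names (the real design uses blinded NeuralHashes and threshold secret sharing, so this only shows the shape of the design, not the cryptography):

```swift
import CryptoKit
import Foundation

// The client side only runs inside the iCloud upload path and emits a
// voucher; the matching logic lives server side. Illustrative only.
struct SafetyVoucher {
    let photoID: UUID
    let payload: Data // in the real design, a cryptographically blinded hash
}

// Client: voucher generation is part of the upload, not a 24/7 file scanner.
func makeVoucher(photo: Data, photoID: UUID) -> SafetyVoucher {
    SafetyVoucher(photoID: photoID, payload: Data(SHA256.hash(data: photo)))
}

// Server: additional code that interprets the vouchers it receives.
func serverMatches(_ voucher: SafetyVoucher, badHashes: Set<Data>) -> Bool {
    badHashes.contains(voucher.payload)
}
```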
If this system is as dreadful as all the complaints say, there should be plenty of ways to show that without having to make stuff up. If describing it accurately isn't scary enough, maybe it's not as scary as y'all want people to think.
If you’re going to classify every critical fact as a “nit” then nothing is ever wrong.
IDS vs iCloud is far more than a nit. They're completely different services. iCloud is run by a 3rd party in China (this is well publicized), whereas IDS is not. So that's not exactly a minor detail.
iMessage does not depend on iCloud. You don't need an iCloud account. These are unrelated.
Contact key verification is a more recent addition, and again not dependent on iCloud.
> How, specifically, does iCloud backup damage the affordance of E2E?
Just saying, you can sync your data to whatever encrypted or unencrypted service you want, if you so choose. This may diminish the value of E2EE to the end user, but it is unrelated.
I'm not the one that brought up iCloud first. Take that up with the original commenter.
>since it rationalizes a push for remote attestation
No, it doesn't. By this logic, Apple should have gone through with its plans to implement CSAM scanning on iCloud. Except customers complained, and they abandoned it. Then they debuted E2E encrypted iCloud storage two years later, which is completely antithetical to the school of thought used by CSAM scanning advocates.
>You were already sending your unencrypted photos to Apple
No, I wasn't. The whole point here is that Apple is not scanning server side; they've built the functionality to scan device side. You need never use iCloud in any way whatsoever in order for Apple's new scanning tech to be used against you. That is a major difference.
>Apple and all other service providers are required by US and most other countries laws to do searches on their servers.
No, they are not, even if that were the new thing Apple is doing, which it isn't. If a company builds server-side scanning, then it may be required to fulfill certain reporting requirements. But companies are not required to actually scan in the first place, even if many choose to do so. Apple already scanned uploaded photos and chose not to have E2EE for iCloud data in order to please law enforcement agencies, but that was a voluntary choice by Apple. This new client side scanning is a different beast. Please try to gain even the slightest fucking clue what you are talking about before spouting off on something so important.
Edit: to add, for those interested in more details on the law: the federal reporting requirements are under 18 U.S. Code § 2258A [0]. What you'll see there is a "Duty To Report", and the reason for that framing is to evade Constitutional protections. If the government compelled companies to scan, then, setting aside any legal challenges (by very well funded actors) and public blowback, as a practical matter those companies would become State Actors for the purposes of 4th Amendment evaluation. However, so long as it's "voluntary", even if heavily incentivized, courts have repeatedly ruled that 3rd parties can do searches that would be illegal for the government, and turn discovered evidence over to the government, who may then use it freely. Walter v. United States (1980, [1]) is a good example, covering the [righteous and just prosecution] of someone transporting "films depicting homosexual activities" after it was mailed to the wrong address and turned over to the FBI, which I'm sure is exactly what everyone here on HN would applaud and definitely what they have in mind when they think of client side scanning in the US. Tim Cook is carrying on that tradition with Pride, no doubt.
> I don't buy into theories that Apple is being pressured or coerced on any of this.
Of course they were.
It’s well known that Apple chose not to introduce E2E encrypted iCloud backups back in 2017/18 or so, due to FBI complaints.[1]
This is clearly Apple’s play to be able to introduce that again and tell law enforcement: “Look, we ‘thought of the children’; if you want further access to our customers’ data, you’re going to need to come up with a better justification than that.”
If Apple pulls that off, adopting client side image scanning with this quite impressive privacy preserving system behind it, and then E2E encrypting everything they upload to iCloud, that’d arguably be a very big win for Apple customers’ privacy.
Whether that’s an acceptable trade off for having a device I purchased run code I didn’t ask for and don’t want, to monitor whether or not I’m a paedophile, possibly snitching on me for false positives or bad-faith additions of non-CSAM hashes into the database, is still a good question.
>Why would apple have released a whole white paper on the feature if the plan was
Isn't it obvious? Some guy using an exploit would install a debugger on his iPhone and find this scanning process, and Apple PR would take a hit. But Apple could do this and gain PR by presenting it as a safety feature. The guy with the debugger will see that files are scanned and hashes are sent to the server, but he can't prove that the config file won't be updated in the future, for everyone or for some, so that the hashes are pulled from a different database, the threshold changes, and the is-iCloud-enabled flag is ignored.
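To spell out that worry: if the scan is gated by a server-supplied config, the same shipped binary behaves very differently under a new config. A hypothetical sketch, with every field name invented:

```swift
import Foundation

// Hypothetical server-pushed config. Nothing in the on-device code stops
// a future config from widening the scope; that's the concern above.
struct ScanConfig: Decodable {
    let onlyScanICloudUploads: Bool
    let matchThreshold: Int
    let hashDatabaseID: String
}

func shouldScan(photoIsBoundForICloud: Bool, config: ScanConfig) -> Bool {
    // Today the iCloud flag gates everything; flip onlyScanICloudUploads
    // to false in a pushed config and the gate disappears.
    return photoIsBoundForICloud || !config.onlyScanICloudUploads
}
```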
What makes no sense, and what Apple failed to explain and fanboys also failed to convince me of with any evidence: why the fuck not scan the images on iCloud only, since we are pretending the images are scanned only if iCloud is enabled? (There is no credible evidence that this is part of any plan to offer some kind of super encryption that Apple has no keys for.)
>Sure, but I don't particularly care because I don't upload sensitive photos to iCloud
So you are saying you have nothing to fear because you aren't hiding anything?
Is this a tacit admission that you don't want them scanning your phone for CSAM because they will find something?
I'm obviously not seriously accusing you of anything, just pointing out how your line of argument applies equally to privacy whether on the cloud or on your device.
The large number of people here who want iCloud to be E2E have a problem with it.
> (don’t they do it already?).
Apparently not for files and photos. They may well do it for email.