>It's kind of a funny situation wherein Apple is being more transparent than any other implementer of this same process.
It isn't the same process: it's an on-device scanning process, and Apple is the first to implement it. Had Apple said they were scanning iCloud directly, nobody would have batted an eye (I, for one, assumed they already did).
It is like being asked, or required, to install antivirus software on your computer, except this is not a typical antivirus: it is not scanning to protect you but to find evidence against you and destroy your life.
Here is a list of facts; let me know if you disagree with the reality or with my conclusion.
Apple could have scanned iCloud already, so why haven't they done it so far? Either there is a very low amount of CP on iCloud, which implies this feature is not needed, OR Apple was lazy, incompetent, or had some other reason, and let a lot of bad guys escape. I would conclude from this that if Apple really cares about children, they should freaking start scanning the existing iCloud images now.
From the above I am inclined to believe that this is not an action to protect children; the reality does not fit. If there is so much CP on iCloud and Apple only just woke up to it, then WTF is all that PR about protecting children?
> If it's only for iCloud uploaded data they can simply do the scanning there.
This is what Apple was trying to avoid. Scanning on iCloud also requires that Apple can see your photos.
If the scanning is done on device, Apple could encrypt the photos in the cloud so that they can't decrypt them at all. Neither could the authorities.
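A minimal sketch of that "match on device, then encrypt before upload" ordering, in Swift. This is not Apple's actual protocol (which uses a perceptual NeuralHash, private set intersection, and threshold secret sharing); SHA-256 stands in for the perceptual hash, and knownHashes is a hypothetical local copy of the hash database:

    import Foundation
    import CryptoKit

    // Hypothetical on-device hash set; the real design ships a blinded database with the OS.
    let knownHashes: Set<Data> = []

    // Match locally, then encrypt with a key the server never sees.
    func prepareForUpload(photo: Data, deviceKey: SymmetricKey) throws -> (ciphertext: Data, matched: Bool) {
        let digest = Data(SHA256.hash(data: photo))   // stand-in for a perceptual hash
        let matched = knownHashes.contains(digest)
        let sealed = try AES.GCM.seal(photo, using: deviceKey)
        return (sealed.combined ?? Data(), matched)   // only ciphertext leaves the device
    }

The point of the ordering is that the match check runs before encryption, so the server only ever stores ciphertext it cannot open.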
> There's no reason to use customer's CPU/battery against them.
The amount of processing ALL phones already do for photos is crazy; adding a simple checksum calculation to the workflow does fuck-all to your battery life. Object detection alone is orders of magnitude more CPU/battery heavy.
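A rough back-of-the-envelope check of that claim (not a rigorous benchmark, and SHA-256 is only a stand-in for the perceptual hash actually used): hashing a photo-sized blob takes on the order of milliseconds.

    import Foundation
    import CryptoKit

    // Hash a ~4 MB dummy "photo" and time it; on a modern phone this is milliseconds,
    // while the ML passes Photos already runs for face/object detection are far heavier.
    let fakePhoto = Data(repeating: 0xAB, count: 4 * 1024 * 1024)
    let start = Date()
    _ = SHA256.hash(data: fakePhoto)
    let elapsedMs = Date().timeIntervalSince(start) * 1000
    print(String(format: "Hashed 4 MB in %.1f ms", elapsedMs))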
>This makes it seem like there is no upside at all to the CSAM scanning. There is: we stop more folks peddling child porn, hopefully making a dent in the problem.
Apple could have scanned iCloud from the start and prevented this problem if it is so big. If they are already scanning, then there is no use in scanning on-device only when iCloud is on; and if they are not yet scanning iCloud, then you should ask Tim why he is ignoring the problem. Google and Facebook have already reported a lot of abuse; does Tim love CP?
> They are trying to prevent CSAM images from being stored and distributed using Apple products.
If what Apple is aiming for is a more complete version of E2EE on their servers, maybe that's just an unintended consequence of the implementation, and the very reason why they're surprised that this received so much pushback. If Apple wanted to offer encryption for all user files in iCloud and leave no capability to decrypt the files themselves, they'd still need to be able to detect CSAM to protect themselves from liability. In that case, scanning on the device would be the only way to make it work.
If that were the case, I still wouldn't believe that moving the scan to the device fundamentally changes anything. Apple has to conduct a scan regardless, or they'll become a viable option for criminals to store CSAM. But in Apple's view, their implementation would mean they'd likely be the first cloud company that could claim to have zero knowledge of the data on their servers while still satisfying the demands of the law.
Supposing that's the case, maybe what it would demonstrate is that no matter how you slice it, trying to offer a fully encrypted, no-knowledge solution for storing user data is fundamentally incompatible with societal demands.
But since Apple didn't provide such an explanation, we can only guess what their strategy is. They could have done a much better job of describing their motivations, instead of hoping that public sentiment would let this pass the way all the other scanning mechanisms have in the past.
> Since this only happens for photos on their way to iCloud, in your analogy it would be like searching the bags when you put them in the cab to go to the airport.
I think this would be more like if the TSA installed scanners in everybody's houses that scan everything as you put it in your suitcase. Yeah, it only scans things on the way to the airport, but the scanners are in my house. I don't want them there. My house is my domain.
It's the same way with my phone. I don't want the government/Apple evil detector running on my phone. I know that a false positive is unlikely, and I know that Apple and the government have made it very clear they will only look for Real Evil. I still don't want the scanners on my phone.
> The only one I agree on is the image scanning for CSAM. The idea of a device I own acting as a state informer using AI to detect what it thinks is a crime is not my idea of a step forward.
There's also a convenient place to turn it off: CSAM scanning doesn't happen if you don't use iCloud photos/files syncing.
>This is effectively a virus scanner. Files are hashed (in a fancy way), compared against known hashes, and matches are reported
Yeah, with the small difference that the virus scanner reports to you, whereas this scanner reports to Apple or the authorities.
A virus scanner's purpose is to alert you to viruses on your machine; the purpose of Apple's scanner is to engage in blanket surveillance and treat ordinary users as potential consumers of CSAM by default.
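To make the distinction concrete, here's a toy contrast in Swift. All names are hypothetical, and the matching step is deliberately identical in both cases, because the disagreement is only about where a match gets reported:

    import Foundation

    enum ScanResult { case clean, match }

    // Both models compare a file's hash against a set of known-bad hashes.
    func scan(fileHash: Data, knownBadHashes: Set<Data>) -> ScanResult {
        return knownBadHashes.contains(fileHash) ? .match : .clean
    }

    // Antivirus model: the verdict stays on the machine and is shown to the owner.
    func handleAntivirus(_ result: ScanResult) {
        if result == .match { print("Warning: known malware found. Quarantine?") }
    }

    // Client-side CSAM model: the verdict is packaged up and sent off-device for review.
    func handleCSAM(_ result: ScanResult, upload: (Data) -> Void) {
        if result == .match { upload(Data("encrypted safety voucher".utf8)) }
    }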
Nothing about this is privacy preserving. Privacy would be preserved if Apple refrained from touching any of the information that belongs to me and didn't treat their customers like potential criminals. Imagine you rent a parking space for your car, and at random intervals, for no reason at all, with nothing suspicious ever having happened, the owner comes up, opens your trunk, and rummages through it to check for child porn. That's what Apple is doing.
Since when has renting storage space ever entitled anyone to check what the customer puts in the storage? Do you expect the bank clerk to crawl through your personal safe deposit box as well to prevent crime?
> Honestly I would have been fine with Apple scanning all my photos after they are uploaded to iCloud
Apple has already been doing exactly this for years. Now they are going to check the photos right before they are uploaded to iCloud, which is actually more privacy-friendly than what they do now. It also lets Apple turn on E2E encryption for iCloud Photos if they want.
I understand the 'what if' and slippery slope arguments, but wow, there is so much misunderstanding in this thread. Apple makes the OS and doesn't need a fancy system to scan all the files on the device if that's what they want to do.
I highly suggest reading this: https://www.apple.com/child-safety/ and the associated PDFs. Apple PR hosed this up by putting 3 distinct features on one page that people seem to be conflating.
>Why would apple have released a whole white paper on the feature if the plan was
Isn't it obvious? Some guy using an exploit would install a debugger on his iPhone, find this scanning process, and Apple PR would take a hit. But by presenting it as a safety feature, Apple can do the same thing and gain PR: the guy with the debugger will see that files are scanned and hashes are sent to the server, but he can't prove that the config file won't be updated in the future (for everyone, or just for some), that the hashes won't be pulled from a different database, that the threshold won't change, or that the "is iCloud enabled" flag won't be ignored.
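A sketch of that worry in Swift; the config format and field names here are made up (Apple hasn't published anything like this), but it illustrates that every safeguard the person with the debugger can observe is just a value that could change in a later update:

    import Foundation

    // Hypothetical, server-supplied scanning configuration.
    struct ScanConfig: Codable {
        var requireICloudPhotos: Bool   // today: true; a future update could flip it
        var matchThreshold: Int         // could be lowered silently
        var hashDatabaseURL: URL        // could point at a different, broader database
    }

    // What an observer with a debugger sees is only today's behavior of this check.
    func shouldScan(config: ScanConfig, iCloudPhotosEnabled: Bool) -> Bool {
        return iCloudPhotosEnabled || !config.requireICloudPhotos
    }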
What makes no sense, and what Apple failed to explain and fanboys also failed to convince me of with any evidence: why the fuck not scan the images in iCloud only, since we are pretending that the images are scanned only if iCloud is enabled? (There is no credible evidence that this is part of any plan to offer some kind of super encryption that Apple has no keys for.)
> With regards to CSAM, I think the same applies. After all the CSAM issue only exists because the US government has decided that invasive monitoring is the only way to counter CSAM.
No it hasn't. There is no legislation that the government passed or enforces that says Apple must scan people's private data on their devices for CSAM. Apple decided to do that all on their own.
Did we? I suspect the real issue is that the original client-side proposal had a lot of holes. What if the bad guys upload CSAM which hasn't been tagged by NCMEC yet? What if they then delete it from the device (but keep it on iCloud)? Or what if they zip the CSAM images and back that up?
In order to be even semi-effective, the client-side scanning has to be more invasive, or they have to implement server-side scanning too. Apple may well be looking into whether they can implement this scanning without even more backlash.
> I’m not sure I understand Apple’s logic here. Are iCloud Photos in their data centers not scanned? Isn’t everything by default for iCloud users sent there automatically to begin with? Doesn’t the same logic around slippery slope also apply to cloud scans?
I don’t see the problem with this status quo. There is a clear demarcation between my device and their server, each serving the interests of its owner. If I have a problem with their policy, I can choose not to entrust my data to them. And luckily, the data storage space has heaps of competitive options.
> What I don't get is what prevented these things from happening last month? Apple controls the hardware, the software, and the cloud services...
Yes, proprietary black-box hardware and software is poor from a user-privacy perspective. But if Apple had begun on-device scanning of content without announcing it, I'd imagine someone would eventually notice the suspicious activity and investigate.
With Apple's announcement, the scanning will just be something that Apple devices do. Nothing to worry about. And, no way for anyone to independently verify that the scope of the content being scanned has not been secretly increased.
As for iCloud, if your content is not encrypted on the device in a manner where only you have the keys, any cloud storage is suspect for scanning and data mining. But on-device scanning is a back door around E2E encryption: even on-device encryption with keys only you control is thwarted.
>since it rationalizes a push for remote attestation
No, it doesn't. By this logic, Apple should have gone through with its plans to implement CSAM scanning for iCloud. Except customers complained and they abandoned it. Then they debuted E2E-encrypted iCloud storage two years later, completely antithetical to the school of thought used by CSAM-scanning advocates.
>This potentially means all of iCloud, not just photos, could start to use E2E encryption as well - which is fantastic.
They have to scan things that are not photos. What if the bad guys just zip their photos and upload that?
There also needs to be a solution for CSAM uploaded before NCMEC has had a chance to tag it, especially for cases where the bad guys upload their CSAM and then delete it from their iPhone. What happens then: the bad guys get E2E encryption and nobody can find them? There has to be a technical solution in mind for this, and everything I can think of has implications (have the iPhone store hashes of deleted images? would it be enough to also scan during download?).
IMHO, any serious attempt to find CSAM using Apple's client-side approach requires more scanning, and Apple not being upfront about that makes me trust them less. Also, the moment they expand the scanning, we should think carefully about whether there actually are any privacy benefits.
> Yes, you can turn off iCloud Photos to disable Apple’s scanning
To me, it is such a weird implementation. The feature is about scanning on-device data, not data in iCloud, but for some reason disabling iCloud is the way to opt out? Did I miss any technical details?
Apple has always been pretty OK with handing over iCloud data to the authorities. Not only have they already stored Chinese users' iCloud data with a state-controlled entity; this was also their whole argument in the San Bernardino case -- on-device data is the user's sacred privacy, but if the shooter's phone had uploaded data to iCloud, they were willing to send a copy to the FBI in a heartbeat.
This makes me wonder, is iCloud the underlying technical boundary for privacy?
> They (and Google, Microsoft, Facebook, etc.) are essentially mandated reporters; if CSAM is on their servers, they're required to report it.
I don't think that's correct. My understanding is that if they find CSAM, they're obligated to report it (just like anyone is). I don't believe they are legally obligated to proactively look for it. (It would be a PR nightmare for them to have an unchecked CSAM problem on their services, so they do look for it and report it.)
Consider that Apple likely does have CSAM on their servers, because they apparently don't scan iCloud backups right now. I don't believe they're breaking any laws, at least until and unless they find any of it and (hypothetically) don't report it.
>You were already sending your unencrypted photos to Apple
No, I wasn't. The whole point here is that Apple is not scanning server-side; they've built the functionality to scan device-side. You need never use iCloud in any way whatsoever for Apple's new scanning tech to be used against you. That is a major difference.
>Apple and all other service providers are required by US and most other countries laws to do searches on their servers.
No, they are not, even if that were the new thing Apple is doing, which it isn't. If a company builds server-side scanning, then it may have to fulfill certain requirements, but companies are not required to scan in the first place, even if many choose to do so. Apple did already scan uploaded photos and voluntarily chose not to offer E2EE for iCloud data in order to please law enforcement agencies, but that's a voluntary choice by Apple. This new client-side scanning is a different beast. Please try to gain even the slightest fucking clue what you are talking about before spouting off on something so important.
Edit: to add, for those interested in more details on the law, the federal reporting requirements are under 18 U.S. Code § 2258A [0]. What you'll see there is a "Duty To Report", and the reason for that framing is to evade Constitutional protections. If the government compelled companies to scan, then beyond any legal challenges (by very well funded actors) and public blowback, as a practical matter those companies would become State Actors for the purposes of 4th Amendment evaluation. However, even if scanning is heavily incentivized, so long as it's "voluntary" courts have repeatedly ruled that third parties can do searches that would be illegal for the government and turn discovered evidence over to the government, which in turn may then use it freely. Walter v. United States (1980, [1]) is a good example, covering the [righteous and just prosecution] of someone transporting "films depicting homosexual activities" after they were mailed to the wrong address and turned over to the FBI, which I'm sure everyone here on HN would applaud and is definitely what they have in mind when they think of client-side scanning in the US. Tim Cook is carrying on that tradition with Pride, no doubt.
Because it's a pretext for further surveillance.