If you send any of those illegal photos to Facebook, Facebook will snitch on you. If you send them to iCloud (under this described system), Apple will snitch on you. It is no different to "scanning your uploads" because it /is/ "scanning your uploads". That it's the CPU in the phone doing most of the work and iCloud servers making the connection, vs Facebook's CPU and cloud doing all the work, makes zero practical difference to anything.
Arguing about where in the combined system the processing happens is rearranging the deck chairs on the Titanic; it doesn't make one part of the system more or less responsible than any other part.
I do agree that it's more private, but I'm not sure it's better.
I'm fine with the idea that if I upload stuff to someone else's server, they may take a look at it and maybe even punish me for what I've uploaded. Certainly if I encrypt the data before I upload it, they can't do that. But if I don't, then it's fine with me if they do.
But my device should not be snitching on me. Yes, this device-side scanning is supposedly gated on enabling uploads to iCloud, but that doesn't really make for much of a distinction to me. And since Apple certainly has the capability, they are likely one secret court order away from being required to scan even photos that aren't being uploaded, at least on some targeted subset of devices.
What distinction are you making with what I stated? What disagreement is there?
Apple only scans photos being uploaded to iCloud Photos. Google scans. Facebook scans. Microsoft scans. Even tiny image-hosting sites scan.
Apple decided to implement this functionality on device, but they could just as easily (with much less fanfare and dissent) have placed it on the ingress to iCloud.
Apple actually isn't legally liable for what users upload until it's reported to them. And they are capable of doing the scanning server-side, since iCloud doesn't use end-to-end encryption.
If scanning is done in the cloud, I can disable cloud uploads and reasonably trust no filter list will get me reported. If it’s done on-device, I have to rely on Apple to uphold their policy of only scanning material uploaded to iCloud. That’s a big difference.
You’ve just described every photo upload service in America, although my understanding was Apple would use a list of hashes of known bad content, not an “AI” as Google does.
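For concreteness, list-based matching looks roughly like this; a minimal sketch using the third-party imagehash library as a stand-in for Apple's NeuralHash (the blocklist entry is made up, and none of this is Apple's actual code):

    # Compare a photo's perceptual hash against known-bad hashes instead of
    # running an ML classifier over the content.
    import imagehash
    from PIL import Image

    KNOWN_BAD = {imagehash.hex_to_hash("d1c4a0b3e2f19087")}  # hypothetical entry

    def matches_known_list(path, max_distance=4):
        h = imagehash.phash(Image.open(path))
        # Perceptual hashes survive re-encoding/resizing, so match on Hamming
        # distance rather than exact equality.
        return any(h - bad <= max_distance for bad in KNOWN_BAD)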
Everyone scans for CSAM. I am not opining on the ethics of scanning photos here; I am suggesting that moving from server-side to client-side scanning had no effect on any of the things you are ranting about. Hence why I do not understand the outrage.
> Working for me means not having software designed to report me to the police for how I use my device.
Let me fix it: “how I use iCloud Photos, a hosted service on Apple’s servers”.
seems like a dumb thing to argue about. everything you're saying is moot as you can just plug your phone into your computer and transfer photos without scanning, or use a self-hosted file upload service, etc.
your compute and phone resources are already used for analytics and other things that you didn't explicitly tell the phone to do.
you also greatly overestimate how much CPU it takes to compare a hash, lol
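for scale, here's a throwaway python benchmark (not a claim about apple's actual implementation) of a lookup against a million 64-bit hashes:

    import random, timeit

    known = {random.getrandbits(64) for _ in range(1_000_000)}
    candidate = random.getrandbits(64)

    # set membership is O(1); this prints well under a microsecond per lookup
    per_call = timeit.timeit(lambda: candidate in known, number=100_000) / 100_000
    print(f"{per_call * 1e9:.0f} ns per lookup")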
fact of the matter is, you're only scanned when you're using apple services.
The difference is huge. Apple’s proposed system scans the photos of everyone who uses iCloud and reports matches to the authorities (with Apple acting as a proxy). It’s a snitch/God system.
Under my proposal, all photos on the device are scanned, and the scan results (hashes) are stored on-device, but no attempt to pin you down is made. Only once there’s reason to suspect that you might be a criminal are the scan results evaluated by the phone, at the request of the authorities, as part of an investigation. This is a black-box system.
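A minimal sketch of what I mean, in Python; every name here is hypothetical, not any real Apple API:

    import hashlib, pathlib

    DEVICE_LEDGER = {}  # photo path -> content hash, stored on-device only

    def scan_photo(path: pathlib.Path):
        DEVICE_LEDGER[str(path)] = hashlib.sha256(path.read_bytes()).hexdigest()

    def handle_warrant_request(known_bad_hashes: set) -> list:
        # Runs on-device, and only when investigators present a lawful request;
        # there is no proactive reporting anywhere in this design.
        return [p for p, h in DEVICE_LEDGER.items() if h in known_bad_hashes]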
You know it’s still only photos being uploaded to iCloud that are scanned right? Or are you just totally unfamiliar with the actual issue? If you are familiar, can you explain the practical difference between these approaches that makes one worse?
As the article states, the scanning only occurred if you had iCloud Photos enabled. From what I recall, it worked like:
1. Use on-device hashing (NeuralHash, not a general ML classifier) to check photos against known CSAM hashes before iCloud Photos upload.
2. Generate a cryptographic safety voucher for the upload and include it in the upload request; the match information stays sealed unless a threshold is crossed.
…the whole point of Apple’s scheme as far as I know was to keep as much of the processing on device as possible while also keeping illegal content off their cloud servers.
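A loose sketch of that flow, folding in the threshold detail from Apple's published design (the real scheme hides the match bit inside private-set-intersection cryptography, elided here; all names are illustrative):

    import hashlib
    from dataclasses import dataclass

    THRESHOLD = 30  # roughly Apple's stated threshold before anything is reviewable

    def perceptual_hash(photo: bytes) -> str:
        # stand-in for NeuralHash; a real perceptual hash survives re-encoding
        return hashlib.sha256(photo).hexdigest()

    @dataclass
    class SafetyVoucher:
        is_match: bool  # in the real scheme this bit is hidden by the crypto

    def prepare_upload(photo: bytes, known_hashes: set) -> SafetyVoucher:
        return SafetyVoucher(is_match=perceptual_hash(photo) in known_hashes)

    def review_triggered(vouchers: list) -> bool:
        # nothing is decryptable or reportable until the threshold is crossed
        return sum(v.is_match for v in vouchers) >= THRESHOLD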
I'm talking from the perspective of a service-owner who handles user-generated content. You typically set up PhotoDNA on two parts of your infrastructure. You scan client-side and refuse uploads to prevent the images from getting on your servers in the first place and then you scan on the backend to catch anything that slipped through. You do the first part so you don't have to call the FBI.
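Stripped of the PhotoDNA specifics (which are licensed and closed, so a plain digest stands in below), the shape of that setup is roughly:

    import hashlib

    BLOCKLIST = set()  # digests of known illegal images

    def client_precheck(data: bytes) -> bool:
        # stage 1: refuse the upload so the image never reaches our servers
        return hashlib.sha256(data).hexdigest() not in BLOCKLIST

    def server_scan(data: bytes) -> bool:
        # stage 2: backstop for anything that slipped through; True -> report
        return hashlib.sha256(data).hexdigest() in BLOCKLIST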
I really don't care about the semantics of data ownership. If Apple wants to scan photos you upload to iCloud so they don't run into a scandal years down the road that "iCloud is being used to distribute CP" then that's their call.
And under the letter of the law, if you find CP that one of your customers uploaded, you can't just delete it without running afoul of mandatory reporting laws.
If you don't choose to upload to iCloud, nothing is uploaded to Apple at all.
If you do choose iCloud upload (most people do), your photos were already being uploaded and stored, and may be available to law enforcement.
If you do upload to iCloud, NOW they will be screened for matches against "known" images in a database, and if you exceed a threshold number of hits, you may be reported. This screening happens on-device.
From what I can tell, Apple will also scan photos in their cloud system (though once on-device scanning is working, less should land in the cloud).
Note that it is HIGHLY likely that Google Photos / Facebook / Instagram and others are already doing, or will soon do, similar scanning and reporting. I've heard millions of reports are filed each year.
My understanding is Apple is not scanning photos on the device, but scanning photos uploaded to their servers, i.e. iCloud. I also understand this is the norm for most (all?) cloud-hosted file and photo storage platforms. I can opt out of having my photos uploaded to iCloud (which I do, though not for security concerns).
Yeah, it's weird. Speaking purely personally, whether the scanning happens immediately-before-upload on my phone or immediately-after-upload in the cloud doesn't really make a difference to me. But this is clearly not a universal opinion.
The most optimistic take on this I can see is that this program could be the prelude to needing to trust fewer parties. If Apple can turn on E2E encryption for photos, using this program as the PR shield against law enforcement that makes it possible, that'd leave us having to trust only the OS vendor.
It would scan photos you were uploading to iCloud, not private photo libraries on your phone. I'm sure you'll agree it's important to correct such a misunderstanding as one of those is a lot more invasive than the other.
As another poster said, it's not a choice of whether or not your content is scanned; it's a choice of where. If you upload pictures to the cloud—which is the only scenario in which Apple's scanning was stated to happen¹—then it's a choice between scanning on your device, which allows for the possibility of E2E encryption, or definitely no encryption and scanning on the server.
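To make that trade concrete, here's a hedged sketch (plain digests instead of a real perceptual hash, and a caller-supplied encrypt function) of why the two options differ:

    import hashlib
    from typing import Callable

    def scan(photo: bytes, known: set) -> bool:
        return hashlib.sha256(photo).hexdigest() in known

    # Option A: scan on the client, then E2E-encrypt; the server only ever
    # stores ciphertext it cannot read.
    def upload_client_scanned(photo: bytes, known: set,
                              encrypt: Callable[[bytes], bytes]) -> bytes:
        matched = scan(photo, known)  # result would travel as a sealed voucher
        return encrypt(photo)

    # Option B: upload plaintext so the server can scan it; E2E is impossible
    # here because the server must be able to read every photo.
    def upload_server_scanned(photo: bytes) -> bytes:
        return photo  # the server runs scan() on what it receives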
At present, Apple doesn't scan photos on the server, but all their competitors do, and I don't doubt for a second that they will eventually start scanning photos as well. The choice is not if, but how, and their client-side solution seems to me to be much more privacy-preserving than server-side scanning.
¹If you don't believe them, that's fine, but given that they have root control over the software running on your phone, your only choice is to either believe them or don't use their phones. Same goes for all other phones.