
I really wish people – especially those on HN – would take a broader look at what Apple is proposing and better understand the forces at play before being so critical of the tech. I understand the initial knee-jerk reaction on privacy grounds, but there has now been time for people to learn the facts, and I remain amazed that those facts never come up in these threads.

First, you hear a lot of people, including Snowden (while contradicting himself), say this isn't really about CSAM. That point is absolutely correct. This is really about two things, each addressed below:

1. Legal liability, and the cost of processing the volume of subpoenas they receive.

Ultimately, Apple has the keys to the data they store on their servers. They could easily encrypt all the data with on-device keys before uploading, ensuring they can't actually see anything. But this would cause a huge backlash from law enforcement and push Congress to pass legislation mandating backdoors. In fact, Apple (and big tech generally) has been trying to hold off that legislation since at least 2019, when they met with the Senate Judiciary Committee [1].
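
To make the "on-device keys" point concrete, here's a minimal sketch of client-side encryption before upload, in Python using the `cryptography` package (my choice of library for illustration; Apple has published nothing like this). The key is generated and kept on the device, so the server only ever stores ciphertext it cannot read:

    # Minimal sketch: encrypt on-device, upload only ciphertext.
    from cryptography.fernet import Fernet

    device_key = Fernet.generate_key()   # generated and stored on-device only
    cipher = Fernet(device_key)

    photo_bytes = b"...raw photo data..."
    ciphertext = cipher.encrypt(photo_bytes)   # this is what gets uploaded

    # The server cannot decrypt; only the device holding device_key can:
    assert cipher.decrypt(ciphertext) == photo_bytes

This is exactly the architecture law enforcement objects to: a subpoena served on the provider yields nothing readable.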

Quote from EFF article:

> Many of the committee members seemed to arrive at the hearing convinced that they could legislate secure backdoors. Among others, Senators Graham and Feinstein told representatives from Apple and Facebook that they had a responsibility to find a solution to enable government access to encrypted data. Senator Graham commented, “My advice to you is to get on with it, because this time next year, if we haven't found a way that you can live with, we will impose our will on you.”

Apple is doing exactly what Graham told them to do. They have come up with a system that manages to increase security for most users by ensuring that nobody - not even Apple - has the decryption keys for your data, while also satisfying law enforcement to the degree necessary to prevent really harmful anti-privacy legislation. They managed to do it in a really creative way.

It's not perfect, of course. There are plenty of people with valid concerns, such as the potential for hash collisions, the ways a country like China might try to abuse the system, and whether Apple would give in to that pressure (as it has in the past). All of that is valid, and I'm glad to see Apple stop and examine the complaints before pushing the release. But strictly on the topic of privacy, the new system will be a massive improvement.
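
For anyone unsure why hash collisions are a concern here: perceptual hashes are deliberately designed so that similar images produce similar fingerprints, which inherently makes collisions possible. Below is a toy "average hash" in Python (using Pillow). To be clear, this is NOT Apple's NeuralHash, which is a learned and far more sophisticated function; it just illustrates the general technique and why distinct images can share a fingerprint:

    # Toy perceptual "average hash" (aHash); requires Pillow.
    from PIL import Image

    def average_hash(path, size=8):
        # Shrink to size x size grayscale, losing almost all detail.
        img = Image.open(path).convert("L").resize((size, size))
        pixels = list(img.getdata())
        avg = sum(pixels) / len(pixels)
        bits = 0
        for p in pixels:  # one bit per pixel: brighter than average?
            bits = (bits << 1) | (p > avg)
        return bits  # 64-bit fingerprint

    def hamming(a, b):
        # Small distance means the hashes treat the images as "the same".
        return bin(a ^ b).count("1")

Because so much information is thrown away, unrelated (or adversarially crafted) images can land on the same fingerprint; the match threshold is one of the mitigations for that.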

2. User privacy. Yes, everyone thinks this is an invasion of privacy, but I just don't see how. The proposed on-device scanning solution provides MORE privacy than either the current iCloud system (in which Apple can be compelled to decrypt nearly all of your data) or the legislation that has been proposed [2] – MORE privacy even for people found to meet the CSAM threshold!

It seems to me there must be a lot of misunderstanding surrounding the encryption mechanisms Apple has proposed. But having read the technical documents, my view (again, strictly from a privacy standpoint) is that it appears to be extremely sound.

Essentially, there are currently two parties that can decrypt your iCloud data with master keys – you and Apple.

In VERY greatly simplified terms: the new system keeps one master decryption key on your device, but Apple will now use threshold secret sharing, which requires ALL of the ~31 key shares to be present to decrypt the flagged photos. Apple will hold one of those shares. The other ~30 (the "threshold") shares are each derived from a hash (of a hash of a hash) of a match against the CSAM database. If no match is found, the share needed to decrypt that image is never generated. It doesn't exist.
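
For readers who want to see the general mechanism, here is a minimal sketch of threshold secret sharing (Shamir's scheme) in Python. This is a toy over a small prime field, not Apple's actual protocol (which wraps its threshold scheme inside a Private Set Intersection construction); the 30/31 numbers below just mirror the figures above:

    # Toy Shamir (t, n) threshold secret sharing; Python 3.8+.
    import secrets

    PRIME = 2**127 - 1  # toy field modulus for illustration

    def make_shares(secret, threshold, count):
        # Random polynomial of degree threshold-1, constant term = secret.
        coeffs = [secret] + [secrets.randbelow(PRIME) for _ in range(threshold - 1)]
        def f(x):
            return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
        return [(x, f(x)) for x in range(1, count + 1)]

    def recover(shares):
        # Lagrange interpolation at x=0 reconstructs the secret.
        secret = 0
        for i, (xi, yi) in enumerate(shares):
            num, den = 1, 1
            for j, (xj, _) in enumerate(shares):
                if i != j:
                    num = num * -xj % PRIME
                    den = den * (xi - xj) % PRIME
            secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
        return secret

    # Each positive CSAM match releases one share; below the threshold,
    # the decryption key cannot be reconstructed.
    key = secrets.randbelow(PRIME)
    shares = make_shares(key, threshold=30, count=31)
    assert recover(shares[:30]) == key       # 30 shares: key recoverable
    assert recover(shares[:29]) != key       # 29 shares: garbage out

The sketch makes the crucial property concrete: with fewer than the threshold number of shares, interpolation produces nonsense, so below ~30 matches there is simply nothing for Apple to decrypt.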

One way to look at this is that the CSAM images themselves are the keys to unlocking the CSAM images. Without them, Apple cannot comply with a subpoena (for photos ... for now). Even for people who meet the CSAM threshold, only the matching images can be decrypted. All other photos, having no match in the CSAM database, cannot be decrypted without access to the suspect's phone.

On the flip side, Apple is bending to Congress's demands by voluntarily sharing information with law enforcement. I can absolutely understand how this could make even perfectly innocent people feel uncomfortable. But given that you gain more privacy for yourself while exposing those who deal in CSAM (and are dumb enough to store it in their iCloud account), I have to let my logical understanding overcome my natural but unwarranted discomfort. Anything that prevents the government from getting a universal backdoor into everyone's phone is a win, in my opinion.

[1] https://www.eff.org/deeplinks/2019/12/senate-judiciary-commi...

[2] https://www.eff.org/deeplinks/2020/06/senates-new-anti-encry...




A great many people have attempted to explain why they are against this particular mechanism for detecting CSAM. I agree with you that Apple's implementation is technically impressive and probably the most private way of performing this scanning on device. However, I disagree that it's more private than the current cloud scanning. If the scanning happens client-side, then I have absolutely no control over what gets scanned and when. If the scanning is server-side, then I can simply not upload anything to the cloud and no scanning happens. I can't avoid client-side scanning the way I can avoid server-side scanning.

I realize this is a simplification of the actual method Apple has implemented, and that as it currently stands it would only scan photos destined to be uploaded to the cloud. If it were guaranteed that this would never change, then I think a lot more people wouldn't have a problem with it. But it will be abused. Every[1] single[2] time[3] this sort of system is implemented "for the children" it gets abused. The slippery slope here is real and well-demonstrated in various countries around the world.

For my part, I have come across an imperfect analogy that nevertheless captures how I feel about Apple's solution. My phone is like my diary. There's nothing illegal in there, but there is material that is deeply personal, private, and in some cases would be terribly embarrassing if the wrong person saw it. As long as I keep my diary to myself and don't let anyone see it, I have nothing to worry about. If I send my diary off to someone known to read diaries, then it's my own fault as much as anything else if it gets read and the intimate details of my life become known.

[1] https://en.wikipedia.org/wiki/Internet_censorship_in_the_Uni...

[2] https://en.wikipedia.org/wiki/Internet_censorship_in_Austral...

[3] https://en.wikipedia.org/wiki/Internet_censorship_in_Russia


I am seeing a lot of these contrived, tedious comments shilling for this company.
