
I really wish people – especially those on HN – would take a broader look at what Apple is proposing and better understand the forces at play before being so critical of the tech. I understand the initial knee-jerk privacy reaction, but now that there has been time for people to learn the facts, I remain amazed at how rarely they come up in these threads.

First, you hear a lot of people, including Snowden (while contradicting himself), say this isn't really about CSAM. That point is absolutely correct. This is ALL about two things, each addressed below:

1. Legal liability, and the cost of processing as many subpoenas as they do.

Ultimately, Apple has the keys to the data they store on their servers. They could easily encrypt all the data with on-device keys before uploading, to ensure they can't actually see anything. But this would cause a huge backlash from law enforcement and prompt Congress to pass legislation mandating backdoors. In fact, Apple (and big tech generally) has been trying to hold off that legislation since at least 2019, when they met with the Senate Judiciary Committee [1].

Quote from EFF article:

> Many of the committee members seemed to arrive at the hearing convinced that they could legislate secure backdoors. Among others, Senators Graham and Feinstein told representatives from Apple and Facebook that they had a responsibility to find a solution to enable government access to encrypted data. Senator Graham commented, “My advice to you is to get on with it, because this time next year, if we haven't found a way that you can live with, we will impose our will on you.”

Apple is doing exactly what Graham told them to do. They have come up with a system that manages to increase security for most users by ensuring that nobody - not even Apple - has the decryption keys for your data, while also satisfying law enforcement to the degree necessary to prevent really harmful anti-privacy legislation. They managed to do it in a really creative way.

It's not perfect, of course. There are plenty of people with valid concerns, such as the potential for hash collisions, how a country like China might try to abuse the system, and whether Apple would give in to that pressure (as they have in the past). All of that is valid, and I'm glad to see Apple stop and examine the complaints before pushing the release. But strictly on the topic of privacy, the new system will be a massive improvement.

2. User privacy. Yes, everyone thinks this is an invasion of privacy, but I just don't see how. The proposed on-device scanning solution provides MORE privacy than either the current iCloud system (in which Apple can be compelled to decrypt nearly all of your data) or the proposed legislation [2] – MORE privacy even for people found to meet the CSAM threshold!

It seems to me there must be a lot of misunderstanding surrounding the encryption mechanisms Apple has proposed. But having read the technical documents, my view (again, strictly from a privacy standpoint) is that it appears to be extremely sound.

Essentially, there are currently two parties that can decrypt your iCloud data with master keys – you and Apple.

In VERY greatly simplified terms, the new system keeps one master decryption key on your device. Apple, instead of holding its own master key, will use shared-key (threshold) encryption, which requires ALL of the ~31 key shares to be present to decrypt the photos. Apple will hold one of those shares. The other 30 (the "threshold") shares are generated by a hash (of a hash of a hash) of the match found in the CSAM database. If no match is found, the share needed to decrypt that image is never generated. It doesn't exist.

One way to look at this is that the CSAM images themselves are the keys to unlocking the CSAM images. Without them, Apple cannot comply with a subpoena (for photos ... for now). Even people who meet the CSAM threshold can only have the matching CSAM images decrypted. All other photos with no match in the CSAM database cannot be decrypted without access to the suspect's phone.
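To make the mechanism concrete, here is a minimal sketch of the simplified scheme described above. To be clear, this is NOT Apple's actual protocol (which involves NeuralHash, private set intersection and threshold secret sharing); every name in it (derive_share, reconstruct, THRESHOLD) is made up for illustration. The only property it demonstrates is that a decryption share comes into existence solely when an image matches the hash list, so below the threshold there is nothing to reconstruct:

    # Hypothetical sketch, not Apple's real design. Python 3.10+.
    import hashlib
    from functools import reduce

    THRESHOLD = 30  # number of match-derived shares needed, per the comment above

    def derive_share(image_hash: bytes, csam_hashes: set) -> bytes | None:
        """Return a key share derived from the image hash, but only on a match."""
        if image_hash in csam_hashes:
            # "a hash (of a hash of a hash) of the match"
            return hashlib.sha256(hashlib.sha256(image_hash).digest()).digest()
        return None  # no match: this share is simply never generated

    def reconstruct(apple_share: bytes, match_shares: list) -> bytes | None:
        """All shares must be present: Apple's own share plus THRESHOLD match shares."""
        if len(match_shares) < THRESHOLD:
            return None  # below the threshold, the decryption key cannot exist
        xor = lambda a, b: bytes(x ^ y for x, y in zip(a, b))
        return reduce(xor, match_shares[:THRESHOLD], apple_share)

In this toy version the combined key is just an XOR of equal-length shares; the real system is far more involved, but the property the argument relies on – no matches, no key – is the same.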

On the flip side, Apple is bending to Congress's demands by voluntarily sharing information with law enforcement. I can absolutely understand how this could make even perfectly innocent people feel uncomfortable. But once I understand that I get more privacy for myself, while those who deal in CSAM (and are dumb enough to store it in their iCloud account) are exposed, I have to let that logical understanding overcome my natural but unwarranted discomfort. Anything that prevents the government from getting a universal backdoor into everyone's phone is a win, in my opinion.

[1] https://www.eff.org/deeplinks/2019/12/senate-judiciary-commi...

[2] https://www.eff.org/deeplinks/2020/06/senates-new-anti-encry...




The only two reactions to this news seem to be cynicism and praise. I think both voices in response to Apple's letters are valid and useful for improving user security in the future, yet each is incomplete by itself.

Sure, Apple should be praised for refusing to give government agencies the ability to unlock an iPhone, but a significant part of their motivation is not altruistic. It's in Apple's self-interest to make a stand in this case, but we can't always trust corporations to prioritize customer privacy over caving to government pressure.

Similarly, Apple has already admitted that a backdoor exists for all iPhones. In my opinion, this is an inexcusable security hole at best, and at worst an implication that Apple intended at some point to comply with government requests for encrypted information. However, the fact that the FBI has made this request in the first place, and that Apple is in a position to decline (at least initially) and make it public, is a good sign that the three-letter agencies may not be as all-knowing as some may fear.


Apple has stood up to unreasonable government invasions of privacy before. And won. They have the means. They have the skill. They have the incentive (their entire brand is arguably built around privacy), and their users have been pretty clear in voicing their opposition to the CSAM scanning.

Some nebulous political references in opposition to encryption aren't the reason Apple did this in the first place. They had and continue to have plenty of choice in the matter.


I completely agree that backdoors are a terrible invasion of privacy, that law enforcement gained absolutely nothing in this case, and that Apple should be lauded for their bold opposition to these requests.

I'm merely attacking the specific line of argumentation the EFF is using. I consider it both intellectually shoddy and bound to be counter-productive.

(Unfortunately, HN doesn't appear to understand such nuances.)


Apple was concerned about governments using the excuse of CSAM to pass laws which would force Apple to weaken encryption across the board.

Whether this was the right response to such concern is something I’m not unsympathetic towards. Certainly I think it’s reasonable to say that Apple was trying to thread a needle in a way which was never going to please everyone, even if it somehow turns out to have been the least-worst outcome.


Well, I can't expect HN to agree with this article, but let's face it: new technologies pose new political questions.

I think the argument about the iPhone backdoor software is a slippery-slope one. As long as the judicial system uses warrants and the process is transparent, I don't think you can really complain. This is what Snowden was about.

We're talking about homeland defense here, the Islamic State is on the rise, so I would be careful when defending Apple here.

Now, I'm not a lawyer nor a political scientist, but I want to side with the FBI on this one. I'm sure Apple is playing their popularity card here. There are things that are more important and go beyond gadgets built in Silicon Valley. It doesn't necessarily have anything to do with Snowden.

I expect people to disagree with me here, and it's fine.


"Do keep in mind that at least one govt has successfully pressured apple to give up on its privacy"

No company can defend you from your government.

"All it would take"...

That is the slippery slope. If a government is going to say "that's a nice looking hashing system you have there, now we need you to..." they could as easily -- more easily -- have said "that's a nice filesystem you have there, we need you to...".

Hashing files and comparing them against a list is literally a college grad afternoon project. There is absolutely nothing in Apple's announcement that empowers any government anywhere in any meaningful way at all. It is only fear-mongering (or simply raw factual errors as seen throughout this discussion) that makes it seem like it does.
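To illustrate how small that project is, here is a toy sketch. It uses exact SHA-256 hashes of whole files (Apple's NeuralHash is a perceptual hash, which is harder to build but makes no difference to this point), and the function name and arguments are invented for the example:

    # Toy "hash the files and compare against a list" scanner, for scale only.
    import hashlib
    from pathlib import Path

    def flag_matches(photo_dir: str, banned_hashes: set) -> list:
        """Return every file under photo_dir whose SHA-256 appears in banned_hashes."""
        flagged = []
        for path in Path(photo_dir).rglob("*"):
            if path.is_file():
                digest = hashlib.sha256(path.read_bytes()).hexdigest()
                if digest in banned_hashes:
                    flagged.append(path)
        return flagged

Which is the point above: nothing in the announcement gives a government a capability it couldn't already have demanded, or built, trivially.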


It remains one of the clearest examples of law enforcement wanting a friction-free way of getting data out of iOS without Apple. They could have easily obtained the information, and have been doing so for years. They were explicitly trying to generate sympathy for a backdoor solution.

What's sort of surprising to me is how much they overestimated public support for their cause.


I don't misunderstand at all. I just have a different opinion than you do. However, a lot of misinformation is being spread.

The FBI isn't asking Apple to create a backdoor. End of story. They're compelling Apple to open a backdoor that Apple already created. Apple made an insecure system. They can, and I think should, fix it, but on this phone at least the backdoor is there.

So wait to use a slippery-slope argument until it actually becomes applicable. You're so trigger-happy to jump on the privacy, precedent-setting bandwagon that you didn't stop to think about what this case actually is.

Basically, as long as you make insecure systems, the government can (by the Fourth Amendment, no less) demand you let them in if they have probable cause and a warrant. If you make something impossible to get into, and the government starts demanding you stop making devices like that, then we have a problem. But this particular case doesn't get us any closer to that outcome. Not even symbolically.

I think you and everyone else overreacting to this case are seeing gravity that isn't there. You're tilting at windmills.


The real TLDR is at the end of the article:

> … the argument that Apple has enabled a law enforcement backdoor seems to miss what Apple has actually done. Instead of building a system that allows the company to recover your secret information, Apple has devoted enormous resources to locking themselves out.


Apple is caught in a crossfire:

(a) There is pressure from many governments to provide a backdoor for surveillance, or simply to comply with subpoenas that are against human rights.

(b) Complying with local laws generates PR damage. It makes privacy and ethics as a brand strategy look disingenuous.

The solution is, of course, to build a truly secure system in which Apple can't create backdoors. Those services may not be available in some countries, but then it's just a missing service, not a compromised system.


I don’t think Apple has much choice in the matter, and I really wish more people understood why they did this in the first place.

https://www.eff.org/deeplinks/2019/12/senate-judiciary-commi...


They could be fined hundreds of thousands of dollars daily for refusing to act. We recently found out this happened to Yahoo in 2008 via the FISA court [1]. Apple shareholders wouldn't like that.

I'm pretty sure Tim Cook and his lawyer mean it when they say they will comply with the law. Apple engineers need not worry so much about quitting their jobs over this. If I were there I would stick with Tim Cook. Consider,

If Apple loses this case, this becomes a gigantic public debate where we scramble to enact legislation that removes this power from the government.

If Apple wins this case, and the government pursues anti-encryption laws, this becomes a gigantic public debate.

If the case is delayed for 2 years and goes to the supreme court, this becomes a gigantic public debate.

Regardless, since this is in the public sphere, whether-or-not-we-put-back-doors-in-phones is going to become a gigantic public debate.

The only way it doesn't become a big debate is if Obama comes out and says he's been informed on the issue and now realizes encryption in our phones is, on balance, a good thing.

[1] http://www.theguardian.com/world/2014/sep/11/yahoo-nsa-lawsu...


"You should better educate yourself, the article is about seizure with a warrant, with which Apple and Google will no longer be able to help them."

To add to the other great comments:

I understand that it is via warrant. In fact, the article states that Apple said, "It's not technically feasible for us to respond to government warrants [...]".

My concern is perhaps more nuanced. Put aside the warrant issue for a moment. That is to say, where do we draw the line and say, "This type of sweeping, 'open everything up and stop encrypting' request is a violation of 'the right of the people to be secure in their persons, houses, papers, and effects'"?

In summary, my contention is that forcing companies to open up in this manner violates the explicit right of the people to be secure.


> the FBI backed off, probably fearful of the PR consequences.

There was also a PR battle involved and Apple won.

Defending encryption is hard, because it is primarily a PR battle and the enemy always has the high ground. Notice how all these cases hinge on some terrible crime - terrorism, human trafficking, etc. - because the govt then gets to say, "Aha, so who wants to stand up and defend terrorists!? Nobody? That's what we thought, so let's pass this new law then."

But what Apple did (and kudos to their PR team) is turn it around and say it wasn't just a 1st Amendment issue, but also a practical personal safety issue. Not having encryption means being exposed to identity theft and fraud. It is not just something abstract but a specific and real danger that everyone has either experienced or knows someone it has happened to.

Read it here: https://www.apple.com/customer-letter/

It is really a great example of good PR and a good punch back in the encryption battle. It helps sometimes when a tech giant throws their weight behind this.


Here's my take on this

I think this was Apple calling the US government's bluff. I don't think they ever wanted to do anything like this because they know it destroys their claims of superior privacy. I think they have internal plans to roll out E2E iCloud encryption so that in addition to not being able to provide law enforcement the code to unlock any phone, they also won't be able to help law enforcement decrypt anything stored in iCloud, including those phone backups. So Apple sees the incoming fight where government cries foul to the public, making the same tried and true "think of the children" arguments, saying now all pedophiles will be free to save their CSAM on Apple servers. It's the government's way to try to get the public against Apple on this, and this is how Apple neuters that argument.

But I don't think anyone in the government really cares about the CSAM. What they really want is backdoor access to the devices of the high-value targets they actually care about, or just to make their jobs a lot easier in other investigations, but that's a lot harder to sell to the public.

Look at Facebook alone. They made 20 MILLION CSAM reports last year ALONE. That number was astounding to me. I haven't heard anyone discuss the sheer numbers yet. Think about that for a second. That's one company making 55,000 CSAM reports PER DAY – the equivalent of a medium-sized city being reported for CSAM material every day. I don't know how many people work in government handling those reports, but I'd imagine the process of taking in reports, evaluating them, forwarding them to the appropriate authorities, and then making arrests is not one that can be automated away to deal with that kind of volume. And Apple would generate at least as many reports as Facebook, I'd imagine, not to mention all the other sources of CSAM reports out there.

Do we really think there's anyone in law enforcement saying "oh gee, if only we had an ADDITIONAL 55,000 CSAM reports coming in PER DAY, then we'd really be able to get a handle on the problem"? If anything, that just provides even more noise, making it harder to find real signals.
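For what it's worth, the per-day figure above is just straight division of the 20 million annual reports quoted in the comment:

    reports_per_year = 20_000_000      # figure quoted above for Facebook
    per_day = reports_per_year / 365
    print(f"{per_day:,.0f} reports per day")   # ~54,795, i.e. roughly 55,000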

So now that Apple has shown they're willing to call law enforcement's bluff, I think law enforcement has predictably said to them: OK, it's not really about the CSAM, we still want backdoor access for other reasons. And now Apple is reevaluating their response.

The issue for Apple though is that even if they're legally allowed to say no to law enforcement requests to open up backdoors, they're probably being extorted by law enforcement threatening to come down on them for antitrust issues if they don't go along with it. Smaller companies that don't have antitrust issues to worry about would be able to put up a much more resilient fight.


I have a strong suspicion that this whole debate is fake. Here is why:

They already have backdoors in the hardware! The Googles and Apples and so on – the big companies – already willingly work with them.

Apple is happy to pretend they are on the consumers' side, fighting the demands, in order to roll back the damage that the Snowden revelations did to the collaborators. In reality it is the same as before: business as usual. What does it matter even if the device is truly encrypted? They've got all your info while you were using it anyway. This way the people they don't want to have it can't get it, but they still have it.

This debate about adding backdoors, this pretending to care that backdoors are added, is nothing more than an attempt to fool you and me into thinking they don't already have them!

This isn't to say that if they win these fake debates they won't also use the opportunity to outlaw things like Tor and Freenet, just as they made attempts to circumvent copyright protection illegal. That way, if secure hardware does arise, they won't have to worry much about it, since the software will already be illegal, suppressed, and stunted as a result.


Read through it all; it still comes down to "trust us". Apple can sign and authorise an update at any time that will backdoor it, and the government is a stroke of a pen away from forcing them to, all completely silently.

I get that there's benefit to what they are doing. But the problem with selling a message of trust is that you absolutely have to be 100% truthful about it, and their failure to be transparent that people's data is still subject to access like this poisons the larger message they are selling.


I think I missed the part where anybody asked Apple to build a backdoor into every phone that could be accessed without appropriate control from the authorities and without passing through Apple each time. Of course I'm not saying that your data should be uploaded daily to a government's server for anybody with a badge and free time to spare to look through.

OK, but now you've said that the precedent established by Google and others has already moved legislation requiring terrible invasions of privacy far along. You started by saying Apple's technology (and, in particular, its framing of the technology) brought new legal risk. What I'm hearing instead is that the risk would be present even in a counterfactual world where nothing was announced last week.

At this point in the discussion, people usually pivot to scope creep: the on-device scanning could scan all your device data, instead of just the data you put on the cloud. This claim assumes that legislators are too dumb to work out that if their phone can search for dogs with "on-device processing," it could also search for contraband. I doubt it. And even if they are, the national security apparatus will surely discover this argument for them, aided by the Andurils and NSOs of the world.

As I have repeatedly said: the reaction to this announcement sounds more like a collective reckoning of where we are as humans and not any particular new risk introduced by Apple. In the Apple vs. FBI letter, Tim urged us to have a discussion about encryption, when we want it, why we want it, and to what extent we should protect it. Instead, we elected Trump.

