Read through it all, it still comes down to "trust us". Apple can sign and authorise an update at any time that will backdoor it, and the government is the stroke of a pen away from forcing them to, all completely silently.
I get that there's benefit to what they are doing. But the problem with selling a message of trust is that you absolutely have to be 100% truthful about it, and failing to be transparent that people's data is still subject to this kind of access poisons the larger message they are selling.
The big problem in my view: earlier, the government had to get their hands on the device to do anything about the data, even if that meant forcing Apple's hand legally. Now, with this, the government only has to force Apple's hand once and they're in. I don't think the question is whether that will happen; it's how soon. Does Apple know it? Do they know that governments, or at least the US government (it's a US company after all), are going to make them bend, and did they devise deniability in advance? And when that happens and it becomes public, they'd just say: you see, we really tried.
As others have said - it's a super slippery slope; I'm not sure whether it's intentional.
Apple will obey government orders to give data they have and can access.
No government order short of targeting a specific backdoored update to a specific person will allow them to give data they can't access.
And if you're doing something that can make a TLA force Apple to create a targeted iOS update just for you, it's not something regular people can or should worry about.
Apple keeps normal people safe from mass surveillance. Being protected from the CIA/NSA requires going Full Snowden, and that's not a technological problem: you need to change the way you live.
I'm not OP, but the gist of his argument is you cannot be secure against the government if you cannot be secure against Apple first. If Apple continues to force people to trust it, the government will subvert this trust via legal means.
I find the argument quite cogent. It's not getting lost in the weeds, but generalizing the problem; instead of just fixing this bug, why not go ahead and fix the whole class of possible bugs?
You missed the point. They claimed it was impossible for them to help the government obtain data off the devices. It clearly was not. When this was discovered, they simply disappeared this false claim from their marketing material (privacy page) without even admitting to their lie. Can you name any other American tech company that has lied like this and then covered up their lie? That is why I consider Apple untrustworthy.
Also, there was no backdoor requested, but that is a separate issue. If Apple could install a backdoor, that would make their initial claim even more of a lie.
That isn't the argument at all. What Apple can't have happen without major reputational damage is getting caught doing it without telling its customers.
Thus this is seen as a marketing backdoor to the actual backdoor: a ploy that later gives Apple the plausible deniability argument that the government is making them do it (for other forms of content), even though Apple created the means (on-device scanning) for the government to exploit.
Why limit it to content that will be uploaded to iCloud? Why limit iMessage scanning to kids? Why not also let the government know?
After the Pegasus news showed that a relatively small private company in a very small country could have complete remote access to your phone and sell that access on the market, do you really think that a powerful government like China or the US can't already do it? How could you be this naive? Especially after the Snowden revelations.
And about Apple having a change of heart in the future and using something like this against you: have you stopped to think for one second and considered that they have complete control over the updates your phone receives?
They can quietly do whatever they want without you ever knowing, and you can't do sh*t about it. So drop the concerned citizen act. We all abdicated privacy a long time ago, when we sold our souls for cool free services like Google and sexy gadgets.
At the end of the day the "slippery slope" argument boils down to one thing: do you trust your institutions? Any law or service can be abused, and the only thing preventing that is the integrity of the people in your society and the institutions, both public and private, that they work for. The exact same law that is used to fight corruption in a great country like Norway or Denmark can be abused to persecute political opponents in a shitty country like North Korea or China.
If the answer is no, I don't trust my institutions and my people, then you have a far more severe and fundamental problem than a specific law or service. In that scenario, focusing on a specific issue is like trying to cover the sun with your hands.
One, software sucks and is full of security holes, so whether or not Apple intends to leak any other data is irrelevant, because other data will leak regardless.
Two, you're assuming this is the only thing they are going to intentionally do with your data. That will change. In an infinite future, things change. Every decision is eventually overturned, and it's luck of the draw which direction that change is in.
Three, the U.S. defence apparatus has proven time and again that they will violate the law and use whatever means necessary to spy on who they want, and this is another kitchen window for them to climb through.
I think it's just PR bullshit. Apple has been known to abuse the law. I'm sure some government gave them the green light to lie about this and have another backdoor.
We are just supposed to trust Apple. It's a big problem with the world we've created, we have these unaccountable (to the public) private entities. Their interests are not our interests. We know they share data with the US government, since the PRISM revelations. There's nothing that makes me think that has stopped.
> "Apple has never worked with any government agency from any country to create a “backdoor” in any of our products or services. We have also never allowed any government access to our servers. And we never will."
IMO Turning over data they have stored in response to government requests while working to reduce the data they have access to is rather different from using it to build profiles about their customers and sell their eyeballs to third parties.
I understand the hesitation here, but fundamentally this is like trying to close Pandora's box. If something is technically possible to do AND governments demand it be done, it will be done. If not by Apple, then by someone else.
Rather than complain about it, I am interested in what alternative solutions exist, or how concerns regarding privacy and abuse of this system could be mitigated.
What I respect about Apple's approach to this is their commitment to not being able to access your data themselves anyway. That way, the NSA can hit them with whatever secret court orders they like; Apple cannot help them. They're not completely there yet, but they appear to be working toward it, and nobody else seems to be even trying.
I think I missed the part where anybody asked Apple to build a backdoor into every phone that could be accessed without appropriate control from the authorities and without passing through Apple each time.
Of course I'm not saying that your data should be uploaded daily to a government's server for anybody with a badge and free time to spare to look through.
But what I'm getting at is that this is exactly what Apple is trying to fight.
A government could already coerce Apple into handing over iCloud data.
The cryptography at play here, combining Private Set Intersection and Threshold Secret Sharing, is a clear step toward making it as hard as possible for any institution to abuse this, for exactly that reason.
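For anyone who hasn't looked at what Private Set Intersection actually does, here's a toy Diffie-Hellman-style PSI sketch in Python. It only illustrates the general idea (each side learns the overlap and nothing else about the other side's set); it is not Apple's construction, the 127-bit modulus is far too small for real security, and the item names are made-up placeholders.

```python
# Toy Diffie-Hellman-style private set intersection (semi-honest sketch).
# Illustrative only: not Apple's protocol, and not a secure parameter choice.
import hashlib
import secrets

P = 2**127 - 1  # Mersenne prime; fine for showing the math, far too small for security

def to_group(item: str) -> int:
    """Hash an item to a nonzero element mod P (toy encoding)."""
    digest = hashlib.sha256(item.encode()).digest()
    return int.from_bytes(digest, "big") % (P - 1) + 1

def blind(values, exponent):
    """Raise every value to a secret exponent mod P."""
    return [pow(v, exponent, P) for v in values]

server_set = {"img_hash_A", "img_hash_B", "img_hash_C"}  # the "database" side
client_set = {"img_hash_B", "img_hash_D"}                # the device side

a = secrets.randbelow(P - 2) + 1  # server's secret exponent
b = secrets.randbelow(P - 2) + 1  # client's secret exponent

# Server publishes H(s)^a; client raises those to b, getting H(s)^(ab).
server_double = set(blind(blind([to_group(s) for s in server_set], a), b))

# Client sends H(c)^b; server raises them to a, returning H(c)^(ab) in order.
client_items = sorted(client_set)
client_double = blind(blind([to_group(c) for c in client_items], b), a)

# Exponentiation commutes, so shared items collide; everything else stays blinded.
overlap = {item for item, v in zip(client_items, client_double) if v in server_double}
print(overlap)  # {'img_hash_B'}
```

The design point is that neither side ever sees the other's raw set, only blinded values, which is why this is much harder for an institution to quietly repurpose than a plain "send us your hashes" scheme.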
"Do keep in mind that at least one govt has successfully pressured apple to give up on its privacy"
No company can defend you from your government.
"All it would take"...
That is the slippery slope. If a government is going to say "that's a nice looking hashing system you have there, now we need you to..." they could as easily -- more easily -- have said "that's a nice filesystem you have there, we need you to...".
Hashing files and comparing them against a list is literally a college grad afternoon project. There is absolutely nothing in Apple's announcement that empowers any government anywhere in any meaningful way at all. It is only fear-mongering (or simply raw factual errors as seen throughout this discussion) that makes it seem like it does.
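To make that concrete, the naive exact-match version really is about this much Python. The directory path and the placeholder digest are hypothetical, and Apple's proposal uses a perceptual hash (NeuralHash) rather than a plain cryptographic hash, but the point stands: a government doesn't need Apple's new machinery to demand something like this.

```python
# Rough sketch of "hash files and compare them against a list".
import hashlib
from pathlib import Path

KNOWN_BAD_DIGESTS = {
    "0" * 64,  # placeholder SHA-256 hex digest standing in for a blocklist entry
}

def sha256_of(path: Path) -> str:
    """Stream a file through SHA-256 and return the hex digest."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):  # 1 MiB chunks
            h.update(chunk)
    return h.hexdigest()

def flagged_files(root: str):
    """Return every file under `root` whose digest appears on the list."""
    return [p for p in Path(root).rglob("*")
            if p.is_file() and sha256_of(p) in KNOWN_BAD_DIGESTS]

print(flagged_files("/path/to/photo/library"))  # hypothetical path
```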
I really wish people, especially those on HN, would take a broader look at what Apple is proposing and better understand the forces at play before being so critical of the tech. I understand the initial knee-jerk reaction against a perceived privacy invasion, but since there has been some time for people to learn the facts, I remain amazed that they never come up in these threads.
First, you hear a lot of people, including Snowden (while contradicting himself), say this isn't really about CSAM. That point is absolutely correct. This is ALL about two things, each addressed here:
1. Legal liability, and the cost of processing as many subpoenas as they do.
Ultimately, Apple has the keys to the data they store on their servers. They could easily encrypt all the data using on-device keys, before uploading to ensure they can't actually see anything. But this would cause a huge backlash from law enforcement that would cause congress to pass legislation mandating backdoors. In fact, Apple (big tech) has been trying to hold off that legislation since at least 2019, when they met with the Senate Judiciary committee [1].
Quote from EFF article:
> Many of the committee members seemed to arrive at the hearing convinced that they could legislate secure backdoors. Among others, Senators Graham and Feinstein told representatives from Apple and Facebook that they had a responsibility to find a solution to enable government access to encrypted data. Senator Graham commented, “My advice to you is to get on with it, because this time next year, if we haven't found a way that you can live with, we will impose our will on you.”
Apple is doing exactly what Graham told them to do. They have come up with a system that manages to increase security for most users by ensuring that nobody - not even Apple - has the decryption keys for your data, while also satisfying law enforcement to the degree necessary to prevent really harmful anti-privacy legislation. They managed to do it in a really creative way.
It's not perfect, of course. There are plenty of people with valid concerns, such as the potential for hash collisions, how a country like China might try to abuse the system, and whether Apple would give in to that pressure (as they have in the past). All of that is valid, and I'm glad to see Apple stop and examine all the complaints before pushing the release. But strictly on the topic of privacy, the new system will be a massive improvement.
2. User privacy. Yes, everyone thinks this is an invasion of privacy, but I just don't see how. The proposed on-device scanning solution provides MORE privacy than either the current iCloud system (in which Apple can be compelled to decrypt nearly all of your data) or the proposed [2] legislation – MORE privacy even for people found to meet the CSAM threshold!
It seems to me there must be a lot of misunderstanding surrounding the encryption mechanisms Apple has proposed. But having read the technical documents, my view (again, strictly from a privacy standpoint) is that it appears to be extremely sound.
Essentially, there are currently two parties that can decrypt your iCloud data with master keys – you and Apple.
In VERY greatly simplified terms, the new system keeps one master decryption key on your device, but Apple will now instead use shared-key encryption, which requires ALL of the ~31 keys to be present to decrypt the photos. Apple will have one of those keys. The other 30 (the "threshold") keys will be generated by a hash (of a hash of a hash) of the match found in the CSAM database. If no match is found, then the shared key needed to decrypt that image is never generated. It doesn't exist.
One way to look at this is that the CSAM images themselves are the keys to unlocking the CSAM images. Without them, Apple cannot comply with a subpoena (for photos ... for now). Even people who meet the CSAM threshold can only have the CSAM images decrypted. All other photos that have no match in the CSAM database cannot be decrypted without access to the suspect's phone. (A toy sketch of the threshold idea is at the end of this comment.)
On the flip side, Apple is bending to congress's demands by voluntarily sharing information with law enforcement. I can absolutely understand how this behavior could make even perfectly innocent people feel uncomfortable. But in the context of the understanding that you get more privacy for yourself, while exposing those who deal in CSAM (and are dumb enough to store it in their iCloud account), I have to push my logical understanding to overcome my natural but unwarranted discomfort. Anything that prevents the government from getting a universal backdoor into everyone's phone is a win, in my opinion.
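For anyone who wants to see concretely what "threshold" buys you, here is a toy sketch of Shamir-style threshold secret sharing in Python. To be clear, this illustrates the general idea only; it is not Apple's actual voucher/PSI construction, and the prime, the parameters, and the "one share per match" framing are simplifications made up for the example.

```python
# Toy Shamir threshold secret sharing: any `threshold` shares recover the key,
# fewer reveal essentially nothing. Illustrative only, not Apple's construction.
import secrets

PRIME = 2**127 - 1  # Mersenne prime; big enough for a toy 16-byte secret

def split_secret(secret: int, threshold: int, num_shares: int):
    """Split `secret` so that any `threshold` of the shares reconstruct it."""
    # Random polynomial of degree threshold-1 whose constant term is the secret.
    coeffs = [secret] + [secrets.randbelow(PRIME) for _ in range(threshold - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, num_shares + 1)]

def recover_secret(shares):
    """Lagrange interpolation at x = 0 recovers the polynomial's constant term."""
    total = 0
    for xi, yi in shares:
        num, den = 1, 1
        for xj, _ in shares:
            if xj != xi:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        total = (total + yi * num * pow(den, -1, PRIME)) % PRIME
    return total

# Pretend each share only comes into existence when an image matches the database.
account_key = secrets.randbelow(PRIME)
shares = split_secret(account_key, threshold=30, num_shares=100)

print(recover_secret(shares[:29]) == account_key)  # False: below threshold
print(recover_secret(shares[:30]) == account_key)  # True: threshold reached
```

The point is the same as above: below the threshold, the shares carry essentially no information about the key, so there is nothing for Apple, or a subpoena served on Apple, to decrypt.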
The US gov already has access to all the photos on iCloud, can already force Apple to seed backdoors here and there, and can forbid them from disclosing the request. We had the FBI case where Apple refused to cooperate, but it only stopped there because there were other ways to get to the same result.
So the basic premise of this discussion is that the US gov isn't already abusing its power to snoop on everybody all the time. If we give up this premise, all the discussions make a lot less sense and we're just arguing about different ways to access the same data.
(credit to John Siracusa on ATP for wording this elephant in the room very clearly)
I don't buy into this line of reasoning, because it inverts the relationship. Of course Apple has to hand over access to authorities if there's a lawful reason behind it, just as a private person has to comply with a search warrant. But in this case they hand over their infrastructure to an untrusted party in order to conduct business in the first place.
Again, it's not so much about what the firm can actually do with the infrastructure (ideally, it's a black box that only the user can access with the correct key), but what compromise they're willing to make in order to conduct business. Privacy has become a selling point for Apple, and actions like these signal how serious they are about it.