
You keep saying they can scan for whatever they want, but that's not true today, going by Apple's description (which is all we have to go by, and which is what you are mad about).

Yes, the government could order them to change the system. They could also order Apple to create the system in the first place without all the indirection, safety vouchers, human review, etc., that make it inefficient as a direct surveillance tool.
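To make the "indirection" concrete, here is a toy Python sketch of the threshold gating as Apple has described it. The names, data shapes, and plain boolean flag are illustrative only; the real design uses threshold secret sharing and private set intersection rather than a flag the server can read, but the property it is meant to convey (nothing is reviewable below the match threshold) is the same.

    # Toy illustration of the "safety voucher" threshold gating -- names and
    # data shapes are hypothetical, not Apple's actual implementation.
    THRESHOLD = 30  # Apple's publicly stated initial match threshold

    def make_voucher(image_id: str, matched_known_hash: bool) -> dict:
        # One voucher per uploaded photo. The device does not learn whether
        # the hash matched, and the server cannot read any single voucher.
        return {"image_id": image_id, "matched": matched_known_hash}

    def review_possible(vouchers: list) -> bool:
        # Human review only becomes possible once more than THRESHOLD
        # vouchers match; below that, the account's uploads stay opaque.
        return sum(v["matched"] for v in vouchers) > THRESHOLD

    # Example: an account with 20 matching uploads is still below the threshold.
    vouchers = [make_voucher(f"img{i}", i < 20) for i in range(1000)]
    assert review_possible(vouchers) is False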




The post you responded to already addressed that exact point.

>And yes, it's true that the governments could always mandate such scanning before. The difference is that it'll be much harder politically for Apple to push back against tweaks to the scheme (such as lowering the bar for manual review / notification of authorities) if they already have it rolled out successfully and publicly argued that it's acceptable in principle, as opposed to pushing back against any kind of scanning at all.

>Once you establish that something is okay in principle, the specifics can be haggled over. I mean, just imagine this conversation in a Congressional hearing:


As I understand it, the government can't force a company to implement on-device content scanning. However, once a company creates such functionality voluntarily, the government can make it scan all sorts of things and lie about it.

This is where Apple is crossing the line.


Apple has not been ordered by the government to scan users' images.

Because the scanning happens without a warrant, the search would be illegal if the government compelled Apple to perform it. It is only legal for Apple to perform this search because it is in no way compelled by the government.

The government cannot obtain a blanket warrant against everyone. This kind of dragnet scanning has to be voluntarily performed by a private party for it to be lawful.

By all means, please prove to us that the government has secretly ordered Apple and other companies to scan users' private data. If you do, it will result in overturning tons of convictions due to unlawful searches that were concealed through perjury by the government and by tech companies, who have consistently claimed that their scanning is completely voluntary and in their own self-interest.


According to the article, Apple is responding to government and law enforcement pressure to implement the scanning. The state is just laundering its desires through private corporations; there must be a word for this type of system.

What is your actual point here? It feels like we're just playing a game of hypotheticals that are no longer based in reality.

Sure, Apple could update your device to send all your photos unencrypted to them. They could also remotely turn on the mic and spy on all of us. They could also add keyword detection to iMessage and notify law enforcement if you text the wrong words.

I think everyone here understands what Apple could do. Which is why it's a good thing that signs point to Apple not wanting their customers' data, and why Apple refusing government orders that they feel violate their customers' rights is unequivocally a good thing (even if they're doing it for selfish reasons).


This exactly.

I am mostly convinced, based on the technical details that have (slowly) come out from Apple, that they have made this system sufficiently inconvenient to use as a direct surveillance system by a malicious government.

Yes, a government could secretly order Apple to make changes to the system, but it could also order them to just hand over all of your iCloud photos and backups, or send data directly from the Camera or Messages app, or any number of things that would be easier and more useful for the government. If you don't trust your government, don't use a mobile device on a public cell network or store your data on someone else's servers.

But all of that said, there is still a line being crossed on principle and precedent: scanning images locally on device and then reporting out when something "bad" is found.

Apple thinks this is more private than just directly rifling through your photos in iCloud, but I can draw a line between what is on my device and what I send to Apple's servers and be comfortable with that.


This system is a backdoor. We should not be surprised if adding a backdoor reduces pressure to add a backdoor. :)

But in the US it isn't just a backdoor, it's an indirect violation of the user's 4th Amendment rights. If the government ordered Apple to perform this scanning, the scanning would be unlawful and the result would be inadmissible in court. It is only lawful because Apple performs it voluntarily (or at least that is the pretext maintained in court; Apple staff have repeatedly maintained to me that the government is requiring them to do this). The government should not get the ability to execute unlawful searches simply because it's able to use soft power to coerce user-trusted third parties to perform the searching on their behalf.


There is a fairly large difference, the first being that it would massively damage Apple's brand if they started scanning people's phones without permission.

But now that they've built the system to scan things on-device, they can be compelled by a government to scan for other things, and Apple can shrug their hands and say they had no choice.


What are you on about? The comment stated that the authorities have physical access to the device. That means that they can see whatever data you have because it's your device showing it to them.

Also, this idea is even more dystopian than the original concept. The original idea was only going to scan content the user chose to push to Apple's servers. This suggestion is to scan the entire device just because Johnny Law has it in hand. That is so much worse of an idea.


The governments always could do that, but then they'd be the target of the pushback, and there's already significant mistrust of governments wrt surveillance because of how it can be abused.

What we have now is Apple, with its "strong privacy" record, normalizing this. If it succeeds, it would be that much easier for governments to tack other stuff onto it. Or, say, lower the threshold needed to submit images for review. I can easily picture some senator ranting about how unacceptable it is that somebody with only 20 CSAM photos won't be flagged, and won't somebody please think of the children?

And yes, if it comes to that, Apple definitely cannot hold the line. After all, they already didn't hold it on encrypted cloud storage - and that wasn't even legally forced on them, merely "not recommended".


The biggest concern about Apple's system is that they are showing all governments, and everyone else, that it's fine and good to scan for whatever is on my device and report me to the government as they see fit, despite years of previously refusing to implement backdoors or give the FBI access to someone's device.

They have essentially invalidated all those claims, and I can't see how they'll now be able to argue back if the US, or the Chinas of the world, come to Apple saying they have to build more surveillance into their devices.


You are confusing generalised surveillance with an action taken as a result of a warrant.

What's being asked for in no way allows the government to view your private records without a warrant.

The FBI didn't ask Apple to decrypt the device because that would be impossible.


What they built is a way of scanning things on your phone and reporting that to Apple. The chance of multiple governments not eventually passing laws to force this into scanning for whatever they wish is low. Before this, Apple could have fought back on privacy terms, but now its argument will be much weaker.

edit: It's also a model-based scanner, so it scans for types of things and similar things instead of explicit copies, which makes it an even more powerful tool for governments than a simple exact-match scanner.
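To make that distinction concrete, here is a toy Python comparison. The average-hash below is a stand-in for illustration, not Apple's NeuralHash: an exact cryptographic hash only matches byte-identical files, while a perceptual hash matches anything visually similar within a small Hamming distance.

    import hashlib

    def exact_match(image_bytes, known_digests):
        # Exact matching: changing a single byte yields a different digest,
        # so only literal copies of a known file can ever match.
        return hashlib.sha256(image_bytes).hexdigest() in known_digests

    def average_hash(pixels):
        # Toy perceptual hash (not NeuralHash): one bit per pixel, set when
        # the pixel is brighter than the image's mean brightness.
        mean = sum(pixels) / len(pixels)
        bits = 0
        for p in pixels:
            bits = (bits << 1) | (p > mean)
        return bits

    def perceptual_match(pixels, known_hash, max_hamming=5):
        # Near-duplicates (recompressed, resized, lightly edited) differ in
        # only a few bits, so they still match within the distance budget.
        return bin(average_hash(pixels) ^ known_hash).count("1") <= max_hamming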


The proposal is the exact opposite of how you characterize it.

> Apple just side-stepped the question of how to reconcile trust in proprietary software and surveillance requests from the government.

They actually inserted themselves into the process, not side-stepped it. They could've kept the status quo (no scanning) and kept supplying iCloud backups to governments in large volumes. As part of this, they're also forced to add a greatly expanded human review function.

> We don't even know what big brother wants to know about you.

They’ve explicitly stated what they’re scanning for and went further to say they will not scan for other types of content. You may not believe them, but this is not a content-neutral or blind justification.


Take your argument one step further: the government knows this and starts issuing warrants for exactly these situations. Now Apple is in a position where they have to violate their users' privacy and share content in order to comply with the search warrants. The new system allows them to comply with warrants without ever knowing, or needing to know, what the contents of users' phones are, and only in cases where it's known that illegal content exists. Apple can't divulge any other content from users' devices because it doesn't have it or know what it is in the first place.

Apple: “we aren’t going to scan your phone”

>>

Apple: “we are going to make a tool that can scan your phone”

>>

Apple: “Sorry, the government is forcing us to use this tool to scan your phone”


Apple's decision is bad for the simple fact that it assumes everybody is guilty until they have been scanned, inspected, and proven clean. Until now, you would have to be under investigation for your data to be monitored or inspected. But now everybody is a suspect and has to be under scrutiny, which is really bad and will lead to abuse of the system in the future.

I respectfully disagree. In my mind, the fact that Apple employees do first-level filtering is a minor problem.

The main problem is the precedent this sets: on device scanning is now possible.

Before this, if a government asked Apple to scan all phones for something, Apple could say "we're sorry, but we don't have the capability" and they could not be compelled by legal means.

Now, a large part of that argument has been eroded. Apple may have added in a few hurdles, but the first crucial step, an on-device surveillance capability, has been installed and is on a path to being normalized.

It doesn't matter that they don't do this yet. We are undeniably closer to direct phone surveillance than we have been before.


Wanting to know as a parent, and the way Apple is going about this are two different issues.

The government also wants to know about potential terrorist attacks. Why not scan all our phones for all kinds of data to protect innocent people from being killed by mass shootings?

That's nonsense. I'm saying that as someone deeply locked into Apple's ecosystem, which is pissing me off.
