Apple says it will refuse gov’t demands to expand photo-scanning beyond CSAM (arstechnica.com)
29 points by samizdis | 2021-08-09 10:24:06 | 28 comments



How much is this promise worth?

In flip-flop, Apple bans app used by Hong Kong protestors - https://arstechnica.com/tech-policy/2019/10/apple-approves-t...

Apple deliberately places itself in a position that makes its users powerless against it, then cries foul when governments predictably force it to abuse that position.


Or see the trouble in Belarus and Apple's decision to support the oppressive regime: Apple demanded that Telegram block protester channels in its iOS app.

https://globalvoices.org/2020/08/18/how-one-telegram-channel...


Yeah, I simply don't believe them.

It's /plausible/ that they'd refuse Hungary or Saudi Arabia. But China can shut down Foxconn, and then Apple won't be able to sell iPhones.


Doesn't matter. Let's say the government wanted to find a data leak. They could hash the leaked document and pass it off as CP. It's dumb. There is no excuse for Apple to be hashing and uploading the hashes.
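To make the mechanism concrete, here's a minimal sketch of that kind of opaque hash matching (illustrative Python; Apple's real system uses a perceptual NeuralHash and blinded matching rather than plain SHA-256, and the file names here are made up). The device only tests membership in a list of digests and has no way to tell what any entry actually represents:

    import hashlib

    # Opaque blocklist shipped to the device. Nothing about an entry says
    # whether it came from CSAM, a leaked memo, or a protest poster.
    blocked_hashes = set(open("blocklist.txt").read().split())

    def is_flagged(file_bytes: bytes) -> bool:
        # True if this file's digest appears on the blocklist.
        return hashlib.sha256(file_bytes).hexdigest() in blocked_hashes

    # Whoever controls the list controls what gets flagged: adding the hash
    # of a leaked document takes one line.
    with open("leaked_memo.pdf", "rb") as f:
        blocked_hashes.add(hashlib.sha256(f.read()).hexdigest())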

Oh, like national security letters [1]? GTFO.

[1] https://en.m.wikipedia.org/wiki/National_security_letter


Google is obsessed with surveying me about why I keep responding that I don't feel my data is secure with them. It's 100% rooted in the fact that if the government serves Google an NSL, there's nothing Google can do about it. As long as NSLs exist, privacy and data security don't either.


Can Apple include that promise in the EULA so that it is (at least potentially) legally binding? Otherwise their words are little comfort.

Newspeak: not photos, just all kinds of files on your device.

They literally cannot. They will just get the Lavabit treatment [1] until they comply.

[1]: https://en.wikipedia.org/wiki/Lavabit


See also: "the government argued that, since the 'inspection' of the data was to be carried out by a machine, it was exempt from the normal search-and-seizure protections of the Fourth Amendment."

Are they ready to pull out of China if the CCP comes calling?

You should ask that question at their next shareholder meeting. I have a feeling there will be fireworks over this topic at that meeting.

"In a call today with Apple, we asked if China demanded Apple deploy its CSAM or a similar system to detect images other than CSAM (political, etc), would Apple pull out of that market? Apple speaker said that would be above their pay grade, and system not launching in China."

https://twitter.com/josephfcox/status/1424822688070131721


It would be better if Apple refused to implement client-side scanning altogether.

Making a claim we know they cannot uphold is pointless.

Oh and it's time to update this ad:

https://www.businessinsider.com/apples-ces-ad-las-vegas-misl...


I do wonder when they'll come for our Android devices next?

who is "they"?

Those who would silence those who speak out against repression?

Every AMBER Alert I’ve gotten so far has been about a father in violation of a custody schedule.

(if this seems non-sequitur, you haven’t looked into it)


...until it doesn't.

How do we know the first batch of scanning is only scanning for what they claim? Once the system is in place, providing new hashes or even doing more detailed image recognition becomes much easier.
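For what it's worth, here's a hypothetical sketch of why expansion is cheap once the plumbing exists (invented names, not Apple's code): the reporting path only needs something that answers "flag this image?", so shipping a bigger hash list or swapping in a full classifier behind that interface is a data or plug-in change, not a new system.

    import hashlib
    from typing import Callable, Iterable

    # Anything that looks at image bytes and answers "flag this?".
    Detector = Callable[[bytes], bool]

    def hash_detector(blocklist: set) -> Detector:
        # Today's story: match against a fixed list of digests.
        return lambda img: hashlib.sha256(img).hexdigest() in blocklist

    def classifier_detector(model, threshold: float = 0.9) -> Detector:
        # Tomorrow's "more detailed image recognition" slots into the
        # exact same interface.
        return lambda img: model.predict(img) >= threshold

    def scan_and_report(photos: Iterable[bytes], detector: Detector) -> None:
        for img in photos:
            if detector(img):
                report(img)

    def report(img: bytes) -> None:
        ...  # placeholder for whatever the reporting back-end does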

Really we need another alert level where every user is required to take their mobile phone out of their pocket and do facial scans of everyone around them. Expand the panopticon to hundreds of millions of devices in realtime. Could solve a whole lotta crime.


The question is why Apple implemented this feature in the first place. There was no reason for them to suddenly expand their image scanning to the devices themselves and risk their position as the self-proclaimed saviors of privacy, and still they did exactly that. There had to be some push from the government behind all of this; otherwise the debacle just doesn't make any sense.

Hard to believe without seeing the pinky swear

> Let us be clear, this technology is limited to detecting CSAM stored in iCloud and we will not accede to any government's request to expand it.

That seems like such a bizarre line to draw. Why would you spend all of this time and effort building such technology only to fight child sexual abuse? There are adult victims of abuse and trafficking. There are murders and rapes that Apple could help solve with this tech. Stolen items or stolen pets could be identified and returned to their rightful owners (for example, scan all phones within 20 miles of a lost dog)... why is Apple refusing to help fight any of these legitimate crimes?


There's actually a reasonable answer to that: just having images of other kinds of crimes generally isn't a crime by itself.

The government won't have to demand the expansion. Apple may very well decide to expand it themselves, to scanning for pirated movies and music. They make money from that, unlike from scanning for CSAM. Of course it will also expand to political "disinformation" and anything else.

Awesome. Surely they’ll back this up with a contract with high penalties, so this doesn’t end up like the many empty promises corporations have made not to betray their users.
