Apple deliberately places itself in a position that makes its users powerless against it, then cries when governments predictably force it to abuse that position.
Or see the trouble in Belarus and Apple's decision to support the oppressive regime: Apple demanded that Telegram block protester channels in its iOS app.
Doesn't matter. Let's say the government wanted to find a data leak: they could hash the leaked document and falsely submit it as CSAM. It's dumb. There is no excuse for Apple to be hashing files and uploading the hashes.
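To make the attack concrete: a hash blocklist is content-agnostic, so whoever controls the list controls what gets flagged. A minimal sketch, assuming a simplified pipeline where SHA-256 stands in for Apple's perceptual NeuralHash and the file names are hypothetical:

    import hashlib

    def file_digest(path):
        # the scanner only ever sees opaque digests, never what the content is
        with open(path, "rb") as f:
            return hashlib.sha256(f.read()).hexdigest()

    # the "CSAM database" is just a set of digests
    blocklist = {file_digest("known_csam.jpg")}

    # nothing in the system itself prevents an operator from adding
    # the hash of a leaked memo and labeling it CSAM
    blocklist.add(file_digest("leaked_memo.pdf"))

    def scan(path):
        return file_digest(path) in blocklist

    print(scan("leaked_memo.pdf"))  # True -- flagged exactly like CSAM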
Google is obsessed with surveying me about why I keep responding that I feel like my data is not secure with them. It's 100% rooted in the fact that if the government serves Google with a national security letter (NSL), there's nothing Google can do about it. As long as NSLs exist, privacy and data security don't either.
See also: "the government argued that, since the 'inspection' of the data was to be carried out by a machine, it was exempt from the normal search-and-seizure protections of the Fourth Amendment."
"In a call today with Apple, we asked if China demanded Apple deploy its CSAM or a similar system to detect images other than CSAM (political, etc), would Apple pull out of that market? Apple speaker said that would be above their pay grade, and system not launching in China."
How do we know the first round of scanning only looks for what they claim? Once the system is in place, providing new hashes or even doing more detailed image recognition becomes much easier.
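That ease is structural: the matching code is generic over whatever hash database it is handed. A hedged sketch (hypothetical names, placeholder digests) of why expanding the scope is a data update rather than a code change users could detect:

    def count_matches(image_hashes, database):
        # identical logic whether the database holds CSAM hashes,
        # protest imagery, or pirated movies
        return sum(1 for h in image_hashes if h in database)

    db_v1 = frozenset({"a3f1", "9c2e"})   # "CSAM only", allegedly
    db_v2 = db_v1 | {"77b0"}              # silently expanded next update

    user_photos = ["77b0", "1111"]
    print(count_matches(user_photos, db_v1))  # 0
    print(count_matches(user_photos, db_v2))  # 1 -- same code, new targets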
Really we need another alert level where every user is required to take their mobile phone out of their pocket and do facial scans of everyone around them. Expand the panopticon to hundreds of millions of devices in realtime. Could solve a whole lotta crime.
The question is why Apple implemented this feature in the first place. There was no reason for them to suddenly expand their image scanning to the devices themselves and risk their position as the self-proclaimed saviors of privacy, and yet they did exactly that. There had to be some push from the government behind all of this; otherwise the debacle just doesn't make any sense.
> Let us be clear, this technology is limited to detecting CSAM stored in iCloud and we will not accede to any government's request to expand it.
That seems like such a bizarre line to draw. Why would you spend all of this time and effort building such technology only to fight child sexual abuse? There are adult victims of abuse and trafficking. There are murders and rapes that Apple could help solve with this tech. Stolen items or stolen pets could be identified and returned to their rightful owners (for example, by scanning all phones within 20 miles of a lost dog). Why is Apple refusing to help fight any of these legitimate crimes?
The government won't have to demand the expansion. Apple may very well decide to expand it themselves, to scanning for pirated movies and music. They make money from that, unlike from scanning for CSAM. Of course it will also expand to political "disinformation" and anything else.
Awesome. Surely they'll back this up with a contract with high penalties, so this doesn't end up like the many empty promises corporations have made not to betray their users.
In flip-flop, Apple bans app used by Hong Kong protestors - https://arstechnica.com/tech-policy/2019/10/apple-approves-t...