Apple's bright idea for CSAM scanning could start persecution on a global basis (www.theregister.com)
41 points by LinuxBender | 2021-08-20 07:42:14 | 207 comments




Does anyone know Apple's internal motives and why they enabled CSAM in the first place? They must have known the consequences, PR and otherwise.

A wise financier once advised me to “never underestimate the stupidity of management”

It's probably not their real internal motive, but I think it is somewhat interesting that they've done it in a way that was publicly discussed, sparking valuable debate over this and similar practices and forcing people, and even governments, to consciously and publicly decide whether they like this or not. I also think it could be a nice way out of this negative PR if they just claim they wanted to open the debate over this apparently not-so-uncommon practice of scanning user data on-device.

/shrugs

Companies try to do stupid things and end up with egg on their face all the time.

They'll backtrack from this and people will forget it ever happened in a few months.


They absolutely won't backtrack from this, the damage is done and it was hardly noticed by mainstream news outlets.

People's goodwill and trust towards Apple used to be overwhelming and viral on HN and in privacy communities. I don't think that's as easy to get back as people are saying - what they had was rare and powerful

What use is all that goodwill if a solution which attempts to thread some incredibly complex trade-offs, in a way that innovates on the status quo in every possible way, is met with skepticism that is rooted in mistrust of government and not Apple (ignoring the even greater number of ignorant or false concerns that have been posted)?

> what they had

I am unconvinced they've lost it. There is a lot of noise right now, but how much of that is coming from people who already hated Apple?

Out in the real world, even people who've heard something about this largely don't care. It's a tempest in a teapot -- granted, a little larger than usual, but it's easy to fall into the fallacy that everyone else should care the way HN does about something. And this topic seems to be pretty divisive, almost political in nature.


Trust was the advantage they had over FB, who will be their main competitor in AR.

Kinda reminds me of GoT and the biggest mistake of the "good guys": they left the dead dragon in the ice, which later burned down the Wall and accomplished the "impossible".

FB might use this to crush Apple in the future.


Agree.

Remember, Google was so nice and held in such high esteem that even valid criticism could be downvoted mercilessly.

Today you can run a karma farm on criticizing Google here on HN.

Google had mountains of goodwill, but with smart people, dedication and hard work one can burn through it like they did.

I'm afraid Apple can manage this too...


My unproven theory is that they're moving away from tech-savvy users (who won't like this) towards soccer mums (who want this sort of thing and all the parental control stuff they've been doing recently, and the weird no-porn-on-Tumblr thing, etc).

The second market is bigger and less discerning. It's a purely economic decision: come and bring your kids to our nice safe child-proof walled garden.


Given that they have over a billion users, they are already a mass product. Was iPhone ever aimed at "tech savvy"?

Well I had a totally batshit-wild theory so feel free to dismiss this...

Remember those NSO iOS security revelations from a few weeks back? It might be quite possible that the government(s, specifically the US) are handing some information over in exchange for this. It would make sense for Apple to reach out to the government and get information about some of the vulnerabilities in exchange for brownie points, because this CSAM system is totally controllable by Apple while unknown security issues aren't.

Now whether Apple designed this only with CSAM and actually ignored/forgot its possible issues.. that's another question.


Wouldn't this system have taken more than a few weeks to develop?

Valid point. They might have been developing the tech for a while, perhaps this was an opportune moment to release it? Anyway I'm just making a wild guess (as I had mentioned), and I'm not really sure why people were surprised when something I'd mentioned as a wild theory, was, in fact, a wild theory. I'm almost certainly wrong but it's still a theory.

Note: not defending Apple's implementation here.

> why they enabled CSAM in the first place?

Apple's privacy measures, such as not scanning your iCloud photos, are what help enable CSAM.

Sexual abusers very often take photos and often upload these to their communities, and Apple has given them a secure device with which to do that. This is becoming increasingly widespread — the volume of reported CSAM grew by more than 50% last year, to nearly 70 million images and videos [1].

Due to Apple's privacy, the numbers look like this: Facebook reported over 50 million combined images and videos, Google reported 3.5 million, Dropbox, Microsoft, Snap, and Twitter over 100,000 images and videos. Apple reported only 3,000 photos in the same period, and no videos. [1]

Sexual abuse is experienced at some point in childhood by around 1 in 9 girls, and 1 in 53 boys. 93% of perpetrators are known to the victim. In 2016, CPS substantiated or found strong evidence to indicate sexual abuse of 57,329 children [2]. That's CPS, meaning in the US alone.

> They must have known the consequences, PR and otherwise.

The PR consequences of that have gone largely unnoticed, interestingly, and it's more so their attempts to curb it that are getting flak from the media.

I suspect this is because we can't create statistics on what we can't detect. There can be no "Apple is enabling, and allowing to continue, the sexual abuse of 50,000 children a year."

[1] https://www.nytimes.com/2020/02/07/us/online-child-sexual-ab... [2] https://www.rainn.org/statistics/children-and-teens


First of all, the topic is CSAM, so images, videos, and the like, not general abuse. I imagine CSAM production is substantially more rare, since abuse is a precondition.

Secondly, I don’t believe those statistics at all. You’re telling me 11% of girls have been sexually abused? That’s preposterous. There’s gotta be some selection bias or misreporting.


I edited the post to add images and videos statistics — it's frighteningly common.

The numbers are correct and it's shocking and heart-wrenching; here's a linked study [1], but there are many.

I went to college in Boston, where the Boston Globe broke a scandal that around 150 Catholic priests in Boston were accused of sexual abuse, with more than 500 victims [2] [3].

It was immensely tough to grasp and accept that in a small city like Boston, that many priests were sexually abusing children.

[1] http://www.unh.edu/ccrc/pdf/9248.pdf [2] https://www.theguardian.com/world/2010/apr/21/boston-globe-a... [3] https://en.wikipedia.org/wiki/Catholic_Archdiocese_of_Boston...


Abuse and assault are distressingly common and underreported. The numbers I'd heard from an acquaintance who does counseling were higher.

None of the stories I've heard about would remotely have been helped by this tech.


You might try actually talking honestly with some women about this some time.

Every woman I know has a story to tell. Nearly all of them would make you weep.

We could do so much better, but then people like yourself just straight up don’t believe it.


Thanks for the downvotes. That will surely change my mind rather than polarizing me even further against these abuse claims.

"Apple's privacy measures, such as not scanning your Cloud photos, is what helps enable CSAM."

this is what baffles me.

I'm sure a good number of iphone customers have their photos automatically uploaded to icloud.

For all intents and purposes, the end result is the same for most people. Apple is scanning your photos, and doing so on your phone feels much more intrusive.


I completely agree.

For what it's worth, Apple has nothing to gain with this much of a privacy stir, especially when they're accomplishing the same level of detection that other companies accomplish with the simple cloud scanning.

While it's possible that something nefarious is going on with Apple and the U.S. government, it's more likely that Apple got really 'big-brained' about this, had a lot of talks of "we want to retain privacy" and the iCloud team pushing against scanning, and so they ended up with this as a "more private" solution.


I'm sorry, you clearly are an apologist in your phrasing

> Apple's privacy measures, such as not scanning your iCloud photos, are what help enable CSAM.

This phrasing implies Apple is somehow complicit for not scanning files AT REST. Files at rest are like files sitting in your closet; nobody throughout history has ever expected them to be automatically scanned, and hopefully they won't be scanned in the future without some sort of warrant.

This is why Facebook has more reports - it's a public photo distribution platform.

> Sexual abusers very often take photos and often upload these to their communities, and Apple has given them a secure device with which to do that.

No. Again very disingenuous. That's like saying "Some people abduct kids and Toyota has given them a vehicle with which to do that." No other device scans files at rest. Any form of distribution (iMessage, cloud, SMS) is already scanned, and should continue to be scanned, at the point of distribution, so this policy does nothing for that.

In short, No - we don't know why Apple is doing this. It's at best a hollow symbolic gesture against child porn (most of these measures scarcely ever catch anybody for child porn) and at worst a tool that will be misused by malicious governments in some form.


> I'm sorry, you clearly are an apologist in your phrasing

This frankly sounds like some authoritarian ideologue phrasing.

I disagree with Apple's choice here, but the commenter asked how Apple enables CSAM.

While your response disagrees with my assessment of how Apple enables CSAM, it doesn't present any alternative theory.

How do you believe Apple enables CSAM?

> It's at best a hollow symbolic gesture against child porn

I have to address this: It's at best an effective measure against child porn, considering Apple is hundreds of thousands to millions short on reported CSAM images compared to other tech companies. There are clearly drastic privacy consequences and potential future overreach, but you can't clearly analyze a policy without acknowledging its benefits.


One speculated motive is that Apple wanted an End-to-End Encryption system for iCloud, but the FBI put immense pressure in 2018 internally to shut that project down because it could spread CSAM, among other things.

Then in 2020, there was the EARN IT Act proposal, which nearly passed and would have required scanning for CSAM on pretty much every online platform that wanted Section 230 immunity.

Apple puts two and two together, realizes Congress is concerned about CSAM's spread and isn't interested in changing, and still wants E2EE on iCloud. OK, put the scanning client-side, then the way is paved for E2EE iCloud because the FBI's biggest argument against E2EE is neutralized, and so is Congress' argument for EARN IT (which would basically have banned E2EE).


Pressure from government(s). We are just one stupid terrorist attack away from this framework acting as a dragnet for catching “terrorists” and “terrorist sympathizers”.

Since this news broke a few weeks ago, I've seen many comments here suggesting that Apple miscalculated and might be surprised by the push-back they're receiving; that some misunderstanding may have occurred.

I find it exceedingly difficult to imagine that one of the most sophisticated companies in the world, with some of the brightest minds out there, did not consider and calculate this precisely; that there is any way any of this has come as a surprise to Apple. Extending Apple the benefit of the doubt does not seem possible in this case.

Yesterday we saw OnlyFans exit the adult industry. Two weeks ago we saw Apple exit the privacy industry.


A popular theory is that this was motivated by the government somehow, and they had to comply by hiding it under the guise of something else.

It could be the other way around. Interesting thoughts:

https://mobile.twitter.com/benadida/status/14247649237170667...

> I don’t see a huge threat to privacy, but I’m not sure how this meaningfully stops CSAM. My guess: Apple is trying to stave off more heavy-handed gov intervention. Thread:


> Apple is trying to stave off more heavy-handed gov intervention.

This is an interesting and charitable interpretation, but if so, then this tactic will fail.

As others here have pointed out: with this capability in place, the only thing stopping them from total intrusion is now merely policy. And that's scary and dangerous.


Yeah, and FYI it is not my position on the issue; I'm adding it because I found the thread curious.

I just can’t even begin to imagine what the real government relations of a gigantic company like Apple are like, however.


I think what is making people uncomfortable is the realization it's always been policy.

You could argue that it's policy to develop or not develop this feature, but if the feature doesn't exist, you could always refuse the update that provides it, whereas if Apple adds tiananmen-square.jpg to this list in China after the feature is out there, then you don't know until you're already screwed.

The list is only updated with new versions of iOS. So if that's a concern, wait until new versions have been compared against old ones before updating; a rough sketch of that kind of check is below.

Downloading the list outside an iOS update would be a new 'feature' just for China. If we're in that mode, then Apple could also have just added scanning for only China.

Finally, the list is an intersection from at least two jurisdictions.

I think there is plenty of good debate to have around this feature, but I also think it's important to start by discussing the feature as it is today.
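
Roughly, the kind of check being suggested might look like the sketch below (hypothetical file names, and assuming the per-version database, or Apple's published root hash for it, can be obtained and dumped as a flat list of hashes):

    import hashlib

    def database_digest(path):
        # One digest over the sorted hash list; any added or removed entry changes it.
        entries = sorted(line.strip() for line in open(path) if line.strip())
        h = hashlib.sha256()
        for entry in entries:
            h.update(entry.encode())
        return h.hexdigest()

    old = database_digest("csam_db_ios_15_0.txt")   # hypothetical per-version exports
    new = database_digest("csam_db_ios_15_1.txt")
    print("unchanged" if old == new else "database changed -- inspect before updating")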


I have to disagree - I think with the ease of extending the feature, and the track record of similar features being extended, we need to consider the long term implications. Not doing so in previous instances is how we got UK ISP filters for CSAM extended to copyrighted material, for example.

Apple doesn't even need to add scanning "just for China", all Chinese data is stored on the servers of a Chinese partner.

Correct, but people keep pointing to China as a state entity that will abuse the new system.

A lot of people who claim that this feature made it “easy” to scope creep don’t appreciate that Apple escrows everyone’s keys for its “end to end encrypted” messaging service. Who would be able to tell if they dropped a couple extra keys in that list? I can think of dozens of security and privacy claims on iOS that are a simple configuration change away from being totally defeated.

There is big talk of a federal anti-trust case against Apple. The rumors have been swirling for months. Apple's CSAM initiative might be an attempt to cool things off at the Justice Department.

https://www.marketwatch.com/story/apples-hot-antitrust-autum...


What does CSAM have to do with anti-trust suits?

What does politics have to do with politics?

Anti-child-abuse measures are regularly utilized to push for measures you otherwise can't establish, most of the time surveillance and censorship. Just look back 10+ years to Germany, where „Zensursula“ tried to establish DNS blocking for the media industry by arguing for child safety. The approach is always the same. Just this time, Apple has so many devices in the market that if this succeeds we'll all be headed for a surveillance disaster.

Given the fact that Snowden’s revelations didn’t generate ANY action from the public, I doubt we'll see any this time.


Two things.

One is political give and take. As their people negotiate with particular political actors, "We'll give you nothing," works much less well than, "We can't give you X, but how about Y?"

The other is that the tech giants are looking unaccountable and antisocial. Privacy absolutism is popular here, but there are real social costs to it, so it's not popular everywhere. If Apple can say, "Yes, privacy is important, but we're not crazy; we agree we don't want to help out child pornographers", then that is much better positioning.


What politician wins marginal votes on the back of: “I didn’t take on the big tech anti trust monopoly, but I got them to scan your photo library for CSAM!”?

Wrong question. It's which politician can get votes by saying, "I took on big tech and now Apple is protecting our children from predators."

In the United States, there is effectively nobody in this position. The committee hearings on anti-trust and encryption have little overlap between interested parties or ideology.

Just off the top of my head there are a few people that come to mind.

“After the 2016 FBI–Apple encryption dispute, Feinstein and Richard Burr sponsored a bill that would be likely to criminalize all forms of strong encryption in electronic communication between citizens”

Source: https://en.m.wikipedia.org/wiki/Dianne_Feinstein

She was on Senate subcommittees that handle all sorts of legal issues for technology companies.

* Subcommittee on Crime and Terrorism

* Subcommittee on Immigration, Border Security, and Refugees

* Subcommittee on Privacy, Technology and the Law

* Subcommittee on Human Rights and the Law (Chair, 117th Congress)

From her gov CV page: “The Judiciary Committee has one of the broadest jurisdictions in the Senate, ranging from criminal justice and immigration issues to antitrust and intellectual property law.“

Source: https://www.feinstein.senate.gov/public/index.cfm/thejudicia...


Feinstein was the head of the FTC’s Bureau of Competition under Obama, a period during which tech anti-trust scrutiny was especially lax. On matters of anti-trust specifically and platform privacy generally she is not the politician who would make the hypothesized deal.

You’re splitting hairs. Feinstein was mostly alone in her convictions but not completely.

Again: > In the United States, there is effectively nobody in this position. The committee hearings on anti-trust and encryption have little overlap between interested parties or ideology.

> there is effectively nobody in this position.

Wrong. There are people and Feinstein is a great example of someone who worked quite hard to kill encryption while also being on various committees handling anti-trust.

If her anti-encryption bill had passed, or if she had been able to push some anti-trust action through the Judiciary Committee, she would brag about it in her re-election campaign.

Feinstein is a perfect model for someone who would gladly tout her legacy while on the Judiciary Committee and equally her accomplishments on a national security subcommittee.

You are wrong sir, now tuck your tail and go away. I’m done wasting my time with you. But now I know you quite well, you are a narcissist and you have to have the final word. So go ahead and prove me right…


Could it be that Apple thought that child pornography was going to get the most sympathy from society for this general approach of scanning libraries? They may have expected some HN push back but maybe hoped for some halfway decent press related to helping combat this abuse.

I ask that question in curiosity but I'm super skeptical because this is actually surveillance.


The implementation also means Apple has plausible deniability if the “CSAM” in their database actually contains images associated with political enemies of whatever regime supplies the source material. How would you know the hash you are testing against isn’t just a Winnie the Pooh? You really can’t.

Really, it’s probably the best way to keep the police state from destroying your business while trying to sleep at night.


How long has the government been telling everyone to implement back doors into their ecosystems that only law enforcement can access?

Isn't this exactly what it is (if limited for now)?


>> that one of the most sophisticated companies in the world

Microsoft Zune. The Super League. 47 Ronin. Wonder Woman 1984. Google+. Big companies make big mistakes. No matter how many focus groups you collect, no matter how many phone surveys you do, things can go wrong.


I read an argument that Zune was essentially MS showing RIAA et al how bad a product that met all their DRM demands would be. No idea if it's true, but I find it interesting.

In that field, the biggest mistake is probably Sony BMG shipping a rootkit on music CDs. That certainly showed the world how bad an RIAA-endorsed product could become. It made Napster-downloaded MP3s the safer option, accelerating filesharing immensely amongst computer owners.

https://en.wikipedia.org/wiki/Sony_BMG_copy_protection_rootk...


Yes, that whole era was about tech companies trying to balance demands from traditional distributors who had lost their power, and consumers who wanted the freedom that digital media provides. There were many missteps and retrospectively terrible products and business models that were tried before we got to where we are today.

I wonder if the same is true re (broadly) government power. Tech companies are now balancing the power governments and other stakeholders would like them to wield with what people will tolerate. It's not really a surprise there are missteps again.


I think a big difference between Zune and Apple's CSAM issue is that there were people (literally dozens of us!) who LOVED the Zune.

There have been people defending Apple's CSAM policy on hacker news as well. It comes down to how you weigh "small harm on everyone, potentially abusable to serious harm in other countries eventually" vs "major harm to a small number of children".

Clearly the majority opinion here (and mine also) is the first of these is worse, but it's not like I can't understand how people who believe the second is worse came to that opinion - I simply disagree with them.


I have defended Apple because they, like everyone else in the industry, were already doing this server-side. They're getting backlash because they are open about what they are doing, not because it's in any way unusual.

The actual negative implications are slightly worse battery life and more network traffic if you use iCloud. The upside is people can inspect the perceptual hash. Also, the phone isn’t reporting anything it doesn’t have the database to compare it to.


Software-wise, Zune was lightyears ahead of iPod. If you lost your MP3 collection on your PC, you could plug in your Zune and download the MP3s right off of it. You still can't do that with iPod/iPhone today.

It's been a while, but I think my Zune was directly exposed as a USB storage device when plugged in.

Hey, WW84 doesn’t belong on that list!

- It’s not that bad (60% metacritic, 59%/73% Rotten Tomatoes)

- It at least broke even ($100MM lost theatrically, but during covid, and made up for by new HBO Max signups)

- The third WW movie is still happening


I think most adults you talk to that don't like it, don't realize it was a children's movie. Compare WW's intro in Justice League where she's breaking skulls against walls against the intro in 1984 where her violence is Disneyified.

Indeed. She was open about it being a tribute to Richard Donner’s style of filmmaking, and he had what’s now an unfashionable sense of what makes a great family movie.

> It at least broke even

Sure? [1] says budget $200M, all-time worldwide $166M - given marketing of $100M (cheap, normally considered 2x the budget), those HBO Max signups would have to be $134M; which seems like a lot considering they only have about 13M subscribers[2].

[1] https://www.the-numbers.com/movie/Wonder-Woman-1984-(2020)#t... [2] https://www.businessinsider.com/wonder-woman-1984-helping-hb...


Fairly sure, using a similar model. New signups driven by the movie were about four million, so retention and lagging signups only needed to contribute another ~10 million subscription months.

Yeah, [1] gives ~2.8M new retail US subs from end of 2020 to March 2021; that in itself would cover $167M if they signed up for 6 months.

[1] https://www.hollywoodreporter.com/business/digital/hbo-max-s...


https://en.wikipedia.org/wiki/List_of_biggest_box-office_bom...

If it is on that list, there were mistakes made. I just pulled two movie titles that, beyond being flops, I myself saw and did not like.


I would say it should be there since that list doesn’t factor streaming revenue.

No way; it was Ballmer trying to copy Apple (again) and thinking being a second mover would work out.

Ironically, I attribute most of the Zune's failings to the Apple cult. It had comparable hardware, software (including the UI), and features. Its integration with the Zune subscription service predated Apple's subscription service by almost a decade and Spotify by a few years. The people who gave it a shot loved it, including myself. The main issue was that it wasn't made by Apple and therefore wasn't "cool."

The main issue was that it was made by Microsoft. Microsoft is many things, none of them cool.

> I find it exceedingly difficult to imagine that one of the most sophisticated companies in the world, with some of the brightest minds out there, did not consider and calculate this precisely; that there is any way any of this has come as a surprise to Apple. Extending Apple the benefit of the doubt does not seem possible in this case.

Why is that so difficult to imagine? Apple's security model has always been "just trust us and don't question it"; questioning their own practices is just not something they do.

That's already what happens with any other Apple software: their own services are off limits in their threat model, explicitly excluded and explicitly trusted. This attitude is also reflected in the security threat model document they published; they never list themselves as a potential threat.

That's just a continuation of how they usually work.


I think you’re overestimating Apple’s forethought. Remember, they gave us the butterfly keyboard.

The butterfly keyboard was originally an IBM ThinkPad invention; other companies later copied the design for their own flagships.

It's easy for a large company, especially its leadership, to be out of touch with their customers. Many instances of this are listed in another reply, but it's not all that unusual. I would definitely believe that Apple thought this solution was more private than the kind of scanning Google and Facebook do. Especially now that they're drinking their own Kool-Aid about being a "services" company where the line between "my device" and "your cloud" starts to blur.

Edit: It's also notable that news of this was leaked before Apple was able to officially announce anything. This means that Apple's marketing department was not able to control the tone/narrative as well as they normally do. The first thing many people heard was "Apple will be scanning your phone" without any of the nuances that came later.


Why do you use the word "Kool-Aid" to describe a fact about objective reality?

iPhone is packed with software literally named "Digital Rights Management"


We've seen a recent wave of big tech companies moving into quasi government roles in relation to censorship and rule enforcement. The apple thing is in keeping with the general trend.

A lot of that is fueled by fear of actual government roles taking charge.

Tech companies have been left unregulated for so long, and governments have now noticed how pivotal they are, so tech companies are trying desperately to head off regulation with self-regulation.

Just look at the mixed reactions to GDPR here on HN to see how controversial it is when governments actually do regulate things; it's not hard to understand why tech companies are eager to head that off.


One speculated motive is that Apple wanted an End-to-End Encryption system for iCloud, but the FBI put immense pressure in 2018 internally to shut that project down because it could spread CSAM, among other things.

Then in 2020, there was the EARN IT Act proposal, which nearly passed and would have required scanning for CSAM on pretty much every online platform that wanted Section 230 immunity.

Apple puts two and two together, realizes Congress is concerned about CSAM's spread and isn't interested in changing, and still wants E2EE on iCloud. OK, put the scanning client-side, then the way is paved for E2EE iCloud because the FBI's biggest argument against E2EE is neutralized, and so is Congress' argument for EARN IT (which would basically have banned E2EE).


If you haven't seen it already, this is a good read about EARN IT and the issues around CSAM scanning in general with respect to the 4th Amendment.

http://cyberlaw.stanford.edu/blog/2020/03/earn-it-act-uncons...


I think most people in general think this is a great idea, tech twitter is an echo chamber.

If iCloud is not end-to-end encrypted, why is this necessary? I mean, why can't they scan content on ingress?

1. They don't want to know anything about the result of the scanning until you have 30 matches. This way is much more private.

2. If they scanned iCloud, they could never start encrypting that.


There is also no transparency if the scope or depth of iCloud scanning increases.

I’ve seen a few researchers raise the idea that moving this scanning to the client is effectively a political prerequisite for enabling iCloud E2E, so I suspect that may be coming.

We have no evidence this is true, and Apple has certainly not taken any of the plethora of opportunities they've had to make this argument. So there's no reason to believe this to be the case.

Remember when Apple pushed U2's new album to everyone as a "gift" and there was backlash?

I think you're underestimating the degree to which companies have their own internal logic that has nothing to do with the external reality. That tendency increases the larger a company grows. If you're in a one-room house, you can always see outside. If you're inside a giant office building, mostly you see the building.

Did somebody raise the concern of push-back? I'm sure. But the moral questions around CSAM are something that was settled long ago internally. When I was at Twitter fighting abuse, the CSAM stuff was a separate group. My boss called them The Department of Mysteries because we almost never saw them or spoke to them. It was led by a serious person, an ex-FBI agent or something like that. They did what they did and we were all ok with it and grateful for it, because that shit is horrific and we didn't want it on our platform and we didn't want to have to deal with it ourselves.

My cousin was a PO for sex offenders, and one of our regular discussion topics at family reunions was how sex offenders were way more technologically savvy than a state parole department. How they really needed more help in making sure offenders weren't reoffending while on parole, while also not forcing them to just not use computers and phones altogether. If even I've heard this, I'm sure that Apple execs have heard it from law enforcement a zillion times.

It's also clear Apple put a lot of thought into addressing the privacy concerns for this. Technologically, it's sophisticated, impressive.

So I can easily believe the people at Apple said, "Sure, there are reasonable concerns, but we think we have addressed them." And that they're surprised by the level of sustained pushback.


Thank you for this perspective. I've never worked at an organization of this magnitude, so I am definitely lacking some perspective.

> It's also clear Apple put a lot of thought into addressing the privacy concerns for this. Technologically, it's sophisticated, impressive.

I'm not sure about this. How is a perceptual hash sophisticated and impressive given that it can be abused by governments demanding Apple scan for political content, etc?
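
For what it's worth, a perceptual hash by itself is a fairly simple idea. This is not Apple's NeuralHash (which uses a neural-network embedding); the sketch below is just a classic "difference hash" to show why visually similar images produce matching or near-matching fingerprints while exact byte-level hashes would not:

    # Classic "difference hash" (dHash) -- NOT Apple's NeuralHash, only an
    # illustration of why perceptual hashes still match re-encoded or resized
    # copies of an image.
    from PIL import Image

    def dhash(path, size=8):
        # Shrink to (size+1) x size grayscale so only coarse structure survives.
        img = Image.open(path).convert("L").resize((size + 1, size))
        px = list(img.getdata())
        bits = 0
        for row in range(size):
            for col in range(size):
                left = px[row * (size + 1) + col]
                right = px[row * (size + 1) + col + 1]
                bits = (bits << 1) | (left > right)
        return bits  # 64-bit fingerprint

    def hamming(a, b):
        return bin(a ^ b).count("1")

    # A small Hamming distance (or equality, in an exact-match design) flags a match:
    # hamming(dhash("original.jpg"), dhash("recompressed.jpg"))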


Apple says their protection against authoritarian governments is that an image must appear in two or more government CSAM databases to be scanned. So if a corrupt politician adds something to the database, it won't be scanned unless a different government adds the same image.

Now, will China follow this? Probably not. But Apple's defense there is that China could have directly ordered them to build this scanning tool at anytime in the last decade anyway. It's not like China has a magical new tool for invading privacy when they could have (and actually have) just ordered tools to be built as desired.
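
The claimed safeguard itself is easy to picture. A minimal sketch, with hypothetical file names (the real database format and sourcing are not public):

    # Toy version of the "two jurisdictions" rule: only hashes vouched for by
    # two independent child-safety organisations are eligible to ship on devices.
    def load_hashes(path):
        with open(path) as f:
            return {line.strip() for line in f if line.strip()}

    list_a = load_hashes("ncmec_us.txt")                # hypothetical exports
    list_b = load_hashes("other_jurisdiction.txt")

    deployable = list_a & list_b   # a hash added unilaterally by one government never ships
    print(f"{len(deployable)} of {len(list_a | list_b)} hashes appear in both lists")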


For sure. It's not clear to me whether Apple's approach is truly sufficient to preserve the privacy and safety of iPhone users that whole governments are out to get. But it definitely looks to me like they thought hard about it and have given it a go.

Maybe Apple already built the tool and was like "let's use this tool we already have but just targets specific child porn images". That actually makes them sound more reasonable.. kinda.

>Now, will China follow this? Probably not.

It's quite possible that countries would do the same thing interested parties on both sides are accused of doing with membership and voting in the International Whaling Commission: find countries that have no interest in the matter either way, and buy their support (or in this case, their addition of images to databases), which can be quite cheap when the matters don't actually affect the country.


Was cooperation between two countries considered?

And of course it's absolutely unprecedented for two governments to ever collaborate /s

"Five Eyes" ring a bell?


> I'm not sure about this. How is a perceptual hash sophisticated and impressive given that it can be abused by governments demanding Apple scan for political content, etc?

As it's set up today, it can't be. The list is the intersection of at least 2 jurisdictions, and is only checked when items are going to iCloud.

Of course this could change, but it's important to recognize what it is now vs. speculation about what it could become.


A perceptual hash by itself is neither sophisticated nor impressive. Apple could have just applied a client-side perceptual hash and auto-reported you and disabled your device. That is not what they have done, and pretending the solution's sophistication is limited to the sophistication of a perceptual hash seems either incurious or disingenuous—I'm assuming the former. Either way, it makes it difficult to engage with your argument.

A searing and possibly legit indictment... what can I say...

> incurious

Hurts to read this. But maybe you're right: my mind kind of shut off when I read the abstract (client side scanning). I can't say I've looked into the details.


If you're not impressed by the perceptual hash, then you should at least be impressed by their implementation of the private set intersection. That's something that not a lot of companies could pull off.

It doesn't make the whole system any less fundamentally flawed, but I think the technical "Wow!" factor certainly helped many people be OK with it. In this view, the shiny new PSI system is a Trojan horse for the totally unaccountable and opaque hash set that they're feeding into it.
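
For anyone curious what "private set intersection" buys you, here is a toy Diffie-Hellman-style PSI sketch. Apple's actual protocol (described in their PSI paper) is far more elaborate and also carries encrypted associated data; this only shows the core blinding trick, with made-up item names and toy parameters:

    # Each side blinds its items with a secret exponent, so only items held by
    # BOTH sides ever become comparable; neither side sees the other's raw list.
    import hashlib, secrets

    P = 2**127 - 1            # toy prime modulus -- not a secure parameter choice
    G = 3

    def to_group(item):
        e = int.from_bytes(hashlib.sha256(item.encode()).digest(), "big")
        return pow(G, e, P)

    def blind(elems, secret):
        return [pow(x, secret, P) for x in elems]

    device_hashes = ["hash_of_photo_1", "hash_of_photo_2"]        # hypothetical
    server_hashes = ["hash_of_known_csam", "hash_of_photo_2"]

    a = secrets.randbelow(P - 2) + 1    # device's secret exponent
    b = secrets.randbelow(P - 2) + 1    # server's secret exponent

    # Device sends its items blinded with a; server blinds them again with b.
    from_device = set(blind(blind([to_group(x) for x in device_hashes], a), b))
    # Server sends its items blinded with b; device blinds them again with a.
    from_server = set(blind(blind([to_group(x) for x in server_hashes], b), a))

    print(len(from_device & from_server))    # -> 1, only the common item collides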


> How is a perceptual hash sophisticated and impressive given that it can be abused by governments demanding Apple scan for political content, etc?

It's pretty sophisticated when you look at everything implemented and also consider the infrastructure / review pipelines that are required. See the link below:

https://www.apple.com/child-safety/pdf/Security_Threat_Model...

It's based on perceptual hashing, but the whole end-to-end system is clearly sophisticated when operating on Apple's scale.


This. I will add to this a little more. A big company is its own country. It has its own customs. It has its own ruling class. It has its own language.

All those things can make things interesting and sometimes misfire in a form of an idea that is not received that well in the real world.


It's worth looking into the privacy concerns that Apple was actually trying to fix here:

1. Photos (and other documents) are currently uploaded to iCloud un-encrypted

2. These photos are already scanned for CSAM after upload

3. Because the photos are not encrypted, at any point, any government can file a court order to release those photos.

4. The court order can require Apple to not notify the user, or the public in general.

5. [Speculation] Such orders might already exist and be somewhat common within Apple

Apple wanted to fix this and introduce end-to-end encryption on all photos uploaded to iCloud, but scanning for CSAM was non-negotiable (due to internal or external politics?). They must keep doing it.

So they implemented this big mess of a workaround: scan for CSAM before upload and attach a cert with a decryption key only to photos that match, so that they could later human-verify once a user had enough matches (a toy sketch of that threshold mechanism is below) and weed out false positives (which Apple acknowledges will happen) before notifying law enforcement.

Because of the direction that Apple came from, and how much effort they put into designing this system to maximize privacy, they saw this solution as a large privacy win over the existing situation. It's not surprising Apple might have been blinded to the privacy concerns of doing AI scanning of photos on user devices, they were looking at it from the wrong angle.
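
A minimal sketch of that threshold idea, using Shamir secret sharing over a prime field (toy parameters; Apple's real construction differs in many details):

    # The voucher-decryption secret is split so the server learns nothing until
    # it holds at least T matching vouchers.
    import secrets

    P = 2**61 - 1     # prime field (toy size)
    T = 30            # threshold of matches before anything can be decrypted

    def make_shares(secret, n, t=T):
        coeffs = [secret] + [secrets.randbelow(P) for _ in range(t - 1)]
        poly = lambda x: sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
        return [(x, poly(x)) for x in range(1, n + 1)]

    def reconstruct(shares):
        # Lagrange interpolation at x = 0; correct only with >= T shares.
        total = 0
        for i, (xi, yi) in enumerate(shares):
            num = den = 1
            for j, (xj, _) in enumerate(shares):
                if i != j:
                    num = num * (-xj) % P
                    den = den * (xi - xj) % P
            total = (total + yi * num * pow(den, P - 2, P)) % P
        return total

    key = secrets.randbelow(P)
    shares = make_shares(key, n=100)              # e.g. one share per uploaded voucher
    assert reconstruct(shares[:T]) == key          # 30 matches: key recovered
    assert reconstruct(shares[:T - 1]) != key      # 29 matches: still opaque (w.h.p.)

Below the threshold the shares reveal nothing about the key, which is the basis for the claim that Apple learns nothing about accounts with few or no matches.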


Was there a recent announcement that Apple will implement e2e encryption, or is that speculation?

We know they were working on it several years ago, but abandoned those efforts over two years ago due to FBI objections [1]

Maybe they have restarted those efforts, but with some modifications (like this?). It's hard to tell since Apple is a very secretive company, but the exact design of this system does strongly suggest end-to-end encryption of photos. (Why attach a decryption key to positive matches if the photo isn't encrypted?)

I guess it's also possible this was a project kicked off years ago, before they abandoned the idea of end-to-end backups, and it lived on as a zombie.

BTW, while I see Apple's PoV here, I don't think I agree with it. I think I'd rather they stick with the status quo. I'm curious what other people prefer: un-encrypted and subpoenable iCloud photos, or on-device CSAM scanning?

[1] https://www.reuters.com/article/us-apple-fbi-icloud-exclusiv...


I have a preference for the on-device scanning + e2e Photos. The overall security seems higher, assuming Apple's claims about photos only being scanned during the upload process are true. (If I don't trust Apple to be telling the truth about that, it's not like I could trust them to not already be scanning all my photos, so...)

> Apple wanted to fix this and introduce end-to-end encryption on all photos uploaded to iCloud

Apple has not come out and said this.

This may be what people hope is the motivation, but I would think they would be immediately coming out to say this if it were the case, as it could be used as ammunition against some of the criticism they've been getting recently.


True.

But I'll point out their technical summary [1] explicitly talks about attaching decryption keys to positive matches, which you don't need to do if there is no end-to-end encryption.

[1] https://www.apple.com/child-safety/pdf/CSAM_Detection_Techni...


> 1. Photos (and other documents) are currently uploaded to iCloud un-encrypted

No, they are encrypted, just not e2e. Apple holds the encryption key, though.

> 2. These photos are already scanned for CSAM after upload

No, they are not.

> 3. Because the photos are not encrypted, at any point, any government can file a court order to release those photos.

Not because they are unencrypted, but because Apple can decrypt them on demand, since they hold the key(s).


> 1. Photos (and other documents) are currently uploaded to iCloud un-encrypted

Incorrect. Photos and documents are encrypted both in transit and at rest. They are not E2EE, however - i.e. Apple has decryption keys.

> 2. These photos are already scanned for CSAM after upload

Apple does no scanning in iCloud. All of your data (except email) is stored encrypted and it cannot be scanned. Apple escrows the decryption keys, which are provided only to comply with legal requests for user data.

> 3. Because the photos are not encrypted, at any point, any government can file a court order to release those photos.

The second part is true, but the first is not.

> 4. The court order can require Apply to not notify the user, or the public in general.

This is simply the law. If there is no non-disclosure order, it's Apple's policy to notify.

iCloud encryption docs: https://support.apple.com/en-us/HT202303

Legal process docs: https://www.apple.com/legal/privacy/law-enforcement-guidelin...


Encryption in transit and at rest really doesn't count as encryption, at least not in this context. Apple can decrypt the photos at any time to do things like scan them for CSAM. I assumed I wouldn't need to state the obvious.

And yes, Apple have been scanning uploaded photos since about 2019. At least, that's when they modified their terms and conditions to allow it.

https://www.macobserver.com/analysis/apple-scans-uploaded-co...


The author of that article is way off. First, his evidence is that an email attachment triggered a report, and we already know email is the only data stored unencrypted in iCloud. Email scanning is the most plausible explanation for the ToS update.

If Apple were scanning iCloud Photos, one would expect there to be hundreds of thousands if not millions of reports to NCMEC (last year Facebook reported 20 million). Reporting is compulsory, the information is public, and Apple reported 265 last year. Do the math. Apple is not scanning Photos.


Having worked for a large corporation, I frequently saw decisions that did not always make sense, with the justification being that the company was an innovator and market leader, and that they wanted to be like Apple and tell the customers what they wanted. Market research and such were not considered useful.

I am not sure whether to laugh or cry at that. "We want to tell customers what they want. And it turns out what they want is this half-assed thing that will advance the career of the executive pushing for it."

I have my issues with Apple. But they have such a deep tradition of user research and customer focus. When they tell customers what they want, it's often because they have really good data indicating it's what customers do want but don't totally know it yet.


> Yesterday we saw OnlyFans exit the adult industry. Two weeks ago we saw Apple exit the privacy industry.

One of those two statements I disagree with.

Apple exiting the privacy industry would look like this to me: "we've decided from now on that, like almost all other cloud providers, we'll give ourselves access to your stuff for (ahem) legitimate purposes".

Not like this: "we'll implement a neuralhash on the client device rather than on the servers, and then do a cryptographic private-set-intersection protocol with them on the server".

That's a lot of cost and effort to prevent themselves, as a company, misusing the CSAM detector for other purposes. If there are government agencies involved, Apple is also making sure that they can't just use this as a backdoor to get access to everyone's files, it's as if the government said "we need to prevent child abuse, give us a backdoor" and Apple went "ok we'll give you a small backdoor that's ok at detecting abuse images and nothing more" - if the government was expecting to use the backdoor for more than this, they'll be disappointed.

(I'm pretty sure they have other backdoors already, by the way. My guess would be a zero-day on the baseband processor firmware.)

I'm not saying I agree or disagree with Apple's latest move, but "exit the privacy industry" feels a bit a strong statement to me. You have less privacy than you did three weeks ago, and an option on even less privacy in the future (but then again Apple could just change the T&C), but you're still better off than with competitors that offer similar functionality.


In addition, you can have expectations of privacy from far more than state actors that can coerce multinationals into violating unambiguous statements about where they draw a line in the sand.

You can also not use iCloud, and have the same amount of privacy.

I think many at Apple really are surprised by this, and I put it down to a failure of elite consensus.

I think that a number of folks at high levels in the SV executive suite have accepted a manufactured consensus along with their peers in Washington, and that consensus is something like: "people don't care about the privacy of what's on their device and they'll put up with anything to stop CSAM, even if it means scanning personal backups and local files (as opposed to shared files.)"

This seems like a reasonable thing to believe, since server-side scanning of (mostly shared) files has been going on for years and nobody has pushed back very hard on it. But what I think the consensus missed is that the reason for this lack-of-pushback is that nobody in the wider world had really been asked to weigh in on it before. It was something that a few elite tech busybodies were aware of, and most people accepted the idea that providers needed to check out photos that lived (unencrypted) on their servers. Apple accepted this logic and extended it unthinkingly beyond shared photos to unshared private photo libraries on the user's personal device (even if they are staged for backup as part of the iCloud Photos synchronization service, which is just a policy choice.) This was a second mistake because it assumed that because users mostly ignored the scanning of shared server-hosted files, they had somehow given consent to having their private files searched on their device. I don't think they had.

Overall, this announcement is the first time anyone has attempted to have an actual public debate to see how real users feel about this kind of surveillance, particularly automated surveillance of private photos (and an automated system with potential flaws.) Apple's mistake here was to assume that their user base had already given consent -- when they'd just never been asked. It's a very human mistake to make, frankly. The question is whether Apple will listen to their users or if they'll double down and push this through against their users' pushback. I can forgive Apple for misunderstanding their users once, but continuing down this path will be a lot harder to understand.

ETA: To illustrate how much more pervasive Apple's surveillance is than the standard (ignoring the PSI protocols), consider this quote from an EU Parliament briefing: "Others, such as Dropbox, Google and Microsoft perform scans for illegal images, but 'only when someone shares them, not when they are uploaded'." (I can only trust that this is factually true.) In this sense, Apple's move to scan all photos in your library is a significant functional escalation. https://www.europarl.europa.eu/RegData/etudes/BRIE/2020/6593...


This is a bad take, given that questioning the “elite consensus” is most likely how Apple arrived at this solution. Their proposed system doesn’t take for granted a whole host of assumptions made by every incumbent CSAM detection framework.

I think generally saying "this is a bad take" isn't very helpful. You should elaborate on how in terms of practical functionality Apple's system is really that different than other systems out there.

You argued that somehow Apple blindly followed “elite consensus” to end up with the system they announced. I pointed out that the system they announced questions every aspect of “elite consensus” about how to scan for CSAM in a cloud service. Now you’re waving your hands and saying they’re practically the same when there is a 14-page security threat model on Apple’s website which lists the many ways in which it’s not. How many of the incumbent systems make their perceptual hash available or let clients inspect the data used for manual review in an associated data packet? Where does Dropbox publish its root hash for the known CSAM database? I could go on.

Next, arriving at “practically the same solution” is a necessary but not sufficient condition for having the same assumptions. Your evidence doesn’t even prove your point.

Finally, I explained why it’s a bad take so I’m not sure what you find unhelpful.


I'm a cryptographer, so I do understand the system fairly well. What Apple is doing is applying an automated hash-based image matching system to all photos in your library, and triggering an alert (plus a low-resolution photo) to a human being when 30 photos match. Except for the use of cryptography and the "30" threshold, this is functionally identical to what existing providers do on the server side. As an expert in this field I'm thrilled to see the use of fancy cryptography, but the use of a 2PC protocol does not change the underlying functionality at all.

What is different about the Apple system is that unlike many companies (as of 2019) [0] they are not simply scanning shared photos intended for distribution. They are scanning all photos in your library, even unshared ones. This to me is much more significant than the use of PSI or adding a threshold of 30 reports. You can agree or disagree but it's much more helpful to argue this based on the merits than to be rude about it.

[0] page 8: https://www.europarl.europa.eu/RegData/etudes/BRIE/2020/6593...


This comment has nothing to do with your original theory about elite consensus:

> arriving at “practically the same solution” is a necessary but not sufficient condition for having the same assumptions. Your evidence doesn’t even prove your point.


I think Apple might be held to a higher standard here because they advertise themselves as the privacy-conscious choice?

On Windows, it's kind of routine that everything from your keystrokes to your searches gets used "to improve our products", for all I know Win10 looks at the files on your disk too. They certainly don't offer Apple levels of privacy even for private folders on your Onedrive.

It's also par for the course in multiplayer online games that while the game is running, it monitors your PC for known cheating services - sometimes even when the game is not running, although that still causes a bit of a huff if the program doing this gets caught.


I would argue that this Apple system is not so much surveillance as a censorship mechanism.

They check for a finite set of "bad" things that no one is allowed to have. Because they went so far out of their way to avoid learning anything else about your photos, I think the argument is going to get very messy if we try to argue the surveillance angle. It gets very nuanced very quickly, and public opinion doesn't do nuance well.

It's a censorship tool. This argument is straightforward and easy. China can add Tank Man to the list of bad hashes, and now nobody is allowed to see him. The entire argument is now about what information should be censored, and who do we trust to maintain the badlist.

(Edit: Otherwise I agree with everything that matthewdgreen wrote above.)


I think Apple was primarily caught off guard by how unwilling even technically sophisticated critics have been in any sort of substantive discussion on what they announced.

I know I have been surprised at how few specifics are present in the criticisms that have been published. As an example, if one is asserting that governments can force Apple to do a thing, it seems like one should be able to articulate how, exactly, that would happen.

If it’s pressure under the rule of law (U.S. law enforcement, for example) I would expect organizations who employ lawyers to articulate the laws and precedents that would enable law enforcement to compel Apple to do something they don’t want to do.

And if it is pressure happening outside the rule of law (authoritarians targeting dissidents), I would expect activists to be able to articulate what levers those regimes have to pull against Apple.

I have not even seen basic before/after comparisons explained. Like, if Apple could resist altering their OS for the FBI in the San Bernardino case (and get applauded for it), why could they not resist altering their OS for the FBI in the future (and get mocked for suggesting they could)? How, legally, is a demand to develop a targeted approach to image scanning configuration different from a demand to develop a targeted approach to device encryption configuration?

And if authoritarian regimes can force Apple to search client devices for offending material, why haven’t they already done so?

There is a ton of content that basically starts with engineering topics like “it’s easy to add entries to a hash list” or “it’s possible to find collisions” and just go from there. But engineering is not the issue. What the FBI asked Apple to do in 2016 was not difficult. And it’s not difficult to scan client files on client devices for signatures. It’s not difficult to make software phone home to a server. A simpler version of what Apple announced (scan client files for signatures) would have been trivial for Apple to develop at any time since the iPhone launched. If someone could force Apple to do it, why didn’t they?

There may be solid answers to these questions but that is not the conversation I’m seeing. What I’m seeing seems primarily to be an emotional conversation that builds on people’s ignorance of the current state of technology and the law to power activist engagement.

I think Apple was expecting to announce something technical and legal and deal with technical and legal objections. But what they did was introduce fodder and fuel for what is essentially a set of intersecting social movements around the issues of privacy, censorship, and corporate power. And like many broad social movements, the details are maybe not as important as collective emotion and alignment.

Maybe that’s what you mean by elite consensus. Personally I would not consider myself elite, but I do find the lack of specifics frustrating.


You begin your comment by addressing the failings of “technically sophisticated critics” but then you go on to list a number of legal and policy questions that have very little to do with technology. As a computer scientist I can’t tell you with precision exactly what legal and extra-legal mechanisms could be used to compel Apple; what I can tell you is that Apple’s technology does not prevent that compulsion. I can only assume you meant “technical critics” in some broader sense that doesn’t include technologists.

Regarding extra-legal compulsion, the only thing I can point out is that there is no US law on the books that requires Apple to deploy this on-device scanning system. But Apple is deploying it anyway. If you speak to anyone at Apple off the record they’ll tell you bluntly that they’re doing this to satisfy pressure from law enforcement and regulators (along these lines [0].) So the existence of powerful extra-legal compulsion is not debatable. The question is how far it might extend? Only Apple can answer that question, not “technical critics.”

[0] https://www.google.com/amp/s/mobile.reuters.com/article/amp/...


Yes by technical I meant people with detailed domain expertise including the law, not computer technologists specifically.

Can you clarify the nature of what you call extra-legal compulsion? (Edit to clarify: what you hear from folks in Apple) Is it the pressure to help stop the transaction of CSAM, a pressure that all image- and video-handling networked software providers experience? Or is it to deploy a client-side scanning service specifically?


It seems that a large part of why OnlyFans is destroying itself is an attempt to comply with some incredibly onerous requirements that MasterCard added in April: https://www.xbiz.com/news/258606/heres-what-the-new-masterca...

Especially “all content must be reviewed for child porn before publishing, or in real time if streaming”.

It now seems not at all implausible that Apple’s half-baked attempt to scan everything for child porn is due to this too.


I think you’re misunderstanding the ability for an echo chamber corporation to fuck itself thoroughly on that one.

They have serious problems admitting they did something stupid historically. Butterfly keyboards, reliability issues, you’re holding it wrong etc.


> I find it exceedingly difficult to imagine that one of the most sophisticated companies in the world, with some of the brightest minds out there, did not consider and calculate this precisely; that there is any way any of this has come as a surprise to Apple

After having read a lot of internal Apple emails between executives[0], I find it extremely easy to imagine that they were completely dumbfounded that the rest of the world did not see things the same way they did.

[0] https://twitter.com/TechEmails


I’ll note reports that Apple significantly increased iPhone manufacturing volume for the new phones coming out this fall. It continues to be interesting to see what happens next.

There is an easy way for Apple to handle this. During iPhone setup present the following screen:

"We here at Apple do not want our servers to host images of exploited children, but we also respect your privacy. So you're free to use your phone with the photos stored locally, but if you'd like to enable iCloud we need you to press the button below to *install* our CSAM detector."

Then the argument basically goes away. It's like a virus scanner that you voluntarily installed. But having it baked into the phone as it ships rubs me the wrong way.


In my understanding, iCloud is not E2E encrypted. They hold the keys. They can scan content on ingress if they want.

But they don't want to. They want to keep the content encrypted on their infrastructure to reduce the risk of leaking it. They only decrypt it with a court order.

That makes sense but isn't introducing this capability to the device even worse? With this capability in place, the only thing preventing them from abusing it is policy.

In my suggested world, you are free not to install it. If you distrust them enough to think that they'll change their policy to be more invasive, don't install it.

Personally, I'd rather have the detection happen on my phone than in the data centre. And I don't mind opting in to have my photos analyzed before upload. But the thing I don't like about Apple's CSAM thing is that it comes with my phone. And you know someone somewhere is going to accidentally enable iCloud and then claim they never did. Or maybe they didn't? Bugs do happen after all.


Makes sense, thanks for clarifying.

That implies that the end goal, or simply the next step, is not to scan all photos on the device whether or not you use iCloud.

It is not a goal.

And it was not a goal that the unsecured boulder at the top of the hill should come tumbling down over the road and into the house; it just so happens that when an unsecured boulder lies at the top of a large hill where teenagers gather, it doesn't stay there for long.

Same goes for liberties and certain politicians.

For this reason we secure boulders, teach teens about consequences, keep invasive tech out of the reach of wannabe authoritarian politicians, and teach history in school.


> It's like a virus scanner that you voluntarily installed

Not really. Virus scanners don't snitch on users to governments.


That's what they want you to think.

I'm joking, but only halfway.


The problem is that once you add this functionality, the cat's out of the bag, and replies to requests to further invade users' privacy on-device change from "we can't" to "we won't", a position Apple can't possibly hope to maintain, especially with the likes of China or the US government.

With that move I don't see how Apple will be able to refuse when a government asks it to scan for images of Winnie the Pooh, for example. They say they won't, but they are too reliant on Foxconn's Chinese plants for manufacturing; they could easily be blackmailed into compliance. (Android devices with a Chinese OS probably already report a wide array of stuff to the government.)

This is another step towards total global surveillance of citizens. I don't see what can be done about it, technology makes it possible so it will happen, it is just too juicy for governments, they can't resist it.


You have it the wrong way around; Apple is Foxconn's cash cow. Apple is probably the only company in the world that has the resources to make its devices anywhere in the Western world; it's just not as profitable or convenient as their current suppliers.

Well, they seem to be so reliant on cheap Chinese manufacturing that they're willing to be China's complete bitch.

https://www.nytimes.com/2021/05/17/technology/apple-china-ce...


How does this feature relate to scanning for Winnie the Pooh?

They already scan and classify ALL your photos locally, not just the ones destined for iCloud; that's a widely touted feature.


That is a lie - they only scan in your iCloud Photo Library. An external drive? Messages? Not scanned (except for the generic nudity detection system for child accounts that doesn't contact Apple about anything, a separate piece from the CSAM scanner).

I'm pretty sure that image classification (and now, OCR) works locally regardless of whether you have iCloud enabled.

But that's not the point; the point is that they have known for a while whether you have photos of Winnie the Pooh, trees, lakes, etc. That's why you can search your photo library.

If they wanted to report that to Xi Jinping, they could just do that. It's unrelated to the CSAM measure.


And China could have literally ordered this tool to be built at any time in the last decade. It's not like China can't just order tools like this to be built anyway.

Apple didn't give them a new tool that China will abuse. China could just order the tool to be built and abuse it if they desired.


Yeah, but most of all this is an incredibly convoluted and inefficient way to spy for the government, and much easier for outsiders to detect.

China already owns and runs the servers used by iCloud in China, this changes nothing for them.

Because Xi Jinping looks like Winnie the Pooh and there's a controversial China meme surrounding this fact.

I’m aware, but they can just search for that directly, no need for CSAM.

Apple says their system is designed to resist this kind of abuse by only matching images that appear in two or more countries' CSAM lists. So if one country sneaks in perfectly licit content, it won't be scanned for.

Now, you might say that China would force Apple to scan for things anyway - and they might. But at the same time, China could have ordered this at any moment in the last decade. It's not as if the Chinese government couldn't have forced this tool to be made regardless.
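A rough illustration of that intersection requirement, assuming hash lists are supplied per jurisdiction (the hash values below are invented placeholders):

    from collections import Counter

    # Toy version of the safeguard: only hashes present in at least two
    # jurisdictions' lists become eligible for on-device matching.
    lists_by_country = {
        "A": {"h1", "h2", "h3"},
        "B": {"h2", "h3", "h4"},
        "C": {"h3", "h5"},
    }

    counts = Counter(h for hashes in lists_by_country.values() for h in hashes)
    deployable = {h for h, n in counts.items() if n >= 2}

    print(deployable)  # {'h2', 'h3'} -- 'h5', supplied by one country, is excluded

Of course, that safeguard only holds as long as two governments don't coordinate on what to submit, which is exactly the worry raised above.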


The Chinese government already has full access to all the data of Apple's China user base.

iCloud in China is already run by a state-owned company, so at least in the case of China nothing will change.

https://money.cnn.com/2018/01/10/technology/apple-china-iclo...


Apple wouldn't be able to tell if it's Whinnie or a polical organizer, because they'll only be provided with hashes of the original images. Any government participating would be able to include any image it wants without Apple's immediate knowledge.

Why couldn't they do that before? If Chinese government officials said "add content scanning or leave the country", there would be no difference. The firewall was always fictive.

Agreed. I think the system would be superior if it sent the "naughty" hash list to the client, so that the user has some ability to recognize if the list suddenly grows 10x in one country, and maybe even identify which material is on it.
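As a sketch of how that kind of client-side auditing might look (everything here, from the update sizes to the growth factor, is hypothetical):

    # If clients saw the list (or at least its size and digest) with every
    # update, a sudden jump in one region's list would at least be detectable.
    def grew_suspiciously(sizes, factor=3.0):
        """Return True if the latest list size grew unusually fast."""
        if len(sizes) < 2 or sizes[-2] == 0:
            return False
        return sizes[-1] / sizes[-2] >= factor

    sizes_seen = [200_000, 205_000, 2_100_000]  # hypothetical sizes per update
    print(grew_suspiciously(sizes_seen))        # True -- roughly a 10x jump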

The overarching trend I see recently is that your rights, as an individual citizen, simply do not matter. They will be brushed away by any agenda that has a semblance of globalist right-think.

Your right to privacy, in the face of government agencies executing their mission, does not matter.

Your right to free movement, in the face of a flu that overburdens the hospital system, does not matter.

Your right to free speech, in the face of the need to eliminate outsider politicians, does not matter.

Your right to election security, in the face of the establishment getting their preferred candidate, does not matter.

Your right to raise your children with traditional values, in the face of social engineering guidelines, does not matter.

Your right to bodily autonomy, in the face of globally coordinated medical interventionism, does not matter.

Your right to closed borders, in the face of foreign policy expediency, does not matter.

Your right to eat what you want, in the face of “climate change” activism, will not matter.

Your preferences are simply not safeguarded by your rights, which can be overruled by the whim of “experts.” If you want to imagine how any particular future scenario unfolds, just ask yourself whether your rights would be an inconvenience to the plans of, say, Bill Gates. As a sort of stand-in for the general careerpol/NGO/billionaire/Harvard class running things.

(and most of the media and many of the public intelligentsia merrily support this)


The world is full of trade-offs, no rights are absolute.

This argument loses much of its credence by being full of references to enthusiastic ignorance. Covid was a massive IQ test, which many failed by following con men who told them to take the easy path rather than the path of self-responsibility. Rather than insisting that doubling down on bad decisions is somehow defending your "rights", you need to come to terms with how you were led so far astray in the first place.

Furthermore, "right to closed borders"? You previously invoked "free movement", yet there is also a right to closed borders? Mmm-hmph. It really appears you've just cloaked the same tired red-team talking points in the language of freedom. Please, as a libertarian, stop trying to use freedom to justify what is a highly authoritarian movement. You're doing freedom no favors.


Yes, this is generally what being a citizen means: you can move about freely within your nation’s territory, and the territory is protected. Lack of territorial integrity is contrary to the principle of citizenship. Otherwise you just become something else - subjects of the emperor perhaps.

> Furthermore, "right to closed borders" ? You previously invoked "free movement" yet there is also a right to closed borders?

These are not in conflict. Freedom of movement WITHIN YOUR COUNTRY is not the same thing as open borders, which will dilute and destroy a culture.


> Covid was a massive IQ test

Compliance with the COVID response is more an indicator of trust in public institutions than it is an IQ test. You could argue that noticing the untrustworthy behaviors and picking up on that trend is an indication of learning quickly...


Mask wearing in particular requires no trust in public institutions, just basic reasoning about airborne particles. If that's too technical for some, it has also been shown in every single movie about pandemics.

The sheer number of people still rejecting this straightforward mitigation indicates that their bad decisions have little to do with distrusting institutions. Rather the problem is their trusting malevolent leaders who have been misleading them.

As for "closed borders", I've never seen this referenced as a natural "right". Yet here it is dressed up as one, even though it is ultimately a collectivist action rather than an individual ability.

You can of course still argue the benefits of closed borders, and five years ago I would have been sympathetic (the enthusiastic ignorance movement has since burned my assumption of their good faith). It's just specious to call it a "right".


Masks aren't very useful at stopping viral spread and have traditionally been limited to mitigating bacterial shedding in environments where that was key, such as surgery.

Since time immemorial, a fundamental right has been warding off invaders of varying forms, armed or not. Exclusion of uninvited parties from a piece of land is one of the oldest things in Western civ.


This is gish gallop.

Mixing together a bunch of weak arguments does not create a strong argument.


>This is gish gallop.

I thought this was some autocorrect perversion of codswallop, but it's a thing:

https://en.wikipedia.org/wiki/Gish_gallop

TIL


> trend

I'll just wait for the Chinese government to ask Apple to extend CSAM scanning to Tank Man pics.

Apple swears they'll say no...very reassuring and no way this gets abused. /s

Oh, enough already. If you are uploading 30 or more images that would be illegal in your country to iCloud, you're not using your head.

The problem is that the definition of what is "illegal" can shift very quickly, esp. in certain countries, so moving the ability to scan for "illegal" stuff to user's devices is absolutely disastrous privacy/freedom-wise.

And MY point is that if you're really concerned about that, you never should have been using iCloud sync in the first place. You were already exposed to that risk.

But the CSAM technology won't be limited to photos uploaded to iCloud. That's where it starts, not where it ends. This technology exposes all users of iOS, past and present, to the whims of the current definition of "legal".

So your argument is that I should be concerned because they may change the feature to do something different from what it does now. But you could always say that. They could make it do anything in the future.

True, at some point you have to trust someone, whether it's your phone's manufacturer, your telco, or the developers of the apps you use. But when there's flagrant disregard for users and for the potential impact a system like CSAM scanning could have on them, to me that crosses a line and means the company is no longer trustworthy:

> If a company actively screws its users in broad daylight, then what's going on behind closed doors?

At least previously Apple had the veneer of a privacy- and user-centric company. No more if this goes through.


You seem to be concerned about actual CSAM "users" getting caught. What is being argued is that this creates a scenario where authoritarian governments can slip embarrassing photos into the list and find where they originated, find memes, etc.

I understand those arguments but find them unconvincing. These authoritarian governments could also just insist on seeing the cloud data and do not need to bother with such a roundabout way of getting it.

Who's to say drawn images won't get flagged? I mean, a particular genre that exists, but in all essence... it's not real and not illegal. Creepy, weird, and disturbing, but not nearly as bad as the former.

If such drawn images are not legal where you live, I would suggest that if you must look at them you should not sync them to iCloud.

You (and Apple) have no way of knowing if the photos that will flag you are actually illegal content. The repository is a black box.

Once the photos are flagged they're subject to manual review.

I always used iPhones because I felt like Apple tries a bit harder to protect my data than Google does with Android.

But for the future I probably need Android phones with custom ROMs. Without spyware.


Yeah, I've alerted family to this and they're asking if they're going to need to move off of iOS. I told them to wait and see for now...fingers crossed that this gets walked back.

Look into CalyxOS. I am also about to jump ship.

Thanks for the tip. I'm still going to wait a bit; let's see how it is going to turn out. For now Apple's plans are not that bad, but that can quickly change once Pandora's box is opened.

I've turned off auto-updates and I'm reconsidering my decade-long allegiance to this brand. I love Apple, I love the look and feel, but I could contribute to Linux to make it better. This is the final straw.

Me too. I have ~20 members of my immediate and extended family who are iOS users. Since 2009, I've evangelized Apple devices to them, but if Apple pushes this through, I'm done. I'll find either a de-Googled Android or a feature phone to use and start encouraging them to move off of iOS.

Remember when the Mac became cool 15-20 or so years ago?

I'm at least old enough to remember it and I have a very strong feeling that the same is happening these days with Linux on the desktop.

Also, on phones the alternatives are shaping up nicely, so do speak well of them; don't say people should switch now, but say you are considering a switch next time you upgrade, etc.


Dang, I hope Linux is getting cool :D Recent releases of things like Gnome and ElementaryOS are pretty close to mainstream-ready IMO, so it will be interesting to see how things play out over the next few years.

> Also, on phones the alternatives are shaping up nicely, so do speak well of them; don't say people should switch now, but say you are considering a switch next time you upgrade, etc.

This is excellent advice, thank you!


The letter this article is about was discussed several times already:

Policy groups ask Apple to drop plans to inspect messages, scan for abuse images - https://news.ycombinator.com/item?id=28230248 - 284 points, 1 day ago, 190 comments

Policy Groups Urge Apple to Abandon Building Surveillance Capabilities - https://news.ycombinator.com/item?id=28232068 - 92 points, 1 day ago, 25 comments

Policy groups ask Apple to drop plans to inspect iMessages scan for abuse images - https://news.ycombinator.com/item?id=28231094 - 33 points, 1 day ago, 3 comments

