
I wonder how such a feature could even distinguish legitimate use. What if I want to take pictures of a strange vaginal discharge to send to my child's pediatrician? Am I now on a list? What if I want family photos of my happy naked baby playing?



What feature are you talking about? The iCloud photo scanning only matches against known images; it doesn't detect arbitrary content. The iMessage feature only lets parents see images that their children are sharing.

Neither feature would flag anything you described.


Really? Can you point me to a better description than in the article?

>The planned features include scanning users' iCloud Photos libraries for Child Sexual Abuse Material (CSAM), Communication Safety to warn children and their parents when receiving or sending sexually explicit photos

This description seems to cover scanning original content. Database comparison seems to be just one tool of many, and on its own insufficient for the stated goal of stopping original content.

https://www.google.com/amp/s/www.nytimes.com/2021/09/03/busi...

Edit: after looking around, the CSAM hash comparison is one of two tools. The other uses on-device machine learning to analyze Messages image attachments and determine whether a photo is sexually explicit.

https://www.apple.com/child-safety/
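
For what it's worth, the difference between the two tools is roughly this (a minimal sketch, not Apple's actual NeuralHash or Communication Safety implementation; every name and value here is made up for illustration):

    # Illustrative sketch only -- not Apple's code. All names are assumptions.
    from typing import Callable, Set

    def match_known_csam(photo_hash: str, known_hashes: Set[str]) -> bool:
        """Tool 1 (iCloud Photos): compare a perceptual hash against a database
        of already-known images. A brand-new personal photo has no database
        entry, so it cannot match, regardless of what it depicts."""
        return photo_hash in known_hashes

    def classify_explicit(image_bytes: bytes,
                          model: Callable[[bytes], float],
                          threshold: float = 0.9) -> bool:
        """Tool 2 (Messages): run an on-device classifier over the image
        content itself; no external database is involved."""
        return model(image_bytes) >= threshold

    if __name__ == "__main__":
        known = {"a1b2c3"}  # stand-in database of hashes of known images
        print(match_known_csam("ffee00", known))         # False: a novel photo never matches
        print(classify_explicit(b"...", lambda _: 0.1))  # False: classifier score below threshold

The point of the contrast: the first tool can only recognize images that already exist in the database, while the second one scores arbitrary image content on the device.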


Yup, I think you found it. The on-device one only applies to children under 13, and only reports to their parents. And, IIRC, it lets the child know that it'll be reported to the parents before the child views the image.

https://www.macrumors.com/2021/08/05/apple-communication-saf...
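
The flow described there looks roughly like this (a sketch based on Apple's public description; the names and the opt-in flag are my own assumptions, not code from iOS):

    # Rough sketch of the Communication Safety flow as publicly described.
    from dataclasses import dataclass

    @dataclass
    class Child:
        age: int
        parent_notifications_opted_in: bool

    def handle_flagged_incoming_image(child: Child, child_chooses_to_view: bool) -> str:
        # The image arrives blurred and the child is warned before viewing.
        if not child_chooses_to_view:
            return "image stays blurred; nobody is notified"
        # Parental notification only applies to children under 13, and the child
        # is told beforehand that viewing will notify their parents.
        if child.age < 13 and child.parent_notifications_opted_in:
            return "child views image; parents are notified"
        return "child views image; no notification is sent"

    print(handle_flagged_incoming_image(Child(age=12, parent_notifications_opted_in=True), True))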


This comment is proof that the discourse surrounding this is so badly muddled that Apple couldn't possibly hope to fight it. ;P
