I wonder how such a feature could even distinguish legitimate use. What if I want to take pictures of a strange vaginal discharge to send to my child's pediatrician? Am I now on a list? What if I want family photos of my happy naked baby playing?
What feature are you talking about? The iCloud photo scan only matches against known images; it doesn't analyze arbitrary content. The iMessage feature only lets parents see images that their children are sharing.
Really? Can you point me to a better description than the one in the article?
>The planned features include scanning users' iCloud Photos libraries for Child Sexual Abuse Material (CSAM), Communication Safety to warn children and their parents when receiving or sending sexually explicit photos
This description seems to cover scanning original content. Database comparison appears to be just one tool of several, and by itself insufficient for the core goal of stopping original content (OC).
Edit: after looking around, the CSAM hash comparison is one of two tools. The other uses on-device machine learning to analyze Messages image attachments and determine whether a photo is sexually explicit.
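To make the "known photos only" point concrete, here's a minimal sketch of the hash-compare idea. This is not Apple's implementation: the real system uses a perceptual hash (NeuralHash) plus private set intersection and threshold secret sharing so the matching happens blindly; sha256 stands in here just to keep the sketch self-contained and runnable.

```python
import hashlib
from typing import Iterable, Set

# Stand-in hash. Apple's NeuralHash is perceptual, so visually similar
# images map to the same value; sha256 is used here only for illustration.
def image_hash(image_bytes: bytes) -> str:
    return hashlib.sha256(image_bytes).hexdigest()

# Apple described reporting an account only past a match threshold (~30).
MATCH_THRESHOLD = 30

def count_known_matches(photos: Iterable[bytes], known_hashes: Set[str]) -> int:
    """Count photos whose hash appears in the known-CSAM hash database."""
    return sum(1 for p in photos if image_hash(p) in known_hashes)

def should_flag(photos: Iterable[bytes], known_hashes: Set[str]) -> bool:
    # A novel photo (e.g. your own family picture) hashes to a value
    # that is not in the database, so it cannot produce a match.
    return count_known_matches(photos, known_hashes) >= MATCH_THRESHOLD
```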
Yup, I think you found it. The on-device one only applies to children under 13, and only reports to parents. And, IIRC, it lets the child know that it'll be reported to their parents before the child views the image.
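A rough sketch of that decision flow as described above. The type and field names are illustrative, not Apple's API, and the age/notification conditions are taken from the thread's description rather than official documentation:

```python
from dataclasses import dataclass

@dataclass
class ChildAccount:
    age: int
    parent_notifications_on: bool  # configured by the parent in Family Sharing

def handle_explicit_image(child: ChildAccount) -> list[str]:
    """Steps taken when the on-device classifier flags an attachment."""
    steps = ["blur the image", "warn the child it may be sensitive"]
    if child.age < 13 and child.parent_notifications_on:
        # The child is told up front that viewing will notify their parents.
        steps.append("warn that parents will be notified if the image is viewed")
        steps.append("notify parents if the child chooses to view it")
    return steps

print(handle_explicit_image(ChildAccount(age=10, parent_notifications_on=True)))
```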