
Here's one: identifying child soldiers.

[i] https://www.pyimagesearch.com/2020/05/11/an-ethical-applicat...




Perhaps a corollary to this: instead of trying to match children's faces, the surveillance state can mine other parts of the image for open-source intelligence. Language and sound cues in a video. Or, my personal favorite, detecting the age and manufacturing origin of floor tiles and wall trim from a photo.

We do: Automatic detection of child pornography using color visual words - http://ieeexplore.ieee.org/xpl/login.jsp?tp=&arnumber=60...

Automatic detection of child pornography - http://ro.ecu.edu.au/adf/60/


Taking your example of image analysis, companies in this space will build applications that, for instance, identify which of the 10,000 images on a suspect's computer are all of the same kid. The software company would use sample images of clothed kids to demonstrate how it works; then only law enforcement would input confiscated CP images to do the analysis.
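A minimal sketch of how that "same kid" grouping might work, assuming the open-source face_recognition library and scikit-learn; the input directory and clustering parameters are illustrative, not from the comment:

    # Group photos by face identity: embed every face, then cluster.
    import glob
    import face_recognition
    import numpy as np
    from sklearn.cluster import DBSCAN

    encodings, paths = [], []
    for path in glob.glob("evidence/*.jpg"):  # hypothetical input directory
        image = face_recognition.load_image_file(path)
        for enc in face_recognition.face_encodings(image):
            encodings.append(enc)
            paths.append(path)

    if encodings:
        # DBSCAN needs no preset cluster count; eps=0.5 is a commonly
        # used threshold for these 128-d face embeddings.
        labels = DBSCAN(eps=0.5, min_samples=2).fit_predict(np.array(encodings))
        for label, path in zip(labels, paths):
            if label != -1:  # -1 = face that joined no cluster
                print(f"person {label}: {path}")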

Are there any projects working to automate the identification of offensive material with machine vision or machine learning? Replacing these people's jobs with computers might actually be a humane act.

They also use image recognition to detect nudity and previously classified child porn.

I like this type of application for this technology. A good ethical tightrope, and a rabbit hole: what if "the unknown soldier" could be identified, writ large?

Personally, I would love to have some folks in Ye Olde Family Photographs identified.


> Example: How does a researcher test whether algorithmically-classified illegal imagery stored on user devices is being scanned and reported home to Apple's servers, and what those bounds of AI-classified criminality are? (presumably with respect to what is illegal in the user's jurisdiction)

I'm not an expert in AI, so this might be totally off base, but I feel like you could use an "intersection" of sorts for this type of detection. You detect children and pornography: the children portion trains it for age recognition, and the porn portion trains it to see sexual acts. Slap those two together and you've got CSAM detection.
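A minimal sketch of that intersection, where age_model and nsfw_model are entirely hypothetical stand-ins for the two trained detectors (nothing here is a real API):

    # Flag an image only when both hypothetical detectors fire.
    def looks_like_csam(image, age_model, nsfw_model,
                        age_threshold=18, nsfw_threshold=0.9):
        ages = age_model.predict_ages(image)    # assumed: est. age per detected person
        nsfw_score = nsfw_model.predict(image)  # assumed: P(sexual content)
        has_minor = any(age < age_threshold for age in ages)
        return has_minor and nsfw_score >= nsfw_threshold

Requiring both signals at once also keeps the false-positive rate down: an image has to fool both detectors simultaneously before it is flagged.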


I think there's sort of a sub-field within image recognition which just focuses on the techniques to automatically detect nudity.

Can you imagine having something on your resume like: "Awarded patent for the first algorithm to reliably detect a penis."


That's a good point I hadn't heard before. Perhaps it would be relatively straightforward to build tools that detect this kind of thing? E.g., as per the article, an image/video tagged as being in 'Syria' and later tagged as being in 'Gaza' could be flagged up for human review.
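A rough sketch of that flagging tool, using the real imagehash and Pillow libraries to recognize a re-uploaded image; the tag records and the distance threshold are invented for illustration:

    # Flag images that reappear under a conflicting location tag.
    import imagehash
    from PIL import Image

    seen = {}  # perceptual hash -> first location tag observed

    def check(path, location_tag, max_distance=4):
        h = imagehash.phash(Image.open(path))
        for known_hash, known_tag in seen.items():
            # Small Hamming distance = perceptually "the same" image.
            if h - known_hash <= max_distance and known_tag != location_tag:
                return f"REVIEW: {path} tagged '{location_tag}', seen earlier as '{known_tag}'"
        seen[h] = location_tag
        return None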

Next, something that scans for pictures of cops, to detect people taking pictures of law enforcement.

It only filters images already there. If child porn isn't already among the chaff, the only thing it can find, by definition, would be false positives. Which could be an interesting vetting exercise in itself: feed in tons of cat pictures until it dings one as inappropriate.
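A sketch of that vetting exercise, with classifier.is_flagged standing in for whatever interface the filter actually exposes (both the interface and the directory are assumptions):

    # Estimate the false-positive rate on a known-benign corpus.
    import glob

    def false_positive_rate(classifier, benign_dir="cats/"):  # hypothetical path
        paths = glob.glob(benign_dir + "*.jpg")
        flagged = [p for p in paths if classifier.is_flagged(p)]
        for p in flagged:
            print("false positive:", p)
        return len(flagged) / len(paths) if paths else 0.0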

Technically you could always try cycling through image combinations until one matches the hash or looks like child pornography, and nothing could technically prevent that, except that it would take a very, very long time.
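A back-of-envelope calculation makes "very, very long" concrete; the 64-bit hash width and the attempt rate below are assumptions, not figures from the comment:

    # Expected cost of brute-forcing a second preimage for a 64-bit hash.
    tries = 2 ** 63                          # expected attempts for one match
    rate = 1_000_000                         # assumed hashes tested per second
    seconds_per_year = 60 * 60 * 24 * 365
    print(tries / rate / seconds_per_year)   # ~292,000 years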

Even given adversarial networks, I would be very surprised to get anything more than vague figures, and I don't even expect the result to look human. In which case it is really generating rather than finding.


I'm actually aware of systems which were built to identify porn through neural-network training. I don't know whether these were sensitive enough to distinguish child porn from other material, but as a method of finding images likely to contain more skin than desired, they worked fairly reliably.

That said, the Google/Microsoft tool does seem to work based on a known image corpus.
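The comment describes neural networks, but the "more skin than desired" scoring it mentions is also captured by a much simpler pre-deep-learning color heuristic; a minimal sketch using Pillow, where the thresholds are a widely cited textbook RGB skin rule rather than anything from those systems:

    # Score an image by the fraction of pixels in a rough skin-tone range.
    from PIL import Image

    def skin_ratio(path):
        img = Image.open(path).convert("RGB")
        pixels = list(img.getdata())
        def is_skin(r, g, b):
            # A common published RGB skin rule; crude but fast.
            return (r > 95 and g > 40 and b > 20 and
                    max(r, g, b) - min(r, g, b) > 15 and
                    abs(r - g) > 15 and r > g and r > b)
        skin = sum(1 for r, g, b in pixels if is_skin(r, g, b))
        return skin / len(pixels)

    # e.g. flag for review when more skin than desired:
    # if skin_ratio("photo.jpg") > 0.4: ...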


I'd be more interested in whether you could take an image of, say, an underage dog, and tweak a vaguely similar image of a grown dog so that it was a match. That's what will get innocent people thrown in jail.

I really hope they're also using this technology in an actually useful way, e.g. attached to a web spider, searching for underage porn pictures.

These could be a handy tool for law enforcement: the image classification model could be trained to spot criminals. I'm pretty sure there are mugshots in, say, a sex offender database; the police would just need to take those binoculars to a park… it would be like shooting fish in a barrel!

Just a random thought: you could ask the FBI to borrow the content they seize to train your computer vision systems, hahaha. How can you train something to know what CP is without it? Look for this blob of skin, smaller than this blob of skin, where these smaller blobs touch.
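Taken half-seriously, the blob idea maps onto connected-component analysis; a sketch assuming SciPy and a boolean skin mask like the one derivable from the heuristic sketched above:

    # Label connected skin regions and return their sizes, largest first.
    import numpy as np
    from scipy import ndimage

    def skin_blob_sizes(skin_mask: np.ndarray):
        labeled, n = ndimage.label(skin_mask)       # connected components
        sizes = ndimage.sum(skin_mask, labeled, range(1, n + 1))
        return sorted(sizes, reverse=True)          # compare blob sizes from here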

Which is a completely different, and rather trivial, problem compared to "automatically detect the age of people in a video and/or if they are doing something sexual".

The former is a mere process of comparison, while the latter requires quite a few levels of subjective abstraction.
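The "mere process of comparison" really is short to write down: matching against a known corpus reduces to a Hamming-distance check between hashes. A sketch, where the stored hash values are placeholders:

    # Match an image's perceptual hash against a known corpus.
    def hamming(a: int, b: int) -> int:
        return bin(a ^ b).count("1")

    KNOWN_HASHES = {0x9F3A6C0812B4D7E1}  # placeholder 64-bit hashes

    def matches_known(image_hash: int, max_distance: int = 4) -> bool:
        return any(hamming(image_hash, h) <= max_distance for h in KNOWN_HASHES)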


That's a really great idea!

The other thing I was thinking about was to include the images in an app for young children, where they have to say which items they see on the screen. But I think that only works for images that are already verified with a high probability (so that we can make sure they are only seeing appropriate content).


I would love to see what would come out of a network trained to recognize pornographic images using this technique. :)
