By this line of thinking… if a computer service tech finds and shares compromising pictures of you or your loved ones with their friends, it's all good because nothing bad happens to the customer.
Then the myriad other tech companies that scan the images uploaded to their platforms will be alerted to supposedly illicit images being uploaded. This isn't new. No one's life has been ruined by this, and it's been the practice for years now.
I think the headline is kinda missing the point here; the issue is not the flagging of illegal images once found, the issue is snooping around people's computers that were left for repair, deliberately looking for illegal images, and the violation of trust there.
My computer doesn't have anything illegal on it, but that doesn't mean I want a Geek Squad employee looking around at what is there. And if they know there's a payday from the FBI if they find something, you bet they are going to look around. Then what happens when they find something that isn't illegal but that they like: pictures of my wife, or friends and family at the beach, or something?
There was a case of an old man who had pictures of his grandchildren on his computer. After a long and confusing investigation, trial and so forth, they finally showed the photos to a wildly confused judge.
It turns out that a handful of the photos were of the kids taking a bath or running through the sprinkler. Charges were dropped, but still.
Suppose you were a pediatrics student with various photos of children on your computer, or maybe you're a gymnastics teacher with a load of photos of your students.
If you have an overreactive computer person browsing all your photos, it may not matter that you have nothing to hide.
I'm sure people who get caught in these sorts of messes abandon Geek Squad and Best Buy.
What you’re missing is that now they have an incentive to go looking for imagery when they should simply be trying to fix your computer. And maybe even an incentive to plant evidence.
Moreover, I don’t see any reason Geek Squad would need to be in my images folder in the course of working on my computer, and I’m sure that, child porn or not, most people probably wouldn’t want their private collection of pictures being sifted through without permission.
It's not just your data though. They probably have your photos, and they share those with people like the NSA, who show your nude photos around the office.
That's way worse IMO, but doesn't seem to elicit the same response as this guy's art.
> isn't much different from the photo processors of old finding it
You hand your photos over to be processed, but unless you had some photo-specific issue, a computer tech has no business accessing photographs on your computer. It is an unexpected invasion of your privacy for them to do so, even though they have the access to do so.
It's like hiring someone to fix your kitchen sink and then stepping out for a bit and finding them going through your bedroom drawers.
More importantly, you can detect this and filter those images out pretty easily, which means you have implicit consent from the users of all the other images, because they chose not to protect their images technically.
I do it also, because I don't trust them to not save the images, and I suspect the images will either get hacked by or flat-out sold to a third party later. The pat-down doesn't make me feel like I'm being raped, although I guess I can see why others would be uncomfortable.
I'm sorry. The chance of someone needing to look at these pictures should be near zero. The backup company doesn't need to look at them. They should actually be end-to-end encrypted so that only grandma and I can see them.
In exceptional cases the police may need to see them. But even then there is nothing wrong with these pictures. Pictures of gore and literal shit are legal, and while I agree that looking at them can make this job very taxing, a child swimming naked is not the problem here.
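To make the end-to-end point concrete, here's a minimal sketch of client-side encryption before backup, assuming a Python client and the cryptography package's Fernet; the photo bytes and the upload step are placeholders, not any real backup service's API.

    # Minimal sketch: encrypt on the client, so the backup provider only stores ciphertext.
    # Assumptions: Python, the "cryptography" package; the upload call is hypothetical.
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()      # stays with the family, never with the provider
    fernet = Fernet(key)

    photo_bytes = b"..."             # raw photo bytes, read from disk in a real client
    ciphertext = fernet.encrypt(photo_bytes)

    # backup_upload("grandkids_sprinkler.jpg.enc", ciphertext)   # hypothetical upload call

    # Grandma, who got the key out of band, decrypts locally:
    original = fernet.decrypt(ciphertext)

The provider only ever stores opaque bytes, so neither a technician nor a scanner on their side has anything to look at.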
That's the point. None of the 90% of pictures that are normal will do any psychological damage; the 10% that are illegal will, but those will have to be looked at anyway.
The advantage of an algo detecting illegal content isn't in protecting civil servants, it's in being able to scan everyone without them.
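For what it's worth, "scanning everyone" in practice usually means matching uploads against a hash list rather than a person (or even a classifier) looking at each photo. A toy sketch, assuming Python; real systems like PhotoDNA use perceptual hashes that survive resizing and re-encoding, whereas the plain SHA-256 here only catches byte-identical copies:

    import hashlib

    # The hash list would come from a clearinghouse; this placeholder never matches anything.
    known_bad_hashes = {"0" * 64}

    def flag_if_known(image_bytes: bytes) -> bool:
        digest = hashlib.sha256(image_bytes).hexdigest()
        return digest in known_bad_hashes   # only a match triggers any human review

    print(flag_if_known(b"family beach photo"))   # False: unknown images never match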
On the one hand, "If you don't want naked pictures of yourself in the cloud, don't take naked pictures of yourself and put them in the cloud."
But this is like saying "If you don't want to get scammed, then don't respond to scammy emails." That is, it's perfectly good advice, which is fine for people who visit Hacker News, but maybe not sufficient for the vast majority of people who aren't aware of the ins and outs of our rapidly advancing technology.
There are whole communities of people devoted to the practice of finding women who accidentally configured their phones to upload all pictures to a publicly accessible cloud storage server. The women whose nudes are distributed this way may not realize their pics are being mirrored, or they may assume it's going to a private site (because why the hell isn't that the default?!), or they may have shared these pics with a dude who made the same mistakes.
But regardless, the point remains: any individual can easily make themselves immune to this problem. But there's a whole population of vulnerable victims who don't even know they're being victimized. And that is a real problem.
It's not as though it's impossible to make a secure image. Mistakes can and do happen, though. Better to own up to it and fix it than to hide the problem.
Pictures don't hurt anyone (except sometimes in the libel/slander sense).
But don't stolen credit cards get traded online? Or other impersonation / "identity theft" information? Maybe a marketplace for one of those would be a good example.
The trick is probably making sure that "something that we all agree is wrong" matches with things that actually harm people who didn't freely agree to it. Things like murder, coercion, kidnapping, breach of contract, ...
Considering how frequently my innocent images (petting my cat, photographing a pepper growing in my garden, pointing to a word printed on a page in a book) trip Discord's automatic bad-image blocker and get refused upload because the system thinks they're something bad, I'm a little worried about what happens when they turn this technology on. Many 100% innocent photos people have taken will be quietly flagged, and people will be put on lists without ever knowing it. Since they're never actually charged with anything (and so never get the chance to be found innocent), this non-thing will follow them through life, haunt them, and impact other business relationships, etc. All because of pictures a computer thought were something else.
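To put rough numbers on that worry, here's a back-of-the-envelope calculation; every figure below is made up purely for illustration, not taken from Discord or anyone else.

    # All numbers are assumptions for illustration, not measurements.
    photos_scanned_per_day = 1_000_000_000   # what a large platform might scan daily
    false_positive_rate = 0.001              # 0.1% of innocent photos misflagged

    false_flags_per_day = photos_scanned_per_day * false_positive_rate
    print(f"{false_flags_per_day:,.0f} innocent photos flagged per day")   # 1,000,000

Even a tiny per-photo error rate turns into an enormous pile of flagged-but-innocent people once you scan everything.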