> My only hope is that this extreme enshittification of online images will make people completely lose trust in anything they see online, to the point where we actually start spending time outside again.
Well in a weird way it will provide cover. You could post nudes of yourself online and just explain it away as bad actors using AI.
> My only hope is that this extreme enshittification of online images will make people completely lose trust in anything they see online, to the point where we actually start spending time outside again.
I'm in full-bore accelerationist agreement with this point. Defense lawyers must love this.
> it would be so easy to destroy someone's career or relationship by making these deepfakes and then spreading them around.
If nude pics can get you fired, work culture needs to change.
Same with relationships.
Basically, people need to learn not to trust digital media at all without some kind of authentication, and to be a little more tolerant of nude human bodies when they do pop up.
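To make "some kind of authentication" concrete, here's a minimal sketch of one possible scheme, assuming a publisher signs the raw image bytes with an Ed25519 key and viewers check the signature against a public key obtained through some trusted channel. The `cryptography` package, the key handling, and the whole workflow are illustrative assumptions, not an established standard:

```python
# Minimal sketch: sign image bytes at publish time, verify at view time.
# Assumes the third-party `cryptography` package; key distribution is
# hand-waved entirely, which is the hard part in practice.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric import ed25519

# Publisher side: generate a keypair once, sign every published image.
private_key = ed25519.Ed25519PrivateKey.generate()
public_key = private_key.public_key()

image_bytes = b"\x89PNG...stand-in for real image data"
signature = private_key.sign(image_bytes)  # shipped alongside the image

# Viewer side: trust the image only if the signature verifies against a
# public key obtained out-of-band from a source you already trust.
def is_authentic(image: bytes, sig: bytes,
                 pubkey: ed25519.Ed25519PublicKey) -> bool:
    try:
        pubkey.verify(sig, image)  # raises InvalidSignature on any mismatch
        return True
    except InvalidSignature:
        return False

print(is_authentic(image_bytes, signature, public_key))                # True
print(is_authentic(image_bytes + b"tampered", signature, public_key))  # False
```

Real-world provenance efforts (e.g. C2PA content credentials) are far more elaborate than this, but the trust model is similar: a signature only tells you who published an image and that it wasn't altered afterwards, not that the pixels depict reality.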
> It is inevitable that basically everyone in the next generation will have deepfake porn made of them by some asshat classmate. It is about as hard to stop as someone creating a nude painting of someone from imagination.
This is just casually dropped as though a photorealistic AI image were equal to an amateur painting in production quality and in required effort and time, as though they're even remotely comparable. Which, by the by, I also think the amateur painting should be punished, as making nude pictures of people without their permission is creepy.
> It is not a nice thing to do, and really mean if shared in public, but the types of measures it would take to stop it are incompatible with a free society.
I do not wish to live in a free society, then. This is patently absurd. We also cannot stop people murdering each other, stealing from each other, or otherwise causing harm, but we damn well punish it when it happens (when our useless judiciary system can be bothered, anyway, but that's a different conversation). And the fact that people are just like "well, I guess kids now have to worry about classmates making photorealistic depictions of them performing sex acts" is so horrifying. So horrifying, on so many levels.
Is this what we're building? Is this the society we all envisioned as kiddos getting into the Internet and our computers? I think we can do so very, very much better than this and it makes me incredibly depressed seeing what we're permitting to be done with the tools we make. Everyone involved in this should be ashamed. I don't know how I'd live with myself knowing I played even a small role in ushering in entire new vectors of sexual harassment directed at people who already struggle to get a modicum of justice in this world.
And for what? "Freedom?" My right to swing my fist ends when it contacts your face, but up until then, you can't do a thing? You just get to have dozens, if not hundreds, of people swinging at you, faking you out, every minute of your life, and you just have to sit there and take it, because they technically haven't hit you yet? Bollocks. Absolute nonsense. This society might be "free", but it cannot function, and the longer we delay in gaining this understanding, the more people will be driven out of their minds waiting for us to do so.
> surely trust in the veracity of damaging images will drop to about 0
Maybe, eventually. But we don't know how long it will take (or if it will happen at all). And the time until then will be a nightmare for every single woman out there who has any sort of profile picture on any website. Just look at how celebrity deepfakes got reddit into trouble even though their generation was vastly more complex and you could still clearly tell that the videos were fake. Now imagine everyone can suddenly post undetectable nude selfies of your girlfriend on nsfw subreddits. Even if people eventually catch on, that first shock will be unavoidable.
> you can fully expect your asshole friends to grab a dozen photos of you from Facebook and then make a hyperrealistic pornographic image of you with a gorilla
My prediction is that, as a result, people will start assuming pics online are fake until proven otherwise.
> it inpaints a body that, while seamless and realistic, looks nothing like her own.
Well, that bug is going to be fixed pretty soon, as AI learns about different body types and how to identify them from clothed images. The bar is pretty low, because the AI-generated image only needs to convince people who haven't actually seen you nude.
Sooner or later, the only way to tell whether a nude image is real or not will be to check if it accurately reproduces hard-to-guess imperfections in locations that nobody other than your most intimate partner will ever see. Brb, gotta give the kids a cryptographically unique tattoo or something.
> If I want a model to generate a fun pic of me at the beach and instead it makes a nude photo, that's completely different
What if someone else's idea of a fun pic at the beach is a nude photo? Is it only censorship when the worldview being enforced is not aligned with yours? A truly uncensored model should accommodate everyone, but society will (and should) absolutely burn down any AI service that accidentally generates nudes from a beach photo of a family with minors. "Intent" is a human attribute; AI is probabilistic.
This abundance of caution by the companies avoiding nudity is unfortunately read as promoting a worldview or pushing an agenda, when it's just risk mitigation.
> seriously this is the thing you fear? fake porn?
I'm being polite. Things will be so much worse.
Someone will make child pornography using your child's face as the input. Someone is going to take private videos of politicians and then edit them to have them say incriminating things. Someone is going to short the stock of a large company, then release a faked video of the CEO being shot, and profit from the immediate stock plunge.
And this is just what my mind can come up with. Imagine what 4chan will invent.
This comment reminds me of a disturbing one I read a few months back on an anonymous underwater basket weaving forum.
The poster was discussing those expensive “real doll” (?) sex robots that creep me out. Anyway, he went on to mention that he'd discovered the app “face app” could take these things from “uncanny valley” to essentially photorealistic. Makes sense given what the app is supposed to do, but the real mind-fuck was that he said it also works on underage dolls… which, again, I was surprised to learn are apparently legal in most places.
So AI/ML can take an inanimate object and create a highly illegal, fool-most-humans-level fake image from it. Seems pretty scary to me, and we might not be far from an ML model that can essentially produce endless amounts of illegal content.
What are the implications of that? How do you even begin to police/filter it? Then I guess there's the question of whether we even should, since none of it's real anyway.
These arguments aren't mine. You're attempting to colour my words by associating them with someone else's who they apparently "sound like."
> Real images are more concerning because it means real people were harmed in the real world.
Yes, they are. I hope real images of abuse are treated with the gravity they deserve.
> Those same fake images could be generated with Photoshop. AI just makes it easier.
Yes, many of the things we achieve with new technology could previously have been achieved with more effort using earlier technologies. Airplanes are still world-changing, though, even though we already had cars.
> Banning AI doesn't solve the problem.
This is a strawman.
> Additionally, you implied that I am exhibiting some sort of Hacker News group think, I am not.
I simply meant that HN is a predominantly male space and that men are less likely to be victims of faked pornography, so the discussion here will inevitably take a more detached and theoretical point of view on these matters than if potential victims were more included in the conversation.
> There is a clear risk from these sorts of models as they get better - I mean recreating specific individuals’ likenesses in compromising images
And the risk behind that is...?
If you drill down on such claims, the core is always "someone might use this to lie online", and the proposed solution every single time is: more surveillance. End anonymity. Require a Facebook account to use the internet. Real-name and real-face policies for every online interaction.
> Presumably both of you will now go ahead and post naked pictures of yourself since you don't care if anyone sees them?
Don't get your hopes up. Everyone has a lot to say on this topic about why "our culture is bad" and "everyone should grow up; having your nudes online is not a big deal", but they'll never deliver themselves.
> This is another example of why nudes and so forth should never make it onto a digital device. If that's your kink, then Polaroid cameras and camcorders exist.
This might be the only surefire way now, but it's not how it should be. How it should be is that a HIPAA-like wrath of god may descend on you if it turns out you've mishandled someone's data like this.