Whether or not it’s immoral is a matter of opinion. It’s probably done in order not to alienate those who think it is. Effectively a business decision, even in the context of an open model.
Because puritans, extremists, and generally everyone who believes themselves morally superior to others believe they have the moral right, if not imperative, to impose their views onto others. And to get said others off their wayward paths.
Anything less would mean letting black sheep harm and corrupt society as a whole, durably. How could you let that happen?
Their pure and morally superior ends thus justify the means, coercion being the least intrusive and oppressive of those, and paling in comparison to other acceptable methods, such as public shaming, ostracising, and even violence.
Basically, to self-righteous zealots, freedom and individuality are secondary to what they see as morally, and universally, right.
And how could it be otherwise? You can’t possibly be convinced you know the one and only acceptable way for all and accept people should be free to do as they want, can you?
That, and some have always liked to weaponise these sentiments for influence, political power, and monetary gain.
LLMs don’t support this sort of functionality, and thus the trainers eliminate adult content from the training so as not to lose market share to other censored models. As stated in other comments: users can fine tune the model to incorporate adult content if it is necessary for their use case.
You should take a look at the HN guidelines. Your comment is a strongly worded take on politics, religion, and otherwise significantly controversial topics. Ideological battle is discouraged.
> LLMs don’t support this sort of functionality, and thus the trainers eliminate adult content from the training so as not to lose market share to other censored models.
And yet Stability.ai had managed to do just that with Stable Diffusion, even if it wasn’t in the model per se, and it was quickly worked around anyway.
> You should take a look at the HN guidelines. Your comment is a strongly worded take on politics, religion, and otherwise significantly controversial topics. Ideological battle is discouraged.
Tomato, tomato.
One’s observations and commentary on the methods of those inclined to wage ideological battles, including censorship, is another’s strongly worded ideological crusade.
I have been thinking that most cultures before roughly the 18th century were not puritan, at least not in the form we know. For instance, I have been reading Muslim poets who wrote fairly erotic poems; the same goes for India. America is an outlier here: it’s actually the place where the Puritans went.
It was only in the age of full-on colonization that you see puritan groups forming (Salafists, for instance, rose in the 19th century), and a similar mechanism was at work in India.
Not sure of the background on that; I would love more insight into how the history of morals developed. Maybe at some point it became a tool to keep people in check?
Perhaps. The factors and morals seem to vary significantly, depending on both historical periods and cultural contexts.
Take, for example, Japan during the Meiji restoration [1]. In their quest to dispel behaviours deemed immoral or indecent by Western societies, they largely eliminated mixed onsen ("konyokuburo"), in an attempt to transform into a "modern", "civilised" nation, worthy of international respect rather than colonisation.
Medieval Europe is also interesting. Despite the church's portrayal of sexuality as sinful and degrading, many nobles maintained mistresses, and even certain popes (Alexander VI, born Rodrigo Borgia, comes to mind) fathered children out of wedlock.
Finally, both ancient Romans [2] and Greeks [3] held perspectives we would find surprising today, if not ambivalent.
I still remember my surprise at noticing a dozen boxes of very explicitly shaped cakes in a remote Japanese mountain-trail souvenir shop. I guess we could call that culture shock.
It's funny to read this comment in the context of users who are angry they cannot impose their pro-pornography views on the creators of the model. How dare they obey different morals! How dare they have different views! My views are more important and thus they should do what I want!
>Their pure and morally superior ends thus justify the means, coercion being the least intrusive and oppressive of those, and paling in comparison to other acceptable methods, such as public shaming, ostracising, and even violence.
You yourself hit about 80% of your "ends justifications" in your shameless attacks on folks who don't want pornography coming out of the LLM they're using at work. The irony is unreal.
Oh, people who want everybody to always be able to publish anything, anywhere, everyone else’s preferences, sensitivities, and beliefs be damned, definitely are just as zealous as the ones they are hell-bent on fighting.
It's an open model so if you want, for a fraction of the training price, you can fine tune it with adult fan fiction to generate even more adult fan fiction if you really want to.
An open source model allows for that. Compare this to ChatGPT/GPT-4 which are closed and filtered at the API level.
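As a rough illustration of what that fine-tuning path looks like in practice, here is a minimal sketch of preparing training data in the instruction/output JSONL layout that many open-source fine-tuning harnesses accept (the exact field names vary by tool, and the example pairs here are purely hypothetical placeholders):

```python
import json

# Hypothetical prompt/completion pairs; in practice these would come
# from whatever corpus you want the model to imitate.
raw_pairs = [
    ("Continue the story:", "The ship drifted past the last lighthouse..."),
    ("Describe the setting:", "A rain-soaked city of neon and steam..."),
]

# Convert to instruction-tuning records (field names are an assumption;
# check the format your specific trainer expects).
records = [{"instruction": p, "output": o} for p, o in raw_pairs]

with open("finetune_data.jsonl", "w", encoding="utf-8") as f:
    for rec in records:
        f.write(json.dumps(rec) + "\n")

# Sanity check: the file round-trips back into the same records.
with open("finetune_data.jsonl", encoding="utf-8") as f:
    loaded = [json.loads(line) for line in f]

print(len(loaded))  # 2
```

The actual training run would then point a trainer at this file; with parameter-efficient methods such as LoRA, that is indeed a small fraction of the original pre-training cost.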
I take issue with Silicon Valley deciding the global moral guidelines based on their arguably narrow view, and I came here to defend you, but after seeing that artwork I've got to say: isn’t that porn? Of course it’s up for debate, but just because it’s old and hung up in a museum doesn’t make it non-porn.
Maybe the problem is more facebook banning all porn including the culturally relevant one.
That would be hard to say. I recall a US Supreme Court Justice (Potter Stewart) basically defining "hard-core pornography" as "I know it when I see it".
Some would say that pornography implies sex.
Others that it is defined by the content’s purpose.
Some would contend that considering the mere exposition of the female body outside of any sexual interaction as pornographic is, in itself, objectification and sexualisation.
I don’t really have an opinion.
And, in the case of this specific painting, to me it is merely an amusing, well-thought-out, intriguing, multi-layered play on both words and symbols that isn’t really significantly more graphic than the anatomy book for children I had at age 6 [1].
But that’s just me. Some could argue they aren’t even remotely the same.
If you’d prefer a less divisive example, Instagram also banned Almodovar’s nipple film poster [2].
> Maybe the problem is more facebook banning all porn including the culturally relevant one.
Or Facebook being left to decide what is and what isn’t porn, and applying their one and only rule to the whole world as if one size could fit all.
"Some would say that pornography implies sex. Others that it is defined by the content’s purpose. Some would contend that considering the mere exposition of the female body outside of any sexual interaction as pornographic is, in itself, objectification and sexualisation."
It definitely varies from culture to culture. In the US a bared nipple on national television was a sensation, in the UK it's par for the course.
In some cultures women regularly go bare chested, in the US being topless on a beach could land you in jail.
In some cultures men wear nothing but a penis-sheath in public, while that would be considered outrageous in many other cultures.
In the US it used to be considered indecent if a woman showed an ankle, and uncivilized not to wear a hat.
In India hugging or kissing in public might get you assaulted[1], while it's no big deal in many other countries.
There are taboos in every culture, but what is taboo varies from one culture to another.
Yes, I totally agree with you, and the Almodóvar poster is another good example. The earlier artwork just brought to my mind that porn is maybe an age-old thing, and it’s interesting that if it survives long enough it acquires this aura of acceptability.
Either way I think social media platforms should tune their filters to each society they operate in.
Pompeii is remarkable in that respect. Not only for the porn [1] preserved by the eruption, but also for the (sometimes explicit) graffiti [2] [3] [4] [5]. It seems their perception and use have evolved over time too. And that "x was here" truly is timeless.
It's important for general-purpose use: with generative models there's always a chance of hallucinations. For all uses except the specific adult-flavoured ones, you don't want the response to contain vulgarities. No one wants their company's chatbot to start narrating furry erotica. If trained on adult content, you would need a burdensome moderation layer downstream of the LLM.
When you do want the more niche adult themed LLMs, there are fine-tuning datasets available. Fine-tuning a vanilla open-source LLM for these uses works great. There are active communities of adult roleplay LLMs on imageboards.
Haven't we seen that too much reinforcement toward censorship worsens the model, just as excluding some data makes it worse in all the other areas?
Even though it's quite bizarre on its surface that excluding, for example, works of fiction makes a model worse at programming.
In that case, a simple middleman agent that is inaccessible to the user would provide better quality while maintaining censorship, and the censorship could even be dynamically and quickly redefined or extended.
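The middleman-agent idea can be sketched as a wrapper that screens raw model output before the user ever sees it. This is a toy illustration only: the `generate` function, the blocklist terms, and keyword matching are all assumptions for the sake of the example — a real deployment would call an actual LLM and use a trained moderation classifier rather than keywords.

```python
import re

# Placeholder blocklist; real systems would use a moderation model.
BLOCKLIST = {"explicit_term_a", "explicit_term_b"}

def generate(prompt: str) -> str:
    """Stand-in for the underlying LLM call (hypothetical)."""
    if "surprise me" in prompt:
        # Simulate an off-policy response slipping into the output.
        return "Here is some explicit_term_a content you didn't ask for."
    return f"Booked a table: {prompt}"

def moderated_generate(prompt: str) -> str:
    """Middleman agent: inspects the raw output and withholds it if needed."""
    raw = generate(prompt)
    tokens = set(re.findall(r"\w+", raw.lower()))
    if tokens & BLOCKLIST:
        return "[response withheld by content policy]"
    return raw

print(moderated_generate("two at 8pm"))      # passes through
print(moderated_generate("surprise me"))     # gets withheld
```

Because the filter sits outside the model weights, its rules can be updated or extended at any time without retraining, which is the point the comment above is making.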
"Why are tech companies so puritanical? Adult content is not immoral."
It's all about "optics" and PR. These companies don't want their brands associated with porn. That's why YouTube doesn't allow porn on their site, even though it would be enormously profitable.
Unless you have visibility into a niche corner of youtube that nobody else does...
Is there content on YT that could possibly be called porn? Sure. Is there actual porn the way most people understand it? Not more than a vanishingly small amount that would be next to impossible to find without getting directly linked to it.
YT doesn't even allow gratuitous posing, especially lingering rear shots, for try-on hauls. Sex talk (no explicit activity) is allowed, but put behind their "inappropriate" warning. I haven't seen anything that isn't at least dual-use educational or ASMR. I haven't seen anything that's particularly vulgar, even if it didn't descend into problematic fetishes. Plenty of YT channels were banned for suggestive sexual ASMR, even without nudity or explicit roleplay. They don't have as much problem with scantily clad, suggestive dancing in mainstream-approved media content though, but that's not porn either.
The most nudity I've seen allowed on YT was in fringe dance performances and very occasionally in mainstream music videos.
Does that include LGBTQ content? If yes, then this probably falls under the definition of intolerance, and in California there are protections. YouTube should not be allowed to filter the content like that; it must have it readily available for all audiences.
It has nothing to do with sexual orientation or claimed gender, and everything to do with what body parts are shown and what explicit words (or other sounds) are used.
And you know this because you’re the expert on what morality is? Decades of people saying it is immoral, yet somehow you come along to say the opposite and hope people believe it? The only justification you can give is an ad hominem attack accusing the other person of not knowing what they are talking about.
I did not see that bit. I came upon this many years ago. TED has changed its thought process and stances on multiple subjects since then. Adding such disclaimers is a new turn of editorialism.
If you want to go down the rabbit hole of research that both recognizes and refutes the assertions, you will find more opinions expressed than facts. But neither the facts nor opinions are interesting. It is the narratives and the lessons derived that hold more value. And that video expresses some of them.
There is a longer documentary style video on the same subject by the same speaker.
That whole website is "all of science disagrees with us, but we know we are right, and there is a cabal of scientists trying to silence us". This is the least scientific website and you should not use them as a source.
Although not evil, adult content should be opt-in, and should be able to be opted-out at a platform level... hence, the need for censored models. Imagine a restaurant booking AI app, built on GPT, that accidentally doubled as a bomb-making tutor or an adult content generator. It's a lawsuit waiting to happen, if nothing else, and it's worth making these use cases harder (if not impossible) to implement in mainstream, commercially available products. Note that for many of these products, the age and consent for adult material has not been already established.
So far, the open source ecosystem seems to be doing a good job of providing both censored and uncensored LLMs - and it seems there are valid use cases for both.
Think of this as similar to Falcon LLM being launched in both 40B and smaller 7B variants - the LLM often will need to match the use case, and the 7B model is a good example of making the model smaller (and worse) on purpose in order to reach certain trade-offs.
Why are tech companies so puritanical? Adult content is not immoral.