Reading down into the comments is quite disturbing. The official stance on underage posts (which are not obviously CP) seems to amount to, “If it’s a picture of you, report it.”
Maybe they are just trying to reduce their liability. Once they start actively policing content that isn't flagged, they will be held responsible whenever something gets on the site that shouldn't. If their policy is to only review content that was reported, it's much easier from their end, and they don't have to make as many decisions about what is OK and what isn't, the way sites like Twitter have to.
You’re almost certainly correct, yet I think this shortcut many content providers/hosts have taken is eventually going to lead to a backlash. I really don’t want to see what the FCC looks like trying to police large sites for content.
Well, reddit has been policing content for a very long time. So far they haven't been held responsible for things that "get on the site but shouldn't".
Admins are involved in cases of legal issues such as doxxing, threats, illegal material, etc. Mods are responsible for the rules of their sub; site-wide rules are ultimately the purview of admins.
Honest question: what do you expect them to do? It's not like you can tell from the image the age of the subject or, if they are of age, whether they consented to having their picture posted. Hell, outside of celebrities you'll be lucky to identify the subject at all.
One thing that's left out is that anyone can report the picture and mods are incentivized to police their posts somewhat.
> Generally the mods of the_donald have been cooperative when we approach them with systematic abuses. Typically we ban entire communities only when the mods are uncooperative or the entire premise of the community is in violation of our policies. In the past we have removed mods of the_donald that refuse to work with us.
> I don't accept that excuse. These were gathered by searching for phrases that should be included in their AutoModerator config. These calls for violence are mod enabled.
Clearly the CEO of reddit feels the mods of r/the_donald have done a reasonable job at policing their sub overall, despite having missed a handful of comments with barely any upvotes.
Do you suggest AutoModerator filter for words like "hang", "kill", "shoot"? Why not implement that site-wide? I'm sure that would go over well with the rest of reddit.
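For reference, a rule like that in AutoModerator's YAML config would look roughly like the sketch below (just an illustration; the keyword list and the "filter" action are hypothetical, not anything the mods actually run):

    ---
    # Hypothetical rule: hold comments containing any of these words for mod review
    type: comment
    body (includes-word): ["hang", "kill", "shoot"]
    action: filter
    action_reason: "Keyword match: possible call to violence"
    ---

Applied site-wide, a rule like that would flag a huge amount of innocent discussion (gaming threads, news stories, figures of speech), which is exactly why nobody runs it that way.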
What pisses people off is how spez (reddit's CEO) is dancing around the fact that, for one reason or another, he can't or won't just punish a misbehaving subreddit, while other borderline subreddits that are otherwise very cooperative but way smaller get shut down regularly.
It's a bit like how kids in a class get away with sticking gum on their desks two or three times "because they were quick to remove it", while other kids have their parents called the first time they bring a circuit board to school.
Reddit does what it wants; it's their platform. It just looks terrible on the face of it.
Agreed, reddit's ban hammer is often arbitrary and self-serving, and r/the_donald drives too much traffic to reddit for them to seriously consider banning it (even more so with Twitter and @realDonaldTrump).
Admins should contemplate banning less, and users on all sides should chill on demanding it for subs they don't like. No one forces you to go to a sub, and you can even filter them from the front page without extensions now.
But live and let live is not particularly in vogue these days. As in this submission, the top post (nothing to do with the topic) is "Why isn't this other sub also banned hmmm???"
> and r/the_donald drives too much traffic to reddit for them to seriously consider it for them
Ehh, it's not that; the sub isn't really that active outside of a core group. Their user base isn't that involved either: none of the polls they post get any traction (that's why they stopped posting them), and for the most part frontpage posts get like 60-100 comments (unless it's a 'big' story). So it's not the traffic for sure, no matter how much they want to delude themselves about the 10 million invisible subscribers or whatever.
It's the press. Because, like it or not, they would be making a pretty big political statement by banning the sub, the main 'supporters club' for the currently sitting US president. Whatever reason they give for the ban will be picked apart and spun by either side of the press, and the whole thing will become a big unmanageable mess, which is not something anyone would want to trigger. Better to wait it out until, for whatever reason, he's no longer the president. At that point I'm sure the rules regarding death threats, racism and witch hunting will be enforced a bit less leniently.
> No one forces you to go to a sub, and you can even filter them from the front page without extensions now.
Most people never come across The_Donald organically anymore anyway. Whether or not reddit tweaked their algorithm to stop that from happening doesn't really matter, because the objective of these people is to prevent anyone from having an opinion that doesn't align with theirs.
I think his point is that the mods of the sub have done their best to enforce the rules, despite missing a handful of barely upvoted comments here and there.
Thus "they" (the people who run the sub) have not broken any rules; only some bad actors have, who presumably would have been dealt with if their posts had gotten more attention.
If random users making bad posts that almost no one notices in highly active subs constitutes a total subreddit ban, then all subs would end up banned.
T_D and similar subs regularly see posts that break the rules, at rates higher than other subs, and which are highly upvoted; it's not just a few low-scoring bad apples. Reddit won't ban them because they want the traffic. "Well, they took some down after they got caught" is a sorry excuse.
Perhaps so, but in the list of examples linked by GP the vast majority of offending posts have <= 5 upvotes. Only 9 were at >= 10, and just 3 at >= 20. Really? This is a sub where popular posts and comments get hundreds or thousands of upvotes. Could they not have compiled a better list? It must be easy if the place is as bad as we're all told.
But I agree, reddit just wants the traffic, and r/the_donald does deliver on that for them.
Arguably the same rules any subreddit with over a dozen members has broken at some point. You'll probably be linked to a list of posts with few/no upvotes that make "calls to violence". You can easily compile a list like this for any major subreddit on any subject, but people like to harp on T_D because it represents a bias they don't care for.
I don’t know about rules but I left reddit because it kind of sucked seeing their posts calling everyone cucks every single day. That was literally like half of the top posts on any given day leading up to the election.
Censorship is not something only governments can do.
I think we’re well beyond the point where it’s time to have a national conversation about how these private companies now function as the public commons. They make these entirely capricious and disgustingly transparent swings between “place for open discussion and free speech” (actually the founders’ exact words in many an interview, in Reddit’s case) when the owners want attention, hype, and warm fuzzies from the user base, and “private company” when it suits their needs or they’re challenged on their principles.
Legally this is most likely correct. But you can't have your cake and eat it too. If you support political censorship (which you might have a good reason to do), then you have to own that. You can't support political censorship and then justify it by saying it's not censorship.
First of all, "censorship" is an extremely loaded word because it carries the implication that the message being broadcast necessarily deserves to be heard, but this is not always the case. If I logged onto your blog and wrote "fuck you" in the comments, you wouldn't be censoring me by deleting it.
The word "censorship" also carries implications about the motives of the supposed "censor" by alluding to the notion that their actions are motivated by a desire to silence a message they fear or disagree with, when other explanations may be more likely. For example, maybe you don't care whether the message is heard, but you just don't want to hear it in your house. If you then kick me out of your house, that doesn't mean I'm being censored; it means that what happens on your property is your prerogative. Maybe you didn't even hear the message and just don't like my tone, or maybe you did hear the message and found it to be worthless: it's not censorship if I ban you for stating that aliens orchestrated 9/11.
Banning anyone from reddit is not censorship because they were never entitled to the use of reddit's privately owned property and reddit's actions do not impede their free expression through the use of their own resources.
Reddit is a private organization that can obviously decide what they do or don't allow on their site.
That said, I am still interested in whether the images created by this technology will be challenged legally. The adult actors created and released these videos with their consent (presumably), and the facial reconstruction of the celebrities is usually drawn from public sources (Facebook, Instagram, news articles).
Could they claim copyright on these videos? It's hard to argue that they aren't transformative.
Supposedly some are challenged on the grounds of defamation or slander, on the theory that these images falsely claim that celebrities performed these sex acts. However, this seems like a very weak claim given that many of the places hosting these images are transparent about the fact that they are fabricated.
This specific application of the technology isn't something I care that much about, but I think it has the potential to set significant precedent about what sorts of derivative works are permissible with technology that can create fake media that is increasingly hard to distinguish from reality.
You can't buy a copy of Windows(R)(TM)(C), do some patching (or DeepPatching) to change the icons, the background, and some of the text, and then release or sell it. Even if Windows was released voluntarily by Microsoft. Even if you use photos of celebrities instead of icons.
[Nor can you modify Linux and release it under another license, like a proprietary license, BSD, GPL3, ... It also has a very specific license.]
The actors/directors/whoever released the videos under a license that probably doesn't allow redistribution or modification. I'm not sure about the license of the photos on Facebook/Instagram, but photos in newspapers usually have a restrictive copyright.
Perhaps there is a loophole using very old videos and old photos, but... IANAL. (Wikipedia says something about 1923; the Mickey Mouse cartoon was made in 1928 and is still under copyright.)
Fair use is handled quite differently for media as opposed to software products, and derivative works can often be legal. A derivative work can be exempt from copyright if it "uses a source work in completely new or unexpected ways". Additionally, pornographic parodies of public figures have a long history dating back centuries at least; explicit depictions of Marie Antoinette during the French Revolution are an example off the top of my head. In fact, when I Google "Fair Use Transformative", a face-swap of the Mona Lisa is one of the first results I get.[2] I am not confident that a video or GIF would be treated differently than a still image by the courts. Additionally, the "Supreme Court has indicated that offensiveness is not a fair use factor" [3], so there's a good chance that moral outrage over deepfakes is not likely to be enough to criminalize them.
That said, hosting sites invariably have clauses that state they can remove any content they don't like, so this content is probably going to be relegated to 4chan and whatnot. As stated in the root comment, I'm fine with companies exercising that prerogative. I dislike the notion of people's faces, arguably their identities, being swapped onto objectionable or embarrassing content, but at the same time I don't see it being banned given the current precedent.
Being 500 years old, the Mona Lisa is probably out of all rights. The French Revolution is also not a great time to measure the enforcement of laws; Marie Antoinette was probably more worried about keeping her head on her shoulders than about lewd pictures of herself.
Copyright does have fair use exceptions, but some countries and US states also have personality rights that control the use of one's image. Hulk Hogan's case against Gawker, for example, was not won on copyright grounds, as far as I can tell.
I think it's like translating a movie or a book. You must have the authorization of the copyright holders (or hope that nobody cares or try to fly under the radar).
The "new and unexpected" part will fade away in a month.
Outside of the main use case right now, I hope this technology continues to be developed and made more user friendly.
I'm looking forward to the day when scanning your and your friends' faces into a cool movie or game is commonplace.
Yes there are concerns that need to be addressed, but I hope we don't throw the baby out with the bathwater.
To that end, is anyone aware of any projects for movies or gaming that are taking this approach? I know it is already a thing for high budget Hollywood movies... I'm more interested in it becoming consumer friendly.
Reddit is almost comically morally bankrupt.