I only skimmed the second part, but this doesn't even seem to prevent the actions it complains about.
It's also interesting to think about Reddit and subreddits. When a moderator of a subreddit engages in the behavior described here (deleting comments and posts, banning users arbitrarily, stickying editorial commentary in threads), are they acting as a publisher? Are the mods now legally liable in the Section 230 sense? If not, is Reddit a publisher, and liable even though it isn't doing the editing? And if Reddit can enable mods to do it without Reddit or the mods being liable, why can't Reddit just do it directly without being liable?
I'm pretty suspicious of this. Reddit seems to be couching it in terms of "Oh no! Our poor moderators are going to be held legally liable for moderating subreddits." But that doesn't seem to be even slightly true here. The claim in the original case is that YouTube is liable not because it publishes the content, but because it promoted the content. Moderators don't have that power: they can remove stuff (clearly protected by Section 230), but they generally don't promote stuff (the one exception being pinned posts, for which, arguably, yes, they should be liable). The real party at risk here is... Reddit, which designs the algorithm that decides what shows up on the Reddit homepage and which content gets promoted.
I'm pretty neutral on the question of whether Reddit should be liable for content it promotes; I can see an argument both ways. But I am strongly against Reddit trying to whip its users into a mob to protest this by misleading them.
> if Section 230 is weakened by the court. Unlike other companies that hire content moderators, the content that Reddit displays is “primarily driven by humans—not by centralized algorithms.”
Not true. Reddit mods primarily enforce Reddit's rules, and those rules have grown increasingly onerous over time. There is no more restrictive ruleset on Reddit than the one imposed by Reddit Inc. itself. This includes moderation by automoderation scripts written and enabled by volunteer moderators, at Reddit's behest (and their own).
Indeed, if Reddit moderators do not enforce Reddit's rules, they will lose their positions as moderators and likely be banned as well.
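For context, those automoderation scripts are typically small bots the volunteer mods run themselves against the moderation API. Here's a minimal sketch of one using the real PRAW library; the subreddit name, credentials, and the specific removal rule are hypothetical, not any actual sub's configuration:

    import praw

    # Hypothetical credentials; a real bot would load these from a config file.
    reddit = praw.Reddit(
        client_id="...",
        client_secret="...",
        username="...",
        password="...",
        user_agent="modbot/0.1 (hypothetical example)",
    )

    # Watch incoming comments and remove link-drops from low-karma accounts,
    # a common volunteer-written rule that enforces Reddit's anti-spam norms.
    for comment in reddit.subreddit("example").stream.comments(skip_existing=True):
        author = comment.author  # None if the account was deleted
        if author and "http" in comment.body and author.comment_karma < 10:
            comment.mod.remove(spam=True)  # the bot account must be a mod here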
Still haven't found the comment! I'm no lawyer, but IIRC the argument was that paid moderators making content approvals would end up turning them into publishers. Because they're volunteers, they're currently third parties to Reddit, so if they approve illegal content now, it can't hurt Reddit, thanks to Section 230.
Basically, because mods don't just remove content but also approve it, paying them as first parties gets you into risky "publisher" territory.
edit: YOU FOUND IT. thank you!! your search-fu is a long lock of hair and mine is but an eyelash
I would agree that’s true for most smaller subreddits, but I'm pretty sure the Reddit admins have a non-trivial amount of interaction with the mods of the largest and most central subreddits. I'd also agree that if this went before a court, I would expect it to find that the mods did not qualify, but the fact that the law seems vague enough to require clarification by a court is itself a problem.
There is a fuzzy difference between how sites like Reddit and Stack Exchange operate, but if Stack Exchange is violating labor laws, what change to Reddit might move it across that fuzzy line? It puts a site like Reddit in an odd position: involving itself more directly in the moderation of its own site (requirements for subreddits and mods, tools, directions or expectations for how communities are moderated, etc.) could move it into a position of violating those laws.
Reddit has expectations of what moderators are to do, and has expectations of what they are not to do, and will remove them from roles if they fail to meet those expectations. That set of expectations would make them employees if compensated.
As for liability, the Ninth Circuit in Mavrix v LiveJournal held that if an agent of a user-content-hosting ISP (social media) has the means and opportunity to moderate, they also have the means and opportunity to interdict reasonably known copyright violations, and failure to act on those would jeopardise their DMCA Safe Harbour.
And there’s a lot of registered copyright holders that will 100% line up to be a creditor on statutory damages.
This article goes above and beyond to make it look like Reddit owes something to the moderators, who are not its employees and with whom it has no contractual relationship.
The moderators aren't forced to be moderators, they can stop being mods or just move to a different online community.
What's happening here is that the moderators are seeking more control over the platform.
>The fact Reddit didn't allow bad moderators to be removed played into it, too
I tried to "play the tape through" on this, and imagine what it looks like. If Reddit had proactively tried to implement some kind of intervention system to remove abusive moderators from subreddits.
It puts Reddit as an organization in a tough position, because they have to mediate disputes. And it opens up a bunch of other questions. How big does a subreddit have to be before they qualify for this intervention system? Does one submit an organized case like filing a lawsuit? What if the accused moderator is the only moderator of the subreddit?
And what new rules does Reddit add to try to curb abusive moderators? There'd have to be some kind of ethical guidelines drawn up. So now unpaid moderators can be disbarred like lawyers?
Everything I think of (admittedly not having spent a ton of time thinking about this) is open to potential abuse. Even if you set the trigger to be a 90% vote of "no confidence" by subscribed members of the subreddit, the subreddit would be vulnerable to "raids". If you gate it with "you have to have been subscribed for X amount of time", the time will either be too short, leaving the raiding problem, or long enough to create a new "class" of "landed gentry" with a disproportionate say in how the subreddit is run.
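To make the dilemma concrete, here is a hypothetical sketch of the tenure gate; the names and the 90-day figure are invented for illustration:

    from datetime import datetime, timedelta

    TENURE_GATE = timedelta(days=90)  # arbitrary; picking this number IS the problem

    def eligible_for_no_confidence_vote(subscribed_at: datetime, now: datetime) -> bool:
        # Too short a gate: raiders subscribe, wait it out, and vote the mods down.
        # Too long a gate: a "landed gentry" of old accounts controls every vote.
        return now - subscribed_at >= TENURE_GATE

Whatever value you pick, the failure mode shifts rather than disappears.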
So I don't know if it's actually a failure. Some of the stuff Reddit did early on, things like vote fuzzing, shows a willingness to experiment with using the platform itself to enforce better behavior. They might have been able to figure something out, eventually. But then I think of really simple stuff that even Hacker News has implemented (e.g., you can't downvote a direct reply to one of your comments) and how Reddit doesn't have that.
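Both of those mechanisms are simple enough to sketch in a few lines. Reddit has never published its exact fuzzing scheme, so the noise model below is a guess, and the downvote restriction is paraphrased from observed Hacker News behavior:

    import random
    from typing import Optional

    def displayed_score(true_score: int) -> int:
        # Vote fuzzing: show a slightly perturbed score so vote-manipulation
        # bots can't confirm whether their votes registered. The +/-2 range
        # is an assumption, not Reddit's actual algorithm.
        return true_score + random.randint(-2, 2)

    def may_downvote(voter_id: str, parent_author_id: Optional[str]) -> bool:
        # HN-style rule: you can't downvote a direct reply to your own
        # comment, i.e. a comment whose parent you wrote.
        return parent_author_id != voter_id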
Indeed. The problem with imposing liability on moderators is that they're already doing about as well as they reasonably can at a job that isn't easy. Nobody wants a platform full of spam and disinformation.
But it's inherently a difficult trade-off between heavy-handed censorship that catches too many dolphins in the shark net and not catching strictly 100% of the bad stuff. If you start imposing liability on the moderators, it forces the trade-off into all or nothing: either they give up and stop moderating whatsoever, or they have to murder all the dolphins, because now a single shark sighting puts them out of business and you can't always tell the difference.
It also eliminates the possibility for different platforms to experiment with making the trade-off in different ways. Maybe The New York Times wants an editor to read every user comment before it's posted, while Reddit has a stronger commitment to free speech. Shouldn't we have both and let readers make their own choices? Isn't that better than locking everybody into the same compromise?
> Reddit won't create a situation where they have oversight capabilities for mods... that creates too much liability for them
It's definitely interesting to watch how they walk that tightrope. Because they certainly do remove mods that are inactive or problematic, like how they are threatening to remove mods that have taken their subreddits dark.
That's just not true. This is not how reddit works. There are other rules in place that make it clear that mods don't own the subreddits that they mod. They are stewards of that community, but when they try to sabotage that community they can and should be removed.
Sorry, but the mods are dictators-for-life there, and there is no appeal. If you ask Reddit for help when mods violate the moderator code of conduct, they will just say they are tracking reports and suggest you use some other community instead, which isn't helpful when the mods control a large one with millions of members.
I don't think the sky is falling, but I do think mods should be accountable for controlling content based on their personal feelings. That non-accountability is why I am fine with Reddit losing Section 230. My opinion can be changed if Reddit Inc. puts rules and procedures in place and allows a process where you can actually interact over a problem.
I do not think the solution to dealing with an idiot mod should be to delete the account and make another one.
The author has no clue what they're talking about, and it shows. Moderators are under absolutely no obligation to enforce Reddit's anti-spam policy, and I am aware of zero instances of a moderator being removed or warned for failure to remove spam.
Now, moderators ARE held accountable for some things: they can't just turn a blind eye to global rules like the ban on child pornography without being removed, and in general they're expected to keep their communities under control. If their community starts harassing other communities, on or off Reddit, or brings public disrepute to Reddit in some way (/r/jailbait), the admins will generally get involved. However, the admins don't care if you don't remove spam from your own subreddit, for a very simple reason: it only affects users ON YOUR SUBREDDIT, so why would they care or be involved? The commonality between all the cases I just mentioned is that you're causing trouble OUTSIDE your community and causing Reddit as a whole problems; THAT'S when the admins get involved. They are like the federal government of Reddit: they're not supposed to care about the internal affairs of states, only inter-state and international affairs.
Another important thing to realise about Reddit is that most of the moderation isn't done by moderators or admins; there are far too few of them to moderate effectively. The author is correct about one thing: were those Reddit's only moderation tools, scaling moderation would be impossible even with automation. But they're not the only moderation tools Reddit has. Most moderation is done by users, soft-censoring posts through downvotes and reporting posts to draw the mods' attention. Something the author didn't even seem to consider is that the mod probably only looked at his post BECAUSE a user reported it.
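That report-driven funnel is visible in the mod API itself. A minimal sketch of watching the report queue with PRAW; the credentials and subreddit name are hypothetical placeholders:

    import praw

    # Hypothetical credentials for a moderator account.
    reddit = praw.Reddit(
        client_id="...",
        client_secret="...",
        username="...",
        password="...",
        user_agent="modwatch/0.1 (hypothetical example)",
    )

    # Mods largely react to what users surface: this stream yields an item
    # only after at least one user has reported it.
    for item in reddit.subreddit("example").mod.stream.reports():
        print(item.fullname, item.user_reports)  # [[reason, count], ...]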
In general, though, Reddit at the global admin level has long banned users for self-promotion, and they have a writeup about it. I have seen countless users banned for self-promotion, even ones generally well received by their communities; I even kept promoting some users' posts after the original poster's account got banned, because I liked their posts that much (and I know how not to get banned for spamming on Reddit). One man's useful post is another man's spam, and to be fair you have to just ban all self-promoters. Reddit's writeup is here: https://www.reddit.com/wiki/selfpromotion/
FTA: "But my article is not the type of spam these rules are meant to exclude"
FT rules: "We're not making a judgement on your quality, just your behavior on reddit." "10% or less of your posting and conversation should link to your own content"
So I would say that if anything, the scalability problem Reddit has is that authors like the OP don't read the fucking rules, break them, get banned, and then cry about it while citing a misinformed view of how Reddit moderation works, which isn't surprising given he didn't even bother to read the rules before publishing this crap.
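For what it's worth, the 10% guideline quoted above is mechanical enough to check yourself. A hypothetical sketch; the history format and domain list are invented for illustration:

    def self_promotion_ratio(history, own_domains):
        """Fraction of a user's submissions that link to their own content."""
        own = sum(
            1 for item in history
            if item.get("url") and any(d in item["url"] for d in own_domains)
        )
        return own / max(len(history), 1)

    # Per the sitewide guideline, this should come out at 0.10 or less.
    history = [
        {"url": "https://myblog.example/new-article"},
        {"url": None},  # an ordinary comment
        {"url": "https://example.org/someone-elses-post"},
    ]
    print(self_promotion_ratio(history, {"myblog.example"}))  # 0.33: over the line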
I read a comment on one of the many other threads over the last week saying that paying moderators would subject Reddit to stricter liability under Section 230. With volunteer moderators, the mods themselves are considered users rather than employees of the company, and that apparently matters in a tangible sense.
I'm trying to find the comment, since I'm not the expert who raised it, but this is likely not that easy, and probably why they haven't just done it already, the extra cost aside.
What is the analogy supposed to be for the moderators then? You have a decentralized service, hosted on IPFS or something. You can't sue the service, but that was never the issue. Nobody is calling for liability on AWS for hosting a forum moderated by somebody else. It's the moderators that Section 230 is protecting. And even if you use decentralized hosting, to have moderation the moderators still have to exist, do they not?
So, Reddit supports Reddit mods, who offer free, uncompensated labor for subreddit communities.
And no one questions that, or how subreddits can skew opinion, or the possible conflicts of interest when some mods directly refer to or support paid content while masquerading as unrelated to the product.
Or mods who dictate that art isn't art, or that an artist didn't create something.
As much as I enjoy how the Internet flourished because of 230, I don't enjoy how there is no transparency around someone who can shape the content of an online community, especially when they forward their own opinions or push their own agenda.
I don't see how the above issue of Reddit mods' conflicts of interest is not a 230 issue.
I actually think this is a really interesting case.
Reddit moderation is absolutely horrendous and subreddit mods have near complete power even when they go off the rails and start banning people for unsubstantiated reasons. There is no recourse and Reddit won’t do anything about it.
IMO outsourcing moderation to subreddit mods, with no ability for individual users to appeal, is very anti-user and fundamentally unfair. It's a worse moderation system than more "centralized" platforms like Twitter or FB.