I said in another comment on this article that being a mod lets you astroturf aggressively and control the narrative. From there, with loads of bots, you can manufacture consent or dissent. And the moderation power lets you remove whatever you'd rather not address.
It's sheer power. It's not about the money, per se... But those with power get money, and those with money seek power.
It has nothing to do with good, in most cases.
(And yes, I'm a moderator of small groups. I just remove spam and malware.)
Moderation is cheap. The price the consumer pays for it is having content removed/filtered/edited because it violates a submission guideline, the ToS, or a moderator's worldview. Usually the latter. Personally, I don't think you can quantify that value. It's definitely worth more than a few million dollars for the potential reach.
They spend a lot moderating because their platforms largely allow anybody to spam/harass anybody else with no barriers. I suspect the lack of compartmentalization like Reddit or Mastodon is because of money. There is greater engagement with fewer barriers but more potential for abuse.
Moderation was probably the wrong word; I mean it more in the sense of controlling faked content posted by accounts that are not real people. Managing Alex Jones's official account is about the same either way, but fake accounts/bots get harder to deal with.
What qualities do you think that would incentivize in a moderation system? I don't see a tendency to share less misinformation being one of them, and I'd fear that richer people would have more power to control the narrative in such a system by effectively bribing moderators.
Apologies if there is something about "nano-payment-funded distributed and syndicated moderation" that I'm not getting.
This question needs a lot more context into what you mean by moderation, but maybe to try and narrow it down: make spam so unprofitable no one does it.
We get so stuck in the “offense” piece, but there is a veritable army of people in the world trying to make the internet as unpleasant as possible. Look at the average comment thread in response to a tweet by Musk or another famous figure. It’s garbage memes, crypto spam, low-effort content, and impersonators. If “personal agency” means doing a whole bunch of busywork to sift through a river of shit to get to the occasional gold nugget…
I’ve stopped using Reddit because so many of the comments are just low effort or uninteresting. So what do you mean by personal agency, in a world where bad actors (spammers, scammers, and trolls) are participating alongside the rest of us?
Part of the point, as I understand it, is that moderation scales more-or-less linearly because each instance is ideally human-scale (that is, maybe having hundreds of members as opposed to millions) and responsible for its own moderation, unlike sites like Twitter or Facebook which have to pay a relatively small team of moderators to moderate millions of people's communications.
There's nothing saying any particular instance will moderate, but if they don't and it becomes a problem they'll quickly find themselves cut off from federating with other instances.
Moderation absolutely can scale, platforms just don't want to pay for it. For two reasons:
- Moderation is a 'cost center', which is MBA-speak for "thing that doesn't produce immediate, visible returns on investment". For context, so is engineering (us). So instead of paying a reasonable amount to hire moderators, Facebook and other platforms spend as little as possible and barely do anything. This mentality tends to set in early, during the growth phase, when users are being added far faster than you can afford to add moderators, but it remains even after sustainable revenues have been found and there's plenty of money to hire people.
- Certain types of against-the-rules posts benefit the platform hosting them. Copyright infringement is an obvious example, but that has liability associated with it, so platforms will at least pretend to care. More subtle are things like outrage bait and political misinformation. You can hook people for life with that shit. Why would you pay money to hire people to punish your best posters?
That last one dovetails with certain calls for "free speech" online. The thing is, while all the content people want removed is harmful to users, some of it is actually beneficial to the platform. Any institutional support for freedom of speech by social media companies is motivated not by a high-minded support for liberal values, but by the fact that it's an excuse to cut moderation budgets and publish more lurid garbage.
Naive answer is a percentage of monthly advertising or subscription revenue, split amongst the moderators based on the proportion of the moderation work they have undertaken.
However, this creates a perverse incentive for moderators to perform unnecessary additional moderation work to bring in more compensation.
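To make the naive scheme concrete, here's a minimal sketch of that proportional split. The function name, revenue pool, and action counts are all illustrative assumptions, not from any real platform:

```python
# Hypothetical sketch of the naive revenue split described above.
# All names and numbers are made up for illustration.

def split_revenue(pool: float, actions_by_mod: dict[str, int]) -> dict[str, float]:
    """Split a revenue pool among moderators in proportion to the
    number of moderation actions each performed."""
    total_actions = sum(actions_by_mod.values())
    if total_actions == 0:
        return {mod: 0.0 for mod in actions_by_mod}
    return {
        mod: pool * actions / total_actions
        for mod, actions in actions_by_mod.items()
    }

# e.g. 5% of $100k monthly ad revenue, three moderators:
payouts = split_revenue(0.05 * 100_000, {"alice": 600, "bob": 300, "carol": 100})
```

Note how the perverse incentive falls straight out of the formula: pay is driven purely by action count, so every extra removal, however unnecessary, increases a moderator's share.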
I'm torn between disgust and embarrassment for the people providing free, unofficial support for trillion dollar tech companies via Reddit Requests. They literally get nothing, no job placement, no recognition, no pay or benefits, and no way to be made whole for providing services that have a real, demonstrable impact on company perception and operation.
This is why large instances like mastodon.social are basically a bad idea: at that scale they have all the same content-moderation woes as Twitter or Meta.
On a smaller instance, it's much easier for a mod to nip bad behavior in the bud, and it's less politically fraught, because the moderation decision doesn't have worldwide policy implications. It's the difference between being kicked out of a local coffee shop and being arrested in the town square.
If you can't afford to pay a sufficient number of people to moderate a group, you need to reduce the size of the group or increase the number of moderators.
Your speculation implies no responsibility for taking on more than can be handled responsibly, and externalizes the consequences to society at large.
There are responsible ways to have very clear, bright, easily understood, well communicated rules and sufficient staff to manage a community. I don't know why it's simply accepted that giant social networks get to play these games when it's calculated, cold economics driving the bad decisions.
They make enough money to afford responsible moderation. They just don't have to spend that money, and they beg off responsibility for user misbehavior and automated abuses, wring their hands, and claim "we do the best we can!"
If they honestly can't use their billions of adtech revenue to responsibly moderate communities, then maybe they shouldn't exist.
Maybe we need to legislate something to the effect of "get as big as you want, as long as you can do it responsibly, and here are the guidelines for responsible community management..."
Absent such legislation, there's no possible change until AI is able to reasonably do the moderation work of a human. Which may be sooner than any efforts at legislation, at this rate.
As the GP stated repeatedly, it doesn't scale. The number of possible interactions grows quadratically with the number of users (roughly n²/2 pairs), while the moderation team can only grow linearly at best. The moderation team itself also doesn't scale: good moderation is very difficult and requires trust. More moderators spread the trust thin and greatly increase the chance that you end up with one or more bad moderators, who in turn damage that trust.
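A quick back-of-envelope run, assuming "interactions" means pairs of users who could potentially interact, shows how fast this outpaces any linearly-growing moderation team:

```python
# Possible pairwise interactions grow as n choose 2 (quadratic),
# while a moderation team can realistically only grow ~linearly.
from math import comb

for users in (100, 10_000, 1_000_000):
    pairs = comb(users, 2)  # number of distinct user pairs
    print(f"{users:>9} users -> {pairs:>15,} possible pairs")
```

Going from 100 users to a million multiplies headcount by 10,000 but multiplies the possible pairings by roughly 100 million.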
It creates an incentive to improve moderation automation and reduce moderation costs. One way to do that is to verify users, which will help with our sock puppet and deliberate misinformation problems.
Sort of tongue in cheek, but the visible effects of moderation are awful in 90% of circumstances across the internet. Granted, we don't see the benefit of the non-visible parts, but on Reddit especially there are endless stories of communities being ruined by moderation, and of moderation being used to force community behavior that suits the moderators' agendas.
There are instances, like HN, where the platform isn't being used to make money or push an agenda. Moderation there tends to be good, but that set of circumstances is rare.