
Facebook can easily afford 100,000 moderators. If they don't want to do what it takes to make their platform safe, they should expect legal intervention.



If you have the ability to moderate and modify posts on your platform, everyone and their mother is going to come after you and try to force you to exercise that power. It looks like Facebook gets to find out what happens when governments try to make you do that.

Why are you implying that Facebook isn't actively moderating their platforms, or that governments would be able to do a better job? Facebook has hired thousands of human moderators in the past few years, significantly more than Google and Twitter. They've also invested heavily in improving their AI detection systems. That doesn't mean there isn't still work to do, but you're insinuating that they aren't actively moderating their platforms, and that simply isn't true. Furthermore, the idea that harsh penalties and regulations are going to prevent psychopaths from broadcasting their killing sprees is ridiculous. Facebook doesn't need an incentive to prevent mass shootings from being broadcast on its platform, and punishing them with monetary fines or regulations would only hurt their ability to do so.

Exactly this. I work in C2C e-commerce which in my jurisdiction has very strict laws around ensuring we do due diligence on the people we allow onto our platform. If we didn't we would be finished. The only way to get Facebook to start taking this more seriously is through more regulation.

Facebook's business model seems to me an especially fragile one. They rely heavily on automated moderation and are seriously understaffed on the human-moderator side.

It is likely that various countries will push Facebook around and force it to somehow respect local laws regarding freedom of speech or the criminalization of hate speech. That requires a lot of human moderators, whose aggregate compensation might sink the company unless it starts charging users.

But charging users turns them into customers and makes them far more assertive against their business partners in disputes. Yet another source of extra costs.


This is a much better approach than telling FB how to stop abuse. Instead it's telling FB what to stop and letting FB (which knows its medium far better than anyone else) develop a mechanism to do so. Politicians and government bureaus would flounder around attempting to interfere in how FB works.

If Facebook gives in to moderation based on Silicon Valley politics, they're done for.

"Facebook is a private platform. It can moderate how it pleases."

Absolutely not; it is highly regulated (and probably should be more so), subject to constitutional concerns, ongoing legislation, and political pressure.


FB would absolutely love imposed regulation.

If Facebook can't moderate properly because it is too expensive, then Facebook can't exist. We can go back to a more plural world of many specialist forums with community moderation. That way the content properly reflects the society that generates it.

I don't think we want Facebook making these decisions for themselves, but without adequate regulatory guidelines, they are in a no-win situation.

Good. They're free to do that as a private company. If Facebook is so important that they shouldn't be allowed to moderate their own platform then designate them a utility.

In addition: You need reasonably trustworthy entities if you want them to self-regulate and self-certify.

Facebook, with its behavior over the years, has lost any benefit of the doubt.

This may not be a popular opinion here, but given their behavior, their evasions, and their lies, it's time for the law to crack down on them, hard.


If the argument is that Facebook is too big to be regulated by anyone, even Facebook itself, then it's time to break it down into smaller, more manageable parts.

To actually impact Facebook and spur positive change they'll need to impose a much higher penalty. Perhaps this is outside the domain of what courts can reasonably enforce, and we need the legislature to take action.

On the one hand, I believe in safe harbor and don't think a company is responsible for policing itself perfectly. There are a lot of hard-to-understand situations across the world, and understanding how people actually use a language is radically harder still. It's terrible, but bad things are going to keep happening, and they aren't solely Facebook's fault. Societies need a realistic expectation of what they can achieve, and need their own stake in how moderation is steered and what moderation happens. This is a lose-lose game, and it's not one society can keep flunking social media on: it has to have its own skin in the game.

On the other hand, a company which cannot be even the faintest bit responsible for its platform has an obligation to allow users their own defenses, and has to promote society getting its own skin in the game. Facebook fails at this miserably. It retains a worthless, undignified pretense of authority (it has its own moderation system), yet it fails, flails, and is perpetually incompetent. Allowing people to form their own defenses on social networks seems a requirement for civilization. We can't and shouldn't allow sole arbiters to decide, particularly not ones that are so incompetent, and it seems unlikely any government would do better. To me, this implies that we need to open the doors, throw wide the possibilities of moderation, and let users devise their own moderation systems and safeguards. Facebook has fallen far short of the bare minimum, but I don't think they should have to figure out how to police every engagement, and I don't see how they could meet that minimum. Rescinding its connectivity to the world does not seem a fit remedy. But its reckless failure clearly signals its abdication, and signals that the platform must be open to other, more active and vigilant forces. The platform needs to open up, for the bare-minimum protection of the world.


Facebook so frequently breaks the law that the US government would be well within its rights to simply seize the entire operation through eminent domain, freeze all assets, and prosecute all of its executives and senior management.

Who gives Facebook the authority to do so?

Let's assume Facebook does invest in policing their platform to the extent necessary to avoid political consequences ("misinformation and disinformation"). There's still the rest of the internet. There are still all the open-source tools available that can be used for good or ill. The problem is much larger than Facebook.

But a private company should enforce all the other things that Facebook has no problem moderating?

Facebook has already said they have no problem enforcing this for non-politicians, after all.

