I don’t see what the argument is that the marketplace of ideas isn’t working.
Facebook doesn’t prevent you from using Twitter, Reddit, or a small new network. All kinds of people can and do engage with multiple networks, big and small, at their own inclination. All of the platforms are accessed on the same internet using the same protocols and clients, and it’s trivial — completely trivial — to find alternate content of all kinds (at least in the US).
People aren’t abandoning the big platforms in droves because they generally like those big platforms. (Probably almost nobody thinks they are great, but that’s not the standard. The standard is better than the alternatives.)
As the article points out, the marketplace of ideas is leading Twitter to more flexible moderation mechanisms that let users choose their own level.
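As a rough sketch of what "choose your own level" moderation could look like in practice (the level names, thresholds, and scoring below are hypothetical, not Twitter's actual mechanism):

```python
# Hypothetical illustration: each post carries a platform-assigned severity score,
# and each user picks their own visibility threshold instead of the platform
# picking one threshold for everyone.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    severity: float  # 0.0 = benign, 1.0 = clearly rule-breaking (platform-assigned)

LEVELS = {"strict": 0.3, "default": 0.6, "permissive": 0.9}  # made-up thresholds

def visible_feed(posts, user_level="default"):
    cutoff = LEVELS[user_level]
    return [p for p in posts if p.severity < cutoff]

feed = [Post("cat photo", 0.05), Post("heated argument", 0.5), Post("harassment", 0.95)]
print([p.text for p in visible_feed(feed, "strict")])      # ['cat photo']
print([p.text for p in visible_feed(feed, "permissive")])  # ['cat photo', 'heated argument']
```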
It seems crazy to change the rules and force companies to break their business models when the system is working fine.
Doctorow (and others) may not like it, but many, many people like Facebook and Twitter a lot more than they dislike them.
Protective regulation should come into play when the affected populace is not in any position to consent to (or decline) the situation. E.g., because alternatives aren’t available, or because the issue requires special expertise to understand, etc. But generally this doesn’t apply here. (I do think there needs to be recourse for people who are banned or otherwise lose access to something they’ve invested in.)
Furthermore, if Facebook or Twitter are so large that they function as the new public space, the answer isn't to prevent them from banning people, but to break them up. The issue isn't a private company determining who gets to be on their platform, but that their platform is effectively a monopoly and monopolies are bad.
These types of social interactions aren’t fungible. There are a finite number of viable social interactions to be discovered. Once discovered, network effects push towards consolidation to one platform offering that experience.
If you consider “social media” as a single market, it has a healthy competitive landscape. If you consider different styles of social interaction as separate markets, each of those markets has been cornered. I don’t see competition within those spaces. Facebook != Twitter, and I feel that is why both can exist. Behemoths in neighboring spaces opt to buy a social experience instead of trying to compete with their own.
The missing piece, IMO, isn’t regulation around “censorship” for these platforms. It’s regulation that results in a rich market of products around a single style of social interaction. Example: regulation around interoperability.
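To make "regulation around interoperability" concrete, here is a minimal sketch of the kind of thing such a rule could mandate: every large platform serving public posts in a shared format at a common endpoint, so a third-party client can merge several platforms into one timeline. The endpoint path and field names are invented for illustration; ActivityPub is an existing protocol in this spirit.

```python
# Sketch: if regulation required large platforms to serve public posts in a shared
# schema, a third-party client could aggregate several platforms into one timeline.
# The endpoint path and field names below are hypothetical, not a real standard.
import json
import urllib.request

COMMON_FEED_PATH = "/.well-known/social-feed.json"  # invented for illustration

def fetch_feed(domain):
    with urllib.request.urlopen(f"https://{domain}{COMMON_FEED_PATH}") as resp:
        return json.load(resp)  # expected: [{"author": ..., "text": ..., "posted_at": ...}, ...]

def merged_timeline(domains):
    posts = [p for d in domains for p in fetch_feed(d)]
    return sorted(posts, key=lambda p: p["posted_at"], reverse=True)

# Usage (would only work if such an endpoint actually existed):
# timeline = merged_timeline(["facebook.com", "twitter.com", "small-new-network.example"])
```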
If the problem with Twitter/Facebook is that they have too much influence (a fair point IMO) and thus need regulation, shouldn't we solve that monopoly problem rather than trying to remove the right of forums to choose what content they host?
In those cases the solution isn’t to force those companies to moderate their content differently, but to prevent them from becoming de-facto monopolies.
The popularity of Twitter, Facebook, Amazon and Google’s core products isn’t the issue. The issue is that any time a successful competitor comes up, they can just buy them out.
Imagine if Facebook wasn’t able to buy Instagram, and it had survived as a competing platform?
There’s no need to apply the concept of “free speech” to private companies. There is every need to regulate monopolies so that a handful of tech giants don’t have the power to effectively suppress content across the majority of the outlets people are using every day.
It’s certainly not true that Facebook has a monopoly on communication (here we are communicating without Facebook involved).
I do have some agreement that when a platform gets very large, it makes sense to have some regulation around the censorship decisions they take. Specifically, rules around transparency make the most sense to me. On the other hand, a blunt regulation such as saying that large tech platforms are not allowed to do any platform or country-wide moderation doesn’t make as much sense to me. My experience with all unmoderated online communities is that they devolve into something extremely unpleasant, and forcing Facebook to push ALL of the moderation work down to specific groups or users themselves is not a workable approach to this problem.
Sometimes I feel that the free market argument doesn't apply to media platforms. This is because the effectiveness of a media platform (or any mass media, for that matter) is largely determined by its market/usage share; in other words, smaller or alternative media platforms cannot be considered a working substitute for the dominant platforms. And since the user distribution of social media tends to follow a power law (i.e. the "can't switch 'cause all my friends are using it and they won't switch for the same reason" phenomenon), it is almost inevitable that a single platform will eventually monopolise the sector if left unregulated. Banning a user from the most popular media channel means they are no longer able to communicate their opinion effectively, even if they are able to choose a less popular alternative.
tl;dr: one can always switch their phone if they don't like Apple because the usefulness of a phone doesn't depend on its popularity; one can't practically switch their social media platform if they are banned because the usefulness of the social media depends on its popularity, and the most popular platform typically dominates due to the power law.
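A toy simulation of that power-law dynamic, assuming new users mostly join whichever platform already has the most users (preferential attachment), with a small chance of trying something else. The numbers are illustrative only, but the skew they produce is the point:

```python
# Toy preferential-attachment model: each new user picks a platform with probability
# proportional to how many users it already has, plus a small chance of exploring.
import random

def simulate(platforms=5, new_users=100_000, explore=0.01, seed=42):
    random.seed(seed)
    sizes = [1] * platforms  # every platform starts with a single user
    for _ in range(new_users):
        if random.random() < explore:
            choice = random.randrange(platforms)  # tries a random platform
        else:
            choice = random.choices(range(platforms), weights=sizes)[0]  # follows the crowd
        sizes[choice] += 1
    return sorted(sizes, reverse=True)

# The resulting sizes are highly unequal even though all platforms started identical.
print(simulate())
```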
I think this is a very compelling point, and much better than most of the "fediverse is for freedom" talking points I have seen on HN:
> I have participated in many a public forum on Internet governance, and whenever anyone pointed out that social platforms like Facebook need to do more as far as content moderation is concerned, Facebook would complain that it’s difficult in their huge network, since regulation and cultures are so different across the world.
> They’re not wrong! But while their goal was to stifle further regulation, they were in fact making a very good argument for decentralisation.
> After all the very reason they are in this “difficult position” is their business decision to insist on providing centrally-controlled global social media platforms, trying to push the round peg of a myriad of cultures into a square hole of a single moderation policy.
I think if there was a more robust social media market instead of a Facebook/Twitter duopoly, we would have seen at least _some_ platforms move to restrict things sooner, and maybe even a majority of platforms. That would have kept us from hitting a moment so fraught that the President ended up banned from all social media overnight.
So then it potentially becomes impossible to create a startup that replaces existing social networks, due to the moderation requirements. Facebook and Twitter got lucky they didn't get killed this way in their early days. This is similar to the criticism of the EU's automated copyright violation detection requirements.
If moderation laws essentially make it impossible to build new mass platforms, then that creates a very strong case for a right to unrestricted open access to the existing platforms, does it not? For example, the networks then wouldn't be able to issue lifetime bans at their discretion.
Isn't the free-market answer to this problem for users to move to other social media platforms that moderate in a way they prefer? The problem here is how powerful and walled-off Facebook and Twitter are, making competing difficult if not impossible. Perhaps if we solve that problem, everyone can get what they want.
I think saying that, on principle, companies should not moderate content at all is equally absurd as it would allow malware, CP, abusive content, and spam to run rampant. All we're really arguing about here is to what extent do we want these platforms to moderate content. Should they be limited to only removing illegal content? What's the line on "illegal" (no company could afford to consult lawyers for every post they remove)? What about spam, which is not necessarily illegal but disruptive to the service?
Absolutely there is a problem. That was my point. If we fix the monopolization of these industries, we solve the moderation issue as well via the free market.
Obviously solving the kind of monopolies created by social networks is hard. The best proposals I've heard is forcing them to open up their social graphs/APIs to competitors, but that's not without its own issues (e.g. bad actors siphoning off user data, like Cambridge Analytica).
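A sketch of what a mandated social-graph export might look like, and why the bad-actor concern follows directly: every contact in the export is someone else's data, so the export has to be gated on their consent, not just the exporter's. The format and consent flag here are invented for illustration.

```python
# Hypothetical portable-graph export: a user asks their current platform for their
# follow list in a neutral format, then hands it to a competitor to rebuild their feed.
# The privacy tension shows up directly: contacts who have not consented to
# portability are excluded from the export.
from dataclasses import dataclass, field

@dataclass
class User:
    handle: str
    follows: set = field(default_factory=set)
    allows_export: bool = True  # stand-in for a per-user consent flag

def export_graph(user, directory):
    """Return the user's follow list, limited to contacts who consented to portability."""
    return {
        "handle": user.handle,
        "follows": sorted(h for h in user.follows if directory[h].allows_export),
    }

directory = {
    "alice": User("alice", {"bob", "carol"}),
    "bob": User("bob", allows_export=False),   # bob opted out of being exported
    "carol": User("carol"),
}
print(export_graph(directory["alice"], directory))  # {'handle': 'alice', 'follows': ['carol']}
```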
I am very anti-social-media-regulation. Partly for the reasons you mention and partly because I see greater regulation balkanizing the internet and driving us increasingly farther from the promise of an egalitarian open internet.
As for alternatives, I think we just need people to collectively decide that some other platform (ideally a decentralized one) is better than the incumbent. Facebook depends on its inertia. Suppose every Facebook user went cold turkey and switched to something else instead (let's say Mastodon for the sake of argument). In a year, nobody would be talking about Facebook's monopoly.
Where I think things get sticky right now, though, and I'll even say -the- reason we haven't seen innovation in social media, is that incumbents on the scale of Facebook have the capital sufficient to either buy or sue any plausible competition into the ground before the competition has a chance at taking their market share. Imagine a world where Facebook had been blocked from burying Instagram and WhatsApp with money!
I think I would be in favor of greater regulation against these winner-takes-all tactics on a more economic level, although exactly how that regulation would work in a way that was both fair and non-trivial to evade I don't know.
You are of course correct in a grand sense, but there's this weird gray area philosophically where someone builds a town square and says "everyone come and talk in my square"...until they hear what you are speaking about!
The problem is likely greater in our corporate/mercantilist system where some state-sponsored and some natural monopolies develop. Totally get and appreciate the private property of the corporation argument...it is their town square after all...but when there is only one that is 'allowed' to succeed, what happens?
This then leads to the specific tactical rules we have on the books in the US, like FTC and even FCC regulations, under which some may protest unfair commerce.
IANAL but I can see this is a real mess. Also, what is the ROI for Twitter when it comes to policing this? We know FB has been having problems.
While I agree that Net Neutrality is necessary, the argument given in this EFF article seems to contradict itself with regard to FB, Twitter, et al. The article notes that social networking sites have been used by people to coordinate in order to get their viewpoints heard. But then it says:
"What does this have to do with net neutrality? Simple: all of these services depend the existence of open communications protocols that let us innovate without having to ask permission from any company or government."
But anyone who uses FB or Twitter does have to "ask permission" from those companies: the companies own the sites, not the users. And the idea that FB, Twitter, et al are based on "open communications protocols" is obviously wrong.
If the argument is that FB, Twitter, et al should have to use open communications protocols, let anyone use them without asking permission, etc., then that is an argument for ending FB, Twitter, et al as private companies and making them public utilities run by the government. I'm not sure that's a good solution. It seems to me that a much better idea would be to encourage competition in free speech platforms on the Internet, so that people do not have to depend on FB, Twitter, et al to coordinate and get their viewpoints heard.
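To make the "open communications protocols" distinction concrete: anyone can read an RSS feed or send an email using nothing but public standards, while reading or posting through Facebook's or Twitter's APIs requires credentials the company can revoke. A minimal sketch, using RSS as the example of an open protocol:

```python
# Open protocol: fetching and parsing an RSS feed needs no API key, no account,
# and no permission from anyone -- just HTTP and XML, which are public standards.
import urllib.request
import xml.etree.ElementTree as ET

def rss_titles(feed_url):
    with urllib.request.urlopen(feed_url) as resp:
        tree = ET.parse(resp)
    return [item.findtext("title") for item in tree.iter("item")]

# Example: any publicly available RSS feed works here.
# print(rss_titles("https://hnrss.org/frontpage"))
#
# Contrast: reading or posting via Facebook's or Twitter's APIs requires registering
# an application and being granted credentials, which the company can revoke at will.
```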
I don't know - the only real counterargument I've seen is that Facebook and Twitter are private entities, so they should be able to operate however they like. I could use very similar reasoning to argue that the government has no mandate to break up monopolistic businesses, either.
As someone who doesn't use Facebook, Twitter, or their subsidiaries, the debate is kinda laughable. Sure break them up for antitrust. Write privacy regulation. But regulating the content shared there? If you don't like it, it's so easy to opt out, and it feels great. People write like these are utilities necessary for a good life, which sounds crazy from the outside.
Given the concentration of user-generated content on these platforms, market regulation is needed. Twitter/Facebook have a monopoly by virtue of the "moat": once someone generates their own content, it's only accessible through one platform - that is a monopoly. This market needs innovation and competition. Don't break them up; just force them to compete by removing the moat.
You’re getting hung up on the analogy. Whatever things like Twitter and Facebook are, they seem to be distinctly new phenomena that have a critical role to play in upholding the principles of freedom of speech that underpin things like liberal democracy, given their scale and the network effects of that scale. It’s important to recognize this for what it is and incorporate it into society intentionally to maximize the net benefits to humanity. The current approach, where the marketplace of ideas seems to be tightening given the emergent behavior of these companies (and the public pressure to do so), implies to me we need a better regulatory regime or technology stack if we want to ensure liberalism continues to prosper.
> If you get big enough, either you get broken up, or you have to become a regulated monopoly.
Some products cannot be easily broken up without disrupting the actual product. For instance, Facebook could not be broken up since the value of the product is in large part due to having one significantly sized and unified user base (network effects). But these companies can't have it both ways - they can't claim that they operate in a competitive environment with few barriers for new competition while also claiming that splitting up their user base would destroy their unique product offering.
We also need to be wary of market share arguments, especially given that these companies largely operate in the Bay Area and reflect its values/political culture/etc. This is why we regularly see them enact censorship in lock-step. Even if several companies operate with less-than-majority market share, they can behave as a cartel. That's why we shouldn't treat Twitter and Facebook and Tik Tok as alternatives to each other.
A better alternative might be to simply envision and implement new regulations based on minimum user bases. If your user base is larger than X (to be defined) then you are subject to regulations. Some suggestions could be user bases larger than the [smallest or largest] state by population. A social media platform that has more influence and power than a state government seems like a reasonable target for regulation.
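Back-of-the-envelope numbers for those thresholds, using rough circa-2020 figures (Wyoming at about 580k people, California at about 39.5 million, and the big platforms reporting monthly user counts in the hundreds of millions to billions). The figures below are approximate:

```python
# Rough comparison of the proposed state-population thresholds against platform sizes.
# Population and user figures are approximate, circa 2020, in millions.
SMALLEST_STATE = 0.58      # Wyoming, ~580k people
LARGEST_STATE = 39.5       # California
PLATFORM_MAU = {           # monthly active users, very rough estimates
    "Facebook": 2800,
    "YouTube": 2300,
    "Instagram": 1300,
    "Twitter": 350,
}

for name, mau in PLATFORM_MAU.items():
    print(f"{name}: {mau / SMALLEST_STATE:,.0f}x the smallest state, "
          f"{mau / LARGEST_STATE:,.1f}x the largest state")
```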