It works the opposite way too, wherein platforms lose users due to undesired changes. YouTube's banning/striking/demonetization of high-profile channels, something that Joe often says he's afraid of, could be seen as an undesirable platform change.
Hm, I don't know. The only people I see complaining about 'losing trust' in the platforms are the very people who would be posting harmful content on those platforms.
Nobody should place all their eggs in one basket. It is unfortunate that a lot of these platforms have become the only source of revenue for a lot of people. The moment you are that tied to a revenue stream, any change in policy that restricts the content you produce becomes a death blow. People on YouTube are living the 'gig economy' from a different angle.
> it reduced circulation on the platform as a whole
So it also doesn't take into account the potential long-term toxicity of driving everyone with certain views to other platforms, where those views will be reinforced, and where the appeal of the "taboo" helps market those platforms.
Is there any example of this ever happening? A major platform losing a non-trivial proportion of their users for not censoring them?
The network effect is strong. That's the problem to begin with. If people quit that easily, there would be a lot more platforms and competition, and we wouldn't be having this discussion.
I think there is a general issue with platforms “conveniently benefiting” from massive growth from all their users, until one day they suddenly decide who/what they want to “allow”. This exclusion can take different forms, e.g. exclusion by suddenly imposing high subscription fees (not just “moral” exclusion).
It’d be really interesting to see if these platforms would ever have gotten so big/popular without their now-excluded audiences.
I'd say that this is about the same as being de-platformed. You could argue that it also hurts your reputation, but that's usually hard to do since you no longer have a platform to say so on.
The problem with alternative platforms is that they are often first used by people who have been banned by mainstream platforms.
It is sad that we have come to the point where people "will not touch" a platform because there are others on it with whom they disagree, or whom they even find offensive.
It is easy to defend the speech of people you agree with; it takes someone of strong ethical convictions to understand that in order to actually have free speech, the speech we find offensive must also be permitted.
Much of the original content that made YT popular would be banned under the terms YT operates under today. If you apply these conditions to potential replacements for YT, then no replacement is possible.
Perhaps they figured that having X number of people dislike you was better than having those same people never know who you are. I still think the whole "dying platform" thing was taken a bit far. The implication when I read it was that some of these platforms were dying platforms, not all of them.
Eh that's a bit of a different mechanism. If you bootstrap a platform from 0 users with the core concept being a lack of moderation, then you're likely to attract those who have been banned, excluded, or otherwise ostracized from existing platforms. This happens because those who support lack of moderation but still have a choice to remain are likely to remain on the existing platforms due to network effects.
Whereas if you start with the popular platform and progressively remove moderation, you end up with different effects, because you still have the core, non-bad-actor population. That is, if your signal-to-noise ratio goes down, but the absolute amount of interaction with your platform increases, it may still be worth it.
You are talking about people with mental illness. I think the platforms produce mental illness. If you influence and produce a certain kind of behaviour through appropriate reward mechanisms, and then after a few years of conditioning reduce or remove the reward, what do you think happens? Mental illness.
YouTube has conditioned a generation of people to chant "subscribe, like, and share" like robots. The day the "subscribe, like, and share" model stops being viable, the robots will break down.
For those who care about doing something, here is a starting point: humanetech.com
De-platforming simply encourages new platforms to be built or adopted by the de-platformed, which in a way makes them stronger. It is short-sighted and long-term self-defeating. But hey, censors gonna censor.
The issue is that then a lot of very toxic people will go to that platform and there will be a lot of very toxic content. Then all the people who aren't toxic will leave the platform because they don't want to be subjected to that content. Then advertisers will leave because they don't want their brands associated with that content. Then app stores will drop the app because they don't want their brands associated with that content. Then the platform essentially dies or stagnates.
The experience of people getting their businesses fucked by platforms once they gain traction means that fewer people will try again with those new ones.
There's a great video that goes through this phenomenon (in the context of YouTube vs VidMe) [0]. It points out exactly that the people who adopt a new platform tend to be the people most toxic for the original. I have no idea how you get around this.
People will leave certain platforms if they become unpalatable. Look at the amount of outrage generated by the very small level of tampering that Facebook and Twitter do currently. There is a tremendous amount of scrutiny on these platforms.
There is just the problem that you spend a lot of time educating the people who are then kicked off the platform. It's not as if they disappear; they go to your competitors and improve their ecosystems. Plus, you are going to inspire some serious hate, which will limit those who sign up to overconfident assholes (who can't imagine that they would ever be kicked out) and people who need all the help they can get.
So you aren't really going to have a great ecosystem.
I think those days are sadly long past. As long as the idea pervades that opinions we don't agree with are "dangerous", I can't see platform providers not being taken to task to remove undesirables.