> If that's true, then is it not better the tweet stays up and is challenged?
I used to agree with this line of thinking, but I believe the past dozen years or so of the information age have made it abundantly clear that simply having the truth on your side and making a rational case for it is not even remotely sufficient to counter the spread of misinformation.
Honestly, I'm not sure that free speech as a concept will be able to survive the information age. It seems as though any society holding it sacrosanct might be doomed to collapse from divisive pressure both internal and external. I hope I'm wrong about that, but only time will tell, and interesting[0] times they will be.
[0] In the apocryphal Chinese curse sense of the word, if that wasn't obvious.
I used to have a very similar opinion to yours until I found out the truth rarely fights for itself if there's an alternative, much more convenient lie, preferably one that plays into people's confirmation bias or makes someone richer.
We've got an asymmetry here that never existed before in human history. Any kind of large-scale public address (newspapers, TV, books) had some controlling mechanism and institutions that vouched for quality with their reputation: editors, printers, TV stations.
All legislation, including the First Amendment (which, just picking nits here, is a US-only law and probably too simple to slap onto a global content platform like YouTube), was written with these conventions in mind.
YouTube goes out there and hands a big fat mic to everyone who wants to be heard, and then amplifies that voice to believers via recommendations.
I used to think the Internet would be a net benefit to the world. After witnessing Cambridge Analytica, the "Internet Research Agency", large-scale online harassment, and anti-vaxxers, I'm not so sure anymore.
I don't claim to know what the answer is.
I just claim I'm pretty sure it's not the unbounded freedom of anonymous and often highly maliciously motivated actors to spread their message without oversight.
> Also, disinformation campaigns in the "West" have always existed. It's not anything new.
True, but what is new is that disinformation campaigns can utilize popular online platforms to communicate directly with susceptible individuals on the scale of billions of people. That has never existed before and private companies are clearly struggling to address it.
> Free speech is what prevents disinformation to have total control.
Unfortunately, disinformation doesn't need total control to be able to hijack a democracy, it just needs to be effective enough to keep the electorate wanting to support the officials that disinformation supports.
To be clear, I'm very much on the fence about how we should move forward on this issue. But we now clearly live in a world that no longer has a commonly shared truth. One's perception of truth, belief in a leadership style, or even in the existence of criminal activity depends largely on one's political background.
>I suspect a similar story can be told about our slide to defaulting that free speech is bad, that the status quo should be the priority, and that perfect is preferable to good. These are mistakes, even as they are understandable. After all, misinformation is a bad thing, change is uncertain, and no one wants to be the one that screwed up. Everyone has good intentions; the mistake is in valuing intentions over outcomes.
This, I think, is the flaw in this argument. People do not always have good intentions, and they are polluting the informational landscape to the point where it's dramatically impacting society.
The principles he's asking for are indeed happening right now, but in a nefarious way:
"Free speech is a good thing" - Sure, but that means 4chan, QAnon, anti-vaxxers, and the like also get loud, free speech now.
"Status quo is bad" - The previous status quo of large, centralized media groups is gone, and with it basic levels of journalistic ethics and transparency.
"Move fast, value experimentation" - Changes are happening to the way information is produced disseminated much faster than societal constructs can keep up. I didn't have to worry 20 years ago about my relatives treating "Hillary Clinton runs a sex dungeon in a pizza shop" FB memes with the same weight as NBC Nightly News.
Fixing these issues, whether by government, tech industry, or otherwise, will involve some level of "authoritarianism".
Fair enough, but that is also confounding, maybe just as much: technology didn't by itself cause more freedoms or liberties. Individual rights and equality are also the result of striving towards the ideal of freedom and liberty.
>You simply cannot compare our current problem of echo chambers and fake news to the 1930s problem of nationalist propaganda or pre-renaissance problem of the church controlling which information is heresy.
That's a very premature conclusion. We're in the middle of a shift in the information landscape and we can't really predict which ending this will lead us to. We may very well be on the path to another dark era in the history of humanity rivalling fascism or the inquisition.
That's because, in itself, information and the flow of information (as dictated by contemporary technology) is just a force multiplier. It allows us to organise more effectively, removing barriers to action. What we do with it is another matter.
Just because we have more of it or do it better doesn't mean it is an unequivocally positive force. Looking at the current trajectory of the use of data, there is the potential for it to be an even more destructive force than your examples.
>> If this is the endgame, it's good that computers exist. Without Twitter, the public might believe absurd lies. Nowadays, the lies would have to be pretty nuanced to have an effect.
I completely disagree. Just look at the presidential candidates: someone who is regarded as a serial liar, someone who wants to give everything for free yet is ridiculed for not understanding economics, a billionaire who takes opposite sides of issues even on the same day, a brain surgeon who is forgotten in the polls, etc.
People believe many things even without a single shred of proof. "Crowdsourcing of truth" should be the meme of Twitter.
> Instead, I think that regular people’s writings on the Internet is hurting the world on a bigger scale. And the collective sentiment is often manipulated by some “agitators” that are exploiting anonymous online speech for their own agendas: that includes online militias–for example sponsored by foreign governments–whose goal is to destabilize a society.
People seem to think that censoring ideas makes them go away. Nothing could be further from the truth. It just moves them elsewhere. If people are motivated, they will find a way to get their message out there.
> In the last few years, completely unregulated online speech has given rise to fake news and conspiracy theories that have actually killed people. It’s offered a megaphone to those promoting dangerous ideas like white supremacy, Islamophobia, anti-Semitism, homophobia and other anti-LGBTQ positions, and sometimes outright Nazism. It has tilted many democracies towards right-wing populism and fascism.
Anti-Semitism has been around for hundreds (if not thousands) of years. Nazism was most prevalent from the 1930s to the mid-1940s. Homophobia and anti-LGBTQ views have existed since the beginning of time, I would wager. The last time I checked, that was well before the internet was around.
As for the move towards populism (by the way, there is left-wing populism as well) and fascism: it happens because politicians don't address valid concerns around thorny subjects like immigration, whereas people outside the Overton window will talk about those subjects. This isn't a problem with free speech; it is a problem with these issues not being discussed and addressed in a serious manner by those in power.
As for propaganda (which is what is meant by fake news), it has been around since pen was first put to paper.
> Second, while almost everyone in the communities supporting a distributed web are good people, with good intentions, seeing some names in there is concerning to me. Regarding IPFS, advocates (at least for a while) included people like Nick Lim of BitMitigate and VanwaNet, companies responsible for rescuing, among others, pro-nazi website The Daily Stormer and the platform 8chan, a cesspool full of Nazi propaganda, child pornography, and other hate speech.
This is the same old crap from governments and traditional news outlets used to justify spying on their citizens and other privacy violations, which I am sure many people on here are opposed to.
> Gatherings on 8chan have been blamed for at least three mass shootings in 2019 alone, including the one in the mosque in Christchurch, all of them motivated by racial hatred.
The same can be said about more mainstream websites and communication services. IIRC the Christchurch shooter streamed the shooting on facebook.
> The first real examples of the distributed web aren’t particularly encouraging either. Among some of the most popular apps (“popular” in relative terms, of course) for the distributed web is DTube, a sort of YouTube that is built on top of IPFS. As you can expect, the website is full of questionable content, including conspiracy theories, cryptocurrency scams, weapons, RT International’s Russian propaganda… and of course, porn.
Not porn! Porn has never got onto the web before, we better stop this right now. /sarcasm
People used to make the same criticism of the large, successful services we have today, like YouTube, Reddit, and Vimeo, by saying the content was trite or dangerous. They said the same thing about novels in the 18th century.
-----------
Ultimately his lamentations will be meaningless. Someone will pick up where he left off and continue the effort.
>I think the onus is on those who would silence debate to prove the immediate harm they are advocating isn't the worse alternative.
We've always relied on editorial control for the most part in all of our mediums to make sure that the information being disseminated is reasonably accurate and fit for public consumption. It's not a fascist idea and has absolutely nothing to do with fascism or any type of propaganda for any political party. There's no need to be so absurdly hyperbolic over what is quite frankly, a common sense mechanism when dealing with the proliferation of ideas.
We know that disinformation spreads faster than the truth on many newer internet media platforms these days. The onus is on the people who would so eagerly disregard time-tested, common-sense filters to prove that the danger and harm currently being caused by this new wave of disinformation in every single subject matter will be worth it now, tomorrow, and for the foreseeable future. Take the following:
* Vaccines and health
* Environment
* Economics
* General politics
In which of these subjects are the ideals you're espousing helping us? Because in each of them I can point you towards real-life, damaging consequences that have come about because of unfettered Bad (TM) ideas spread over modern mediums.
This isn't a theoretical problem. It's happening right now, today.
> What we saw today was precipitated by mass deplatforming and censorship
Nah, social media was the hub of misinformation and was abused and here we are. What they eventually did was too little and too late. There was a lot of fake news on FB, Twitter, YouTube, Reddit etc. that radicalized all these folks. Platforms like YouTube amplified it because it resulted in more views.
>When you take away the ability of people to participate in peaceful political discourse, you push them towards violence
Nah, the violence is mostly because they believe the election was stolen, based on fake news originating from 4chan and then spread on social media and by high-level politicians.
> You can't make people's beliefs go away by silencing them
But you can restrict the spread. No one objects when phishing scams and fraud targeting the elderly and others are blocked by social media and email platforms, why is there an objection when bullshit qanon and 5g conspiracies are blocked?
There was hardly any blocking on social media till about 2017, and things got way worse with these folks as the right wing and alt right abused the platforms. And now folks like you claim things will somehow get better if fake news is allowed to be spread unrestricted? I don't buy it one bit.
> I don't mind the flat-earthers and the climate-skeptics and the crazy UFO tribe as long as I can hear different voices.
Will you still not mind when all the different voices are saying things just as false as those?
The fundamental problem is that generating false information is easier, faster, and cheaper than generating true information.
It used to be expensive to publish information once you generated it and the channels that could spread it both widely and quickly were particularly expensive and there weren't many of them. The more affordable channels were much slower.
It also used to be that long distance person to person communication was expensive. Most social communication was with people in your area.
Both of these did a good job of countering the ease, speed, and cost advantage generating misinformation has over generating accurate information.
Now neither of them holds. False information can easily completely swamp true information for people who are getting most of their information from their social media feeds, and that is a lot of people.
> the people most susceptible to lies and propaganda are willingly seeking it out. That’s always been a problem but social media has made it orders of magnitude easier to discover.
What you are basically saying is: democracy (which is impossible without free speech) can no longer work once social media is invented and deployed, because some people are susceptible to lies and propaganda. I rather doubt the truth of such a radical statement. Democracy is not an ideal thing, but look at what happens with the alternatives: some wise men decide what is better for those susceptible people (and for everyone else too).
> I hear this too-much-free-speech argument a lot these days, but I can’t get used to it.
That has been my feeling for the last decade. When I grew up, there was all this talk about how the Internet would be a truly free medium of information exchange, and what immense benefits this would bring to humanity. Usually there are generations between a new thing appearing and its effects being seen, and one can only read in books what one's ancestors thought about the new thing and marvel at how wrong they were. Now we live in the happy time where it takes mere decades.
And so, 30 years on, the Internet did change the world. And it didn't. We got the awesome freedom of information, and we got the usual suspects gnawing at it from all directions, and, unlike what early enthusiasts predicted, they have largely succeeded. Some expected that. What they, and I, didn't expect at all is that the formerly "progressive" movement, academia, and large segments of highly educated and cultured people would join the most oppressive governments in calling to rein in free speech, restrict the free flow of information, and deny people with non-mainstream opinions access to free speech, and in threatening with vicious harassment and prosecution anybody who dares (or ever dared) to voice a controversial opinion.
I remember when the freedom-of-speech debate was about "should we allow (actual) Nazis to express themselves publicly" and the answer was "we don't like them, but yes". Now the debate is "should we allow people who support a mainstream political candidate to listen to what she has to say within 10 miles of where we are" and the answer seems to be, again and again, "we don't like them, so no". And it's even worse on the Internet, because it's not 10 miles, it's everywhere. What happened to all those free speech supporters? Did they get old and die? Did they only support free speech because they didn't think anybody would say anything they don't like? I can't believe they were so stupid as to think that. So what happened?
> Nobody - no group, movement, person, or authority owns the truth.
Putting truth itself on a pedestal doesn't really address the problem.
The problem is that people are motivated by things other than a good standard of truth most of the time. Social media provides a medium where it is easy for emotion to rule over reason, and advertisers profit from it.
> If people use the internet to say the most outrageous things then so be it.
Let's say everyone online starts calling you a pedophile because a soon-to-be-ex-wife of yours decided to try to cause you harm. What then? Is posting your address next also free speech?
The issue is not the truth, or access to it, but the fact that we have large numbers of people who act on things less than the truth, and a culture/economy that seems to perpetuate that for those who aren't interested. Putting these people in the same network as people who really want to talk about and debate ideas isn't working.
There are standards of truth, but people who aren't ruled by reason won't obey them. They'll go with their gut, or what their friends say, or what they think they should say to impress their peers, or what everyone else is doing. For people who over their lives have been concerned more with survival or overcoming trauma than with a real exchange of knowledge or ideas, these strategies often work well.
> Couldn't someone have said the same thing of the printing press in the year 1440?
Yes, and they did. Society went through a period of disruption and experimentation and came out on the other side with new institutions and cultural norms.
The same is happening here. That suggests what's considered acceptable for pamphleteers and public speakers may not be acceptable on an auto-curating micro-targeting real-time nonlocal platform like Facebook.
There was a balance between free speech and the common interest, and it's been disrupted. Holding on to policies and norms from the era of mass broadcasting as absolutes in the era of social media is delusional.
> Of course, this can’t go on forever. Eventually people abandon those polluted streams. That’s what will happen after the Age of Information crashes.
Strong claim with a stain of nuisance.
I would love to call it BS, because the mass internet is still relatively young and our society still has momentum in taming this medium.
On the other hand, religion is much older than any medium and is still alive and kicking, and the enormous amplification the internet provides for manipulating useful idiots into preserving the status quo for our grand profiteers is beyond question.
The internet is useful, and not just for our information/nonsense/tribe-addicted brains; I think we won't overcome it either way.
>This is problematic in ways that used to be obvious to people in free societies, but for some reason seems lost now.
I don't think this is true. I think some people always got it, and some people still do. The difference is that the internet allows anyone to post their opinions, but it used to be a lot harder to reach other people.
We still have people that we hold up on pedestals for saying the right things, and the people that we remember from the past are the ones that said things that were incredibly right, or incredibly wrong. We don't remember what every common Joe used to say daily.
>It seems like we're moving in the opposite direction these days
Good.
>with the internet as a means to control, oppress, and indoctrinate.
No, we're just not allowing "information anarchy." Not all information is beneficial, useful, or constructive. Some of it has net negatives.
Stop assuming there are enough rational/good actors to counteract the bad/ignorant actors who propagate bad information. We don't have the public education infrastructure in a good enough place to assume that. So stop assuming it, because the data is telling you you're wrong. We're seeing the opposite of what everyone claimed would happen if everyone had access to as much information from as many different sources as possible. And we've just tiptoed into this era and we're already seeing bad effects.
The internet of yesteryear (and I mean decades ago) was largely populated by well-educated people. Most of them were educated in a fairly structured environment and knew how to weigh credible sources and credible information against information that was most likely incorrect or not credible. That's not indoctrination. That's not "oppressive." The exact opposite of the paradigm you want for the internet created a populace that made the internet a great place to have civil, intellectual, and life-altering conversations. Someone saying something absurd/incorrect/dangerous never could go far. It was always beaten back or ignored, left to die because there wasn't an audience for those ideas.
Now you can present almost any idea, no matter how ridiculous, harmful, or "bad" it may be, and there will most likely be an audience. And that's a problem. The idea that we can just let ideas fly off the wall and they'll slowly get filtered out is not tenable, and every ounce of data is loudly saying this perspective is not true.
You might be well educated enough, or even just have the intelligence, to discuss certain topics in an intellectually curious manner from a distance. But others, a lot of others, are going to take them at face value and run with them.
It's information anarchy. Anarchy always results in new power structures that are less suitable to civil existence.
>The current situation in the US is challenging. Misinformation, disinformation, and conspiracy theories are rampant in every cohort of society.
Most of the higher-quality longitudinal evidence I've seen does not show that there is an increase in believing in conspiracy theories or misinformation (or, as we used to call it, people simply being incorrect). It's simply much more visible now due to the Internet being widely adopted, whereas in the past you'd never know what people across the world (or even your own city) may have been thinking about.
Since there is no stable nor global solution to the problem of deciding who gets to label/regulate information, nor any evidence showing that even with the regulation of information, people actually change their beliefs to the 'correct' ones, I don't personally see merit in this path. The temptation to find a solution is sensible prima facie, but history and human psychology show us that this is a fool's errand.
> My point is society has already gained all value that can be extracted from free knowledge and information.
This doesn't really follow. Just because the information is available doesn't necessarily mean society will instantly absorb it or trust it.
Right now we're in the early stages of the information boom, where there's a lot of information floating around, but it's difficult to validate that information and determine whether or not it's accurate. Hopefully in the coming decades we as a society will figure out better means of such validation and start to actually address the problems around "fake news" and other sources of misinformation.
> A free and open internet also allows propaganda and hate to flourish
I disagree. Sure there will always be some amount of misinformation on the internet, but a fully state-controlled media and internet (see - China) is far, far more effective at spreading propaganda and controlling the narrative.