Hacker News

Wait, they're claiming that Youtube recommends a lot of conspiracy bullshit, which is pretty well documented. You're pointing out that Youtube bans one specific narrow type of conspiracy bullshit, which is true but irrelevant. I don't see how that makes their statements questionable.



What if I want to watch conspiracy theories and I want them to be recommended to me? How does YouTube decide what counts as a conspiracy anyway? And how do they decide that I don't want to watch them?

It's pretty straightforward: conspiracy videos generate a lot of views, which is ultimately profitable for youtube. Videos which point out that the conspiracy videos are garbage have the potential to adversely impact their astronomical view counts and by proxy their profitability. So of course youtube would ban accounts which attempt to undermine their revenue streams.

It's despicable, sure, but not at all surprising.


A few thoughts on this.

Ultimately, whose fault is it that a user watches a "conspiracy" video? The article seems to consider the user base as mindless people with their mouths open to be spoon-fed content. Do the viewers have no agency?

In that thread, why is the solution to curtail "conspiracy" videos? What defines a conspiracy theory? I mean, we all seem to agree here about vaccines and moon landing, but what's to stop YouTube from labeling a political party as "conspiracy" and filtering them out of existence? Can't the users choose for themselves, or are they too "stupid" to pick what's best for them?

Yes, YouTube is free to filter as they see fit... but if they want a recommendation engine, AND they want to filter out "bad things", it's a very valid question to ask who decides what is bad. I believe it is essential for YouTube to be transparent about what their algorithms are designed to filter out, to keep this scenario from happening.

Part of the solution that no one seems to be talking about is to caution others against passive consumption of media. YouTube et al want users to plug their ears, close their eyes, tilt their heads back, and consume. YouTube can condition their streams to be as "healthy" as they want, for whatever definition of "healthy" YouTube chooses--but there is no way to healthily allow yourself to be fed a diet of things you don't pick, even if it looks good.

"Forcing" YouTube to do this will simply result in the lame, ineffective warnings you see at casinos. "Remember! Mindlessly consume responsibly! If you need help, uh... throw away your computer I guess!" This needs to be done on a human level, thinkers to parents, parents to children. Not a legislative level.


The audacity of YouTube to go after conspiracy theories while actively engaged in a conspiracy:

https://www.wsj.com/articles/youtube-hiring-for-some-positio...


Well, first off, YouTube is a private corporation, not subject to free-speech laws.

Second, while I agree that YouTube shouldn't ban people for believing in conspiratorial nonsense, even if I do think it's harmful, it's not unreasonable that their algorithm should favor things that are fact-based, or at least things that don't contribute to a toxic platform. It's hard to argue that any good came from Alex Jones posting the addresses of the victims of the Sandy Hook shooting, and it doesn't create a fascist dystopia when YouTube doesn't want to promote something like that.

Also, I wouldn't be 100% sure that all the flat-earthers are trolling. There's an overwhelming number of stupid people in the world, and while I will say that most of them are probably messing around, at least some are serious. At some level, I think it's far too easy for people to say "I'm just trolling" after the fact.


When "stupid" conspiracy theories lead to real-world violence committed by their believers, they're harmful. And there's pretty clear evidence that YouTube's recommendation algorithm, in optimizing for time spent watching YouTube videos, has as a side effect optimized for sending people to the most extreme stuff on the site. YouTube has an ethical responsibility to, y'know, do something about that.

The reason they banned it on YouTube is that the evidence was on YouTube. You need to re-evaluate your assumptions.

YouTube is offering a convenient service subsidized by Google's big pockets; they are not some arbiter of truth (but are entitled to their opinions). There are already plenty of podcasts in their archives that directly contradicted WHO guidelines at the time they were made. I don't even think they are a major source of hoaxes/conspiracies. This is actually a bad look for YouTube.

The recommendations determine what people see. People in this thread are asking YouTube to remove content from their feed that does not conform to their worldview. The problem is that the worldview many young people have is highly dictated by the government, and any serious accusation toward the government is labeled as conspiracy. Conspiracy is automatically associated with insanity. The reason I use an extreme word like brainwashing is that many people have been given such a strong bias that it is impossible for them to accept the possibility of serious criminal or immoral activities by their government.

No; rather YouTube has taken it upon themselves to decide what the truth is and censor views they believe are wrong. e.g. see this news article: https://www.androidheadlines.com/2020/04/youtube-ceo-coronav...

--- (Snippet from the article) ---

YouTube's response is now to simply remove videos containing misinformation while previous policies have seen most related content demonetized. Examples provided by the executive include videos claiming that people can be cured by taking vitamin C or turmeric. Neither has been proven to act as a cure according to the wider health community.

Another example of prominent videos that are being removed, she continues, are those related to 5G as an underlying cause. The policy changes, like the rise of those conspiracy theories, have had to be rapid. As a result, for the time being, videos that contain claims in direct opposition to information provided by the WHO will be removed as well.

YouTube hopes that by removing conspiracy theories and misinformation, it can help keep users better informed.

----

While it may squash some of the stupider and more dangerous ideas floating around right now, it tosses the baby out with the bathwater and harms important discussion about whether those in charge right now do actually have their information right.


No, you're allowed to talk about whatever, but people think YouTube has a responsibility to stop the spread of things that have been shown to be false and for which there is zero evidence despite lots of searching.

I don't believe that. They could have restricted this to problematic videos that promote conspiracy theories and the like.

This reflexive “YouTube can ban whatever they want” is just a way for people to stop thinking about the issue to avoid cognitive dissonance. Aside from a few hard-headed libertarians, no one who says that believes it in any context other than censoring conservatives.

How so? They are changing how recommendations work; they aren't removing videos based on an agenda. I am also wondering how the brainwashing comment comes into play here; just because people do not believe in a conspiracy theory doesn't mean they are brainwashed. I also do not remember ever asking to be censored.

Youtube suffers from the same pitfalls as FB: they actively promote tinfoil hat content, by design, because it makes money.

Removing individual videos is a red herring. They profit from promoting conspiracy content, and so they will keep promoting it. Cherry-picking a few things will just make this worse.

They're going this direction to pay lip service while avoiding changing their algorithms. Fuels the fire.
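To illustrate the dynamic being described (this is a toy sketch, not YouTube's actual system; the video titles and `predicted_watch_minutes` scores are made up): a recommender that ranks purely by predicted engagement will surface the most sensational content, because nothing in the objective penalizes it.

```python
# Toy engagement-maximizing recommender. All data here is hypothetical,
# invented purely to illustrate the incentive problem.
from dataclasses import dataclass

@dataclass
class Video:
    title: str
    predicted_watch_minutes: float  # output of a hypothetical engagement model

def recommend(candidates, k=3):
    """Rank candidates by predicted watch time alone -- no quality signal."""
    return sorted(candidates,
                  key=lambda v: v.predicted_watch_minutes,
                  reverse=True)[:k]

catalog = [
    Video("Calm explainer: how vaccines work", 4.0),
    Video("Local news recap", 3.0),
    Video("SHOCKING: what THEY don't want you to know", 11.0),
    Video("Documentary excerpt", 6.0),
]

top = recommend(catalog, k=2)
# The sensational item wins the top slot, because watch time is the
# only quantity the ranker optimizes.
```

Under this objective, removing one sensational video changes nothing: the ranker simply promotes the next most engaging item, which is the commenter's point about the algorithm rather than individual videos being the problem.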


> For right-leaning users, video recommendations are more likely to come from channels that share political extremism, conspiracy theories and otherwise problematic content. Recommendations for left-leaning users on YouTube were markedly fewer, researchers said.

This depends on the researcher's definitions of 'extremism' and 'conspiracy theories'.

- Recently we've seen many left wing people state that disassembling people in front of their families - surely an 'extreme' act - is a 'beautiful act of resistance', and that calls for genocide against Jewish people (surely also 'extreme') may not constitute hate speech in some contexts.

- For the last 7 years we've had many people believe in the Russiagate conspiracy theory.

- I'm not sure "problematic" has any real meaning.


YouTube censors content all the time; pornography and pirated content are just the most obvious examples. Those who want such content have only to look to other sites to find it.

In my opinion, YouTube should (and does) have content standards, and should be extremely free in how it determines them. People certainly have a right to publish and consume conspiracy theories, but I don't understand why YouTube has anything close to an obligation to host or promote them.


While there’s no reason to believe some broad conspiracy exists, it’d be hilariously ridiculous to assume that the folks at YouTube are unaware of the power they have to influence the zeitgeist. If some number of them act to advance their individual worldview through their enforcement of policy (is there any doubt that this happens?), then it is perfectly reasonable to say that YouTube, as a result of the conscious or unconscious biases of its decision makers, advances a worldview (though perhaps not a specific narrative).

It sounds like YouTube has the disturbing capability of mass-banning a certain point of view with just a few keystrokes. That wasn't built for nothing.
