
The audacity of YouTube to go after conspiracy theories while actively engaged in a conspiracy:

https://www.wsj.com/articles/youtube-hiring-for-some-positio...




Wait, they're claiming that Youtube recommends a lot of conspiracy bullshit, which is pretty well documented. You're pointing out that Youtube bans one specific narrow type of conspiracy bullshit, which is true but irrelevant. I don't see how that makes their statements questionable.

YouTube is, at all times, pro-echo chamber. So this is not their motive.

Youtube suffers from the same pitfalls as FB: they actively promote tinfoil hat content, by design, because it makes money.

Removing individual videos is a red herring. They will continue to profit from promoting conspiracy content, and so they will continue doing it. Cherry-picking a few things will just make this worse.

They're going this direction to pay lip service while avoiding changing their algorithms. Fuels the fire.


I don't really see why this is a conspiratorial hypothesis. The goal of YouTube is to keep you on YouTube watching videos with ads. That is what they are trying to do, and it happens to coincide with pushing extreme content as an unintended side effect.
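
To make that mechanism concrete, here is a minimal toy sketch in Python (the class, field names, and numbers are all invented for illustration, not YouTube's actual system) of a recommender that ranks purely by predicted watch time; the most provocative video rises to the top without anyone intending it:

    # Hypothetical sketch: a recommender that only maximizes predicted watch time.
    # Nothing here is YouTube's real system; names and numbers are invented.
    from dataclasses import dataclass

    @dataclass
    class Video:
        title: str
        predicted_watch_minutes: float  # engagement estimate from some model

    def recommend(candidates: list[Video], k: int = 3) -> list[Video]:
        """Rank purely by predicted engagement; no notion of truthfulness."""
        return sorted(candidates, key=lambda v: v.predicted_watch_minutes, reverse=True)[:k]

    candidates = [
        Video("Calm documentary", 4.0),
        Video("Sensational conspiracy deep-dive", 11.0),  # outrage holds attention longer
        Video("Dry news recap", 2.5),
    ]

    for v in recommend(candidates):
        print(v.title)
    # The provocative video tops the list simply because it is predicted to keep
    # viewers watching; no one had to intend that outcome.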

While there’s no reason to believe some broad conspiracy exists, it’d be hilariously ridiculous to assume that the folks at YouTube are unaware of the power they have to influence the zeitgeist. If some number of them act to advance their individual worldview through their enforcement of policy (is there any doubt that this happens?), then it is perfectly reasonable to say that YouTube, as a result of the conscious or unconscious biases of its decision makers, advances a worldview (though perhaps not a specific narrative).

I hope RWW is able to continue on after this. They do important work.

This exposes what everyone already knows. YouTube (Google... err... Alphabet) doesn't really have a problem with the conspiracy theories and right wing propaganda. They just don't want the drama. Nutjobs watch for hours and apparently click on advertisements.


This video is at the top of reddit, but it doesn't show any real evidence of doctored videos.

Although Google prevented affiliate monetisation on the video, as seen on the affiliate earnings graph, they probably have some intermediate step whereby they still show ads but the affiliate doesn't get paid for them.

Secondly, regarding the same video being shown with multiple different ads but the same view count: YouTube video view counts don't increase in real time. You can hit refresh on a video a few times and you will likely get three different ads with the same view count recorded.
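
A rough sketch of that behaviour in Python (the cache interval, counts, and ad inventory are made up; this is only a guess at the mechanism, not YouTube's actual serving stack): a stale cached view count gets reused while each page load requests a fresh ad.

    # Hypothetical sketch: stale cached view counts vs. per-request ad serving.
    import random
    import time

    AD_INVENTORY = ["Ad A", "Ad B", "Ad C", "Ad D"]

    _cached_views = 1_000_000
    _last_refresh = time.monotonic()

    def get_view_count() -> int:
        """View counts refresh only periodically (here: every 60 seconds)."""
        global _cached_views, _last_refresh
        if time.monotonic() - _last_refresh > 60:
            _cached_views += random.randint(0, 500)  # stand-in for a real recount
            _last_refresh = time.monotonic()
        return _cached_views

    def serve_page() -> tuple[int, str]:
        """Each page load picks a fresh ad but reuses the cached view count."""
        return get_view_count(), random.choice(AD_INVENTORY)

    for _ in range(3):  # three quick refreshes
        views, ad = serve_page()
        print(views, ad)  # same view count, likely different ads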

I wish they were doctored images, but I don't think this is evidence of that.

Also, when did social justice trolling become a business model for the WSJ? First pewdiepie, now this?


I hope YouTube does not conflate criticism with conspiracy theories.

YouTube is offering a convenient service subsidized by Google's big pockets; they are not some arbiter of truth (but are entitled to their opinions). There are already plenty of podcasts in their archives that directly contradicted WHO guidelines at the time they were made. I don't even think they are a major source of hoaxes/conspiracies. This is actually a bad look for YouTube.

This sounds like a reasonable conclusion to me. Anyone here work for YouTube that can confirm/deny this?

The article pretty strongly suggests that YouTube is discriminating against RWW due to partisan bias.

Given YouTube's record of banning right-wing content, non-partisan content that doesn't adhere to some official narrative (Bret Weinstein's Ivermectin videos, Covid-19 lab leak theories, mRNA vaccine heart inflammation), incredibly inconsistent copyright infringement enforcement, and random demonetization, this claim seems to be a bit of a stretch.

If anything, YT appears to be biased the other way.


I think the reality is a lot less interesting than the conspiracy theory: mainstream media sources are far less likely to post dangerous, controversial or radicalising content. Given that YouTube has historically had a problem with that (and still does in countries like Brazil[1]) it's not so much a suspicious response as a very predictable business response.

[1] https://www.nytimes.com/2019/08/11/world/americas/youtube-br...


Without any evidence this is just speculation. The reality is, there are many popular political YouTube channels representing viewpoints all across the spectrum, and this is an easily verifiable fact. I have yet to see any evidence that YouTube is motivated by anything other than money.

“Google and YouTube don’t want to take any action against any far-right channel for fear of stoking the far right to say they’re being persecuted,”

What bizarro world is this pretending to be? A few lines earlier, the same "article" contradicts that claim:

"After further outcry and investigation, YouTube later opted to demonetize Crowder’s channel, citing “widespread harm to the YouTube community resulting from the ongoing pattern of egregious behavior.”

Amazing to believe, in the face of all that is apparent, that there are people who think there is anything in the Bay Area with a "right-wing bias" that hasn't been stamped out yet.


A few thoughts on this.

Ultimately, whose fault is it that a user watches a "conspiracy" video? The article seems to consider the user base as mindless people with their mouths open to be spoon-fed content. Do the viewers have no agency?

In that thread, why is the solution to curtail "conspiracy" videos? What defines a conspiracy theory? I mean, we all seem to agree here about vaccines and moon landing, but what's to stop YouTube from labeling a political party as "conspiracy" and filtering them out of existence? Can't the users choose for themselves, or are they too "stupid" to pick what's best for them?

Yes, YouTube is free to filter as they see fit... but if they want a recommendation engine, AND they want to filter out "bad things", it's a very valid question to ask who decides what is bad. I believe it is essential for YouTube to be transparent about what their algorithms are designed to filter out, to prevent this scenario from happening.

Part of the solution that no one seems to be talking about is to caution others against passive consumption of media. YouTube et al. want users to plug their ears, close their eyes, tilt their heads back, and consume. YouTube can condition their streams to be as "healthy" as they want, for whatever definition of "healthy" YouTube chooses, but there is no way to healthily allow yourself to be fed a diet of things you don't pick, even if it looks good.

"Forcing" YouTube to do this will simply result in the lame, ineffective warnings you see at casinos. "Remember! Mindlessly consume responsibly! If you need help, uh... throw away your computer I guess!" This needs to be done on a human level, thinkers to parents, parent to children. Not a legislative level.


FTFY: Youtube has decided to believe the multiple independent lines of evidence which came out of a four year investigation by multiple journalists across more than one organisation.

This is not currently a legal matter, but a matter that concerns a public figure's ethical standards. Multiple independent lines of evidence is a powerful thing.


The article implies something nefarious is going on here, but there are plenty of entirely rational, innocent explanations, e.g. the videos are attracting a large quantity of spam dislikes and YouTube is bulk-removing them.
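
For example, a crude bulk-removal pass might look something like this sketch in Python (the heuristics and thresholds are invented, not a claim about YouTube's real anti-abuse system):

    # Hypothetical sketch of bulk-removing brigading dislikes; the heuristics
    # (brand-new accounts, near-zero watch time) are invented for illustration.
    from dataclasses import dataclass

    @dataclass
    class Dislike:
        account_age_days: int
        seconds_watched: float

    def filter_dislikes(dislikes: list[Dislike]) -> list[Dislike]:
        """Keep only dislikes that look organic under two crude heuristics."""
        return [
            d for d in dislikes
            if d.account_age_days >= 7 and d.seconds_watched >= 30
        ]

    raw = [Dislike(0, 0.0)] * 40_000 + [Dislike(900, 120.0)] * 2_000
    print(len(filter_dislikes(raw)))  # prints 2000: the brigade is dropped in bulk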

Youtube once again siding with IP hoarders, advertisers and propagandists over their audience.

Come on, I'm trying not to dismiss your perspective just because it sounds a bit conspiratorial, but this is not convincing. Are you saying the YouTube execs (or whoever is leaning on them) are just acting emotionally with no strategy?
