YouTube isn't an independent "company" in the strict sense of the word (even if it has its own legal entities); it is just a division/product of Google, itself a subsidiary of Alphabet.
It's impolite to demand personal responsibility from corporations. It was the 'United Fruit Company' that funded dictatorships, 'Comcast' and 'Disney' that bribed for favorable net-neutrality/copyright laws, and 'Monsanto' and 'Big Pharma' that extort people by abusing a drug regulatory framework written in their favor.
Not any CEO or major stockholder with a name and address, goodness no.
A few theories, from the simplest to the most complicated:
1. The person who wrote this article is very uneducated about this and is showing their insecurity.
2. They're trying to deflect blame for the problem onto a pseudo-anonymous group.
3. They're trying to avoid putting blame on Ms. Wojcicki due to personal views on identity (identity is a relevant point, in that they have a massive bias about which identities and content they want others to find successful).
I'm going to go with #1 because they're pretty preachy.
I don't think the logic here holds up, and the issue is the author's ignorance of the technology and dataset involved. It's very easy for Google.com to show non-toxic results in its Video tab; it basically limits itself to retrieving results from sites humans have whitelisted for inclusion in Google News. They don't whitelist QAnon crackpot sites, so naturally you'll never see those results come up.
YouTube, however, is a completely different paradigm - no human pre-filtering happens before content goes online, and its _point_ is to search a catalog of millions of amateur-produced videos. That's very different from searching a few thousand media orgs and their indexed results. Hell, if you only got the results of the media orgs (from their YouTube channels) you could reasonably conclude that YouTube search is broken.
My conclusion is that there's no technology solution being held back from YouTube; it's just that the Google Video tab has a completely different purpose.
Consider the possibility that turning YouTube into Google News is the goal of the people who talk like this.
The internet democratized publishing. That made a lot of people unhappy - not just because of nonsense like Q, but because it let people say they weren't too keen on Communism in China, Ben Ali in Tunisia, or Clinton/Macron-style progressivism in the West. China went and un-democratized it, and a lot of people want the same thing to happen here.
It's like how social conservatives wiped out queer communities in places like LiveJournal by targeting NSFW content. They can't come right out and say they liked it better when the only easily accessible sources of information were two local papers and three TV channels, all of which could be depended on to (for example) come out in favor of the war in Iraq, just as the social conservatives couldn't say they wanted to wipe out the fags. So they're targeting conspiracy videos instead.
I think it is a matter of the target metric. You can optimize just for click-through/watch time (a short-term reward), but you can also target a composite of that and quality/trustworthiness (a long-term reward). YouTube seems to mostly optimize for the former, while Search seems to compromise between the two.
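A minimal sketch of what such a composite objective might look like (the signal names, units, and weight here are hypothetical illustrations, not anything Google has published):

    # Hypothetical composite ranking score. alpha = 1.0 recovers pure
    # engagement optimization (short-term reward); lower alpha trades
    # watch time against a quality/trust signal (long-term reward).
    # In practice the signals would need normalizing to comparable scales.
    def rank_score(pred_watch_minutes, quality, alpha=0.7):
        return alpha * pred_watch_minutes + (1 - alpha) * quality

    print(rank_score(30.0, 0.2))  # clickbaity but long: 21.06
    print(rank_score(12.0, 0.9))  # shorter but trustworthy: 8.67

Note that even at alpha = 0.7 the clickbait still wins here, which is why both the weight and the normalization of the quality signal matter.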
Do you have any basis for claiming that Google Video Search only shows whitelisted videos? I thought only Google News was limited that way.
I did a few quick searches on the video tab, and pulled up a lot of videos from YouTube, at least one from Know Your Meme, and another from Giphy. None of them seem like obvious whitelist candidates.
This is nothing compared to what they serve to kids. I long for the TV days, when it was safe for kids to watch public TV or PBS. Google is single-handedly responsible for fast-forwarding us to a time when kids are no longer safe (and there's no oversight or assurance by any entity except the parents themselves).
The algorithms aren't complex: people who watched video A also watched video B and were very likely to comment on and interact with it; therefore, people watching video A should be presented with video B.
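A toy version of that co-watch logic, purely for illustration (the real system is vastly more elaborate, and every name below is invented):

    from collections import Counter, defaultdict

    # Count how often pairs of videos co-occur in the same user's
    # watch history; recommend the most frequently co-watched ones.
    def build_covisits(histories):
        covisits = defaultdict(Counter)
        for watched in histories:
            for a in watched:
                for b in watched:
                    if a != b:
                        covisits[a][b] += 1
        return covisits

    histories = [["A", "B", "C"], ["A", "B"], ["A", "D"]]
    print(build_covisits(histories)["A"].most_common(2))
    # -> [('B', 2), ('C', 1)]: watchers of A get shown B first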
It is not making a judgment about video A or B; it has simply noticed the relation. Of course, we now all understand that the most engaging type of content on the internet is the most offensive, negative, and aggressive content. This is no different from much of what is shown on TV, vast quantities of which are pure garbage, including much of what is shown on the "serious" news channels.
Too frequently, the people writing about this and the politicians talking about it really just want to be the ones to set the line on what is "toxic" and "offensive" and have a very strong desire to influence what Google and these other companies should censor. There's also a very strong contingent of the old guard media that wishes to be restored as gatekeeper and view this as their opportunity.
Good point. It does seem like these people just want their "toxic" videos visible and the other side's "toxic" videos censored. At the end of the day, it's all about control and propaganda.
How about hmmdaily and the people who are against "toxic" videos just put out interesting videos of their own? Why not compete instead of trying to censor everything?
Also, I've never heard of hmmdaily. If anyone knows what it is and what their agenda is, I'd love to learn more about it.
For the same reason fast food wins in a competition with healthy food.
It's easy to eat, easy to find, it surprises you not at all, and it's very bad for you.
There are not two sides to every issue, especially when it comes to espousing conspiracy theories and fake news.
Pretending it's about "why don't we just compete in the marketplace of ideas" is naive at best.
Why does YouTube think I want to watch "Student Shuts Down Social Justice Warrior College Professor"?
I'm conservative(ish) in a fiscal sense, but don't read conservative news (because it sucks), don't search for conservative topics (nor really any political topics), and am not interested in any alt-right or political conflict type videos. Yet a smattering of this kind of thing often winds up in sidebar suggestions.
I can't help but feel it's a bit of a fishing expedition to determine political alignment. That's probably paranoid, though; it's more likely that conflict gets eyeballs to serve ads to. Which is a statement on human nature more than on Google. Who knows, but it is annoying.
>I can't help but feel it's a bit of a fishing expedition to determine political alignment.
What's frustrating is that I like to get exposed to a variety of viewpoints, but YouTube very aggressively (as far as I can tell) puts users into left/right boxes, as opposed to just serving content that is historically or politically themed.
I'd be very surprised if it turned out YouTube was NOT intentionally pushing people to anger-addictive extremist content, especially right-wing content, because that content drives exactly the viewing habits their metrics prioritize.
Recommendations are not based on political alignment, they're based on how interests cluster. If watching a video on Austrian economics is correlated with watching "Louder With Crowder", that's what they'll recommend to you. There's no intent behind it.
> Too frequently, the people writing about this and the politicians talking about it really just want to be the ones to set the line on what is "toxic" and "offensive" and have a very strong desire to influence what Google and these other companies should censor.
I've definitely seen this pattern from people. It's rarely overt, but I do see demands that others "punch down" or that "toxic" things be treated a certain way. This is always coupled with an implicit assumption that there's some clearly understood definition of what those mean that just happens to be exactly what the person using those idioms would offer.
Like you, I think this pattern suggests people trying to grab for the privilege of writing definitions.
>People who watched video A also watched video B and were very likely to comment on and interact with it; therefore, people watching video A should be presented with video B.
I don't think this is how YouTube's recommendation algorithm has worked in a long time. I won't argue that the current recommendation algorithm is somehow nefarious in nature, but I'm willing to bet there are factors weighted a lot more heavily than "I watched A, so I should watch B." Your recommendations hardly change based on what video you are watching at the current moment.
My personal theory is that YouTube overemphasizes watch time, to the point where any long video gets suggested as long as it has a high completion rate. You could be "batshit liberal" and still be stuck in the "alt-right recommendation loop" simply because YouTube thinks you will finish those videos regardless of your political orientation.
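If that theory held, the core ranking would reduce to something like this (pure speculation on my part; every name here is made up):

    # Speculative: rank purely by expected seconds watched, i.e.
    # predicted completion rate times video length. Long videos with
    # high completion rates dominate, regardless of viewer politics.
    def expected_watch_seconds(video):
        return video["est_completion_rate"] * video["length_s"]

    candidates = [
        {"id": "short-clip", "length_s": 120, "est_completion_rate": 0.9},
        {"id": "long-rant", "length_s": 3600, "est_completion_rate": 0.5},
    ]
    ranked = sorted(candidates, key=expected_watch_seconds, reverse=True)
    print([v["id"] for v in ranked])  # ['long-rant', 'short-clip']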
>My personal theory is that YouTube overemphasizes watch time, to the point where any long video gets suggested as long as it has a high completion rate.
So the key to crowd-killing the Alt-Reich on YouTube, then, is to have large numbers of people find and open their videos, let them play for a couple of minutes, then dislike and leave the page?
If YouTube were suddenly banning all LGBT content so as to protect children or some such thing, wouldn't we call that censorship? It's a ludicrous idea that censorship must always come from a government.
As a private company, YouTube can choose what it wants to host and/or promote. No one is preventing the purveyors of toxic videos from making and distributing them on their own.
Definitely, with limitations. For example, a cake maker can be compelled to bake a cake for a gay wedding. You don't think a private company can be compelled not to censor?
Censorship can come from anywhere. But I can choose to visit YouTube, FB, et al, or I can choose to go to a different platform, though it may not be as populated or as polished as the former. It's much harder for me to choose a govt which doesn't censor or enforce censorship.
The govt controls many aspects of our lives. The police can come to my house and take away my children if I tell them being gay or transgender are mental health issues. YouTube can only kick me off for stating such.
The govt can also censor based on which grants it gives funding for. Another reason academia is broken.
It is censorship. They do it of their own volition, without the government forcing their hand, and it's well within their right to do it. But it is, by definition, censorship.
What it isn't - however - is a violation of a very specific legal rule regarding free speech that often has the misfortune of being conflated with the entirety of free speech as a principle.
Yup, and it's doubly hard because the people that want the censorship don't want to admit there are no objective standards and it's all a question of preference.
You are missing the point of the piece. The author is saying YouTube has the ability to not show content that is flat-out wrong and dangerous. The author is saying the same thing you are. The algorithms could easily be adjusted - but they won't be, because controversy = time on YouTube.
They absolutely do not have to recommend content that would radicalize a man enough to take a rifle into a pizza parlor to free alleged child sex slaves.
This is not about setting the line on what is "offensive." This is about taking responsibility for providing a huge platform for dangerous ideas to spread.
No, I'm not advocating for thought police, but I really do think most would agree that platforms have to start taking responsibility for promoting QAnon-style conspiracies and white-supremacist ideas.
Sorry, I'm surprised how complacent everyone here is. Yes, the internet (and social networks in particular) has been able to democratize publishing and journalism like nothing comparable before - this is a good thing. BUT this advantage is being turned into a horrible disadvantage when that same publishing causes people to lose touch with reality.
If the end result is that we get flooded with information, but most of it is meaningless noise, then we will have less freedom than before, because we cannot be sure that any information we perceive is trustworthy. You cannot seriously want this.
> but most of it is meaningless noise, then we will have less freedom than before
What does this mean? In a concrete, practical, down-to-earth sense?
> we cannot be sure that any information we perceive is trustworthy
This point was already made ages ago, most famously by the ancient Greek skeptics, and by Descartes' cogito. We live in a world where anything may turn out to be untrustworthy. But we somehow cope, and chug along just fine.
"[...] But in the current, digitized world, trivial information is accumulated every second, preserved in all its triteness. Never fading, always accessible. Rumors about petty issues, misinterpretations, slander... All of this junk data preserved in an unfiltered state, growing at an alarming rate. It will only slow down social progress, reduce the rate of evolution. You seem to think that our plan is one of censorship. What we propose to do is not to control content, but to create context. The digital society furthers human flaws and selectively rewards development of convenient half-truths. [...] You exercise your right to 'freedom' and this is the result. All rhetoric to avoid conflict and protect each other from hurt. The untested truths spun by different interests to churn and accumulate in the sandbox of political correctness and value systems. Everyone withdraws into their own small gated community, afraid of a larger forum. They stay inside their little ponds leaking whatever 'truth' suits them into the growing cesspool of society at large. The different cardinal truths neither clash nor mesh. No one is invalidated, but nobody is right. Not even natural selection can take place here. The world is being engulfed in 'truth.' We're trying to stop that from happening. It's our responsibility as rulers. Just as in genetics, unnecessary information and memory must be filtered out to stimulate the evolution of the species. Who else could wade through the sea of garbage you people produce, retrieve valuable truths and even interpret their meaning for later generations? That's what it means to create context."
–"Colonel" AI, Metal Gear Solid 2: Sons of Liberty (2001)
I think the problem lies with the determination of what is and isn't objectionable or dangerous. Who decides? Tech megacorps?
There is plenty of content that we can all agree on being dangerous or worthy of banning, but there is plenty that is not settled, plenty that can blow in different directions with each fresh gust of the political winds.
Should this be handled legally instead, so citizens have recourse?
What happens if the consensus on approved/banned content among the censors is itself dangerous or at odds with reality?
Who controls the censorship, what is being censored, and what is to be censored at the biggest, most influential social media venues is an issue that cannot simply be waved away as 'corporate discretion'.
How long until we are all counting our Chinese social credit?
It's never them, it's always the "other" people that need to be "protected" from "toxic" content.
Having said that, fake news and conspiracy theories played a major role in laying the groundwork for the Holocaust, so there's a valid argument to be made there. I wouldn't want the people writing/reading Vox to make the call, though.
It's easy to identify when others repost propaganda. It's hard to identify when we ourselves fall for it, because by definition, we were fooled.
Just as most people believe they are above-average drivers, most people probably believe they are above average at filtering out internet propaganda. Even in a media-savvy forum like HN, I would be willing to bet that most of us, myself obviously included, have at least liked/upvoted/whatever'd a modest number of maliciously spread memes/posts/etc.
YouTube is extremely popular with children and adolescents who generally lack developed critical thinking skills and the context to evaluate the claims in these videos.
>You're assuming that the idea is so sticky that you can't avoid it if you hear it.
Oftentimes, it is.
Many people who fall into conspiratorial thinking believe themselves to be the victims of elaborate campaigns of propaganda, mind control, gaslighting, etc. Many also reject the mainstream rationales behind arguments against their beliefs as being fallacious or misinformation, and interpret attempts to convince them otherwise as validation of their fears.
Such people don't arrive at their beliefs by logical inference; their conspiracies serve an emotional need, similar to belief in cults or extremist religion. It can be comforting to "realize" there are puppetmasters pulling the strings behind the mundane chaos of the world, and that you alone can see through the facade. Even if those puppetmasters are malicious, you know they're there, you know there's a plan.
Conspiracies are about making sense of a senseless world, and that's why it can be difficult to turn people away from them. No one wants to live a meaningless life in a world of arbitrary and ultimately purposeless cruelty.
Even if it were easy to add that to school curricula and teach it effectively -- which it isn't -- YouTube reaches a lot of very young children who wouldn't have received that education yet. Many of them are too young to even have the necessary cognitive skills to recognize and appropriately respond to false information.
Flat earth is so obviously wrong that I'm safe from it.
I'm not arrogant enough to think that I'm immune to misreporting of facts, or that I and millions of others couldn't be damaged by people arguing in bad faith.
And "arguing in bad faith" covers a lot of the most objectionable content.
Even if I agreed with you that all of the thoughts/propaganda/ideas that a plurality of the current population thinks should be suppressed are objectively harmful and absolutely should be suppressed (which, incidentally, I don't), even you ought to be able to agree with me that, at some point in the future, an important idea or viewpoint could find itself in the censorship list that this line of thinking lays the groundwork for.
You are reading way too far into my post, which is only about whether some videos are harmful, not what to do about it, and not how you determine which ones.
It's not "obviously wrong" to a flat-earther. It always depends on the crowd.
The same goes for this thread, even the top comment: it's the HN crowd, so it's "obvious" that we should ban the alt-right and that "people like video X or Y because it is controversial."
Well, no. Some people just think it's OK and want to watch it, and the line between fighting hate speech and censorship is very hard to draw.
>Why do you care that they (or any other weird conspiratorial videos) exist at all?
They think their judgement is better than people they disagree with. Therefore someone should exercise paternalistic control over those they disagree with, so those wrong people will stop disagreeing with their obviously-right opinions so darn much.
They also don't foresee the resentment this will generate among the scolded and/or think they can contain that resentment. Which tells me their judgement was actually untrustworthy all along, and regardless of the advisability of censorship they should not be allowed to wield it.
Is there any mention of "Wojcicki" in the Vox article behind this? Nope.
How about the NYT article from March that they link to? Nope. No "Wojcicki" there either.
YouTube remains a CEO-less company.