
>I kind of like it, because it's given me a window of "the other side" of my social media bubble. I think the thing Facebook is particularly nefarious at is that, the way the feed works, it makes it seem like of course everyone agrees with you, except that idiot crazy uncle who you eventually have to unfriend, because all you see are seemingly "random" posts that enforce your own viewpoints.

True, though I think the same thing applies to Twitter and YouTube. You see what seems like random things from a huge variety of random people that all happen to affirm your worldview.

I believe Jack Dorsey has mentioned some vague plans to try to address the bubble problem. And I believe I may have been temporarily placed into a YouTube beta test where they tried to include videos unrelated to things I've seen before (I seem to recall an explicit notice about this, and a request for me to give feedback). So, I think they're probably trying, kind of. (I don't use Facebook and never have, so can't comment, there.)

reddit, as far as I know, doesn't do user-specific recommendations in their submission sorting (besides what they choose to be default subreddits), so you have to go out of your way to form a little bubble for yourself. Which many people do, but at least they have some awareness that they're explicitly setting it up to be that way.

That said, especially as of the past 5 or so years, most sites out there have skewed pretty uniformly right or left, and I believe all or nearly all of the default subreddits currently have a pretty homogeneous political stance. It's not as insidious as the recommendation-based per-user bubble formation other sites have, but it might have a similar effect, especially if someone mostly just looks at their front page and doesn't subscribe to more than a few subreddits besides the defaults.




> Facebook makes it incredibly easy to step outside our bubbles.

pfft. only if you don't compare it to any other non-walled-garden online communities.

it's places like HN (and many similar tiny islands of interest) where I come for the comp.science news, and because of that the site does NOT actively select for other criteria (just inherent bias); I meet people with wildly different views than my own, and I can even expect to have a decent dialogue.

i don't even need to "like" their political values, but i can still upvote their tech expertise.

[ edit: LOL, I hadn't read dang's announcement about HN's political detox week, which pretty much runs exactly counter to what I just said, haha :) though I welcome the experiment and appreciate the irony of circumstance :-) ]

check out the "general discussion" subforum on any random webforum community on a topic that interests you.

it's all already intersecting by default!

and it's Facebook (Google too) taking a very active role in crystallising the bubbles into walls. Don't just try to step outside your personal Facebook-bubble, for it's actively working against you. Step outside the big bubble of Facebook itself.


>"I believe that Facebook now understands the political difficulties the news feed creates, but if Facebook were truly a dictatorship, they wouldn't be spreading multiple oposing propaganda, would they?"

Of course they would. They claim to be simply a "platform" for "diverse" opinions. Yet, at the first sign of someone being offended by something that's not popular, they shut it down. Even if they do it to the left and the right, what they're then essentially doing is forcing everyone into this average middle area where no one can say anything wildly-diverging from the mainstream narrative. But, from my point of view, the "mainstream" is currently rampantly left-leaning. So any "in your face" right-leaning points of view are very easily deemed to be in contradiction of some sort of "community standards", and are promptly shut down. I follow pages that have to regularly make numbered "back-up copy" Facebook groups/pages so that they can contact their followers after Facebook shuts them down, which is inevitable.

It really is all or nothing when it comes to free speech. People self-censor and avoid the things that naturally make them uncomfortable. But if you shift the responsibility for that to some centralized entity, you're going to have problems that alienate many.


> I wonder if this is a reaction to recent claims that FB has a general political slant

I don't want to generalize too much from my own window into Facebook, but it does look like they may be in a feedback loop. As more and more toxic conservative garbage gets reshared, progressives start ditching Facebook. That means the remaining userbase likes that stuff more, so the feed gets tuned to make it more prominent. That turns off more progressives, who drop out. Rinse, lather, repeat.

When I go on Facebook now, it looks like Internet Fox News, and that's despite trying very hard to unfollow the people who reshare that stuff.

Even the left-leaning content on Facebook has gotten more extreme and virulent. It's just a miserable hellhole now of two tribes screaming at each other while the feed AI fans the flames.
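
To make that loop concrete, here's a toy simulation of the dynamic. Every number and the majority-amplification rule are invented purely for illustration; this sketches the feedback mechanism, not anything known about Facebook's actual systems:

    def simulate_feed_loop(left=48_000, right=52_000, churn=0.2, rounds=8):
        """Each round, the feed over-weights the current majority (squared share),
        and users on the under-served side churn in proportion to that skew."""
        for r in range(rounds):
            lean = right ** 2 / (right ** 2 + left ** 2)  # feed amplifies the majority
            left = int(left * (1 - churn * lean))          # under-served users leave faster
            right = int(right * (1 - churn * (1 - lean)))
            print(f"round {r}: feed right-lean={lean:.2f}  left={left}  right={right}")

    simulate_feed_loop()

Run it and the lean ratchets upward every round, even from a small initial imbalance.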


> IMO that solves most of the problems

And creates echo chambers.

That's part of the problem with Facebook: once you start posting and interacting with certain types of content, you get the same type of content more often, leaving out opposing viewpoints and giving you the impression that everyone agrees with you.
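
As a sketch of how that reinforcement happens, here's a toy engagement-weighted recommender. The affinity-update rule and the 0.1 learning rate are assumptions for illustration, not Facebook's algorithm:

    import random

    def update_affinity(affinity, topic, engaged, lr=0.1):
        """Nudge a topic's score up on engagement, down otherwise."""
        affinity[topic] += lr if engaged else -lr

    def pick_post(affinity, topics):
        """Sample the next post, weighted by current affinity."""
        weights = [max(affinity[t], 0.01) for t in topics]  # floor keeps a sliver of diversity
        return random.choices(topics, weights=weights)[0]

    topics = ["politics-left", "politics-right", "cats"]
    affinity = {t: 1.0 for t in topics}
    for _ in range(200):
        topic = pick_post(affinity, topics)
        update_affinity(affinity, topic, engaged=(topic == "politics-left"))
    print(affinity)  # the one engaged-with topic dominates; the rest fade away

Because engagement feeds back into the sampling weights, the topics you never engage with effectively disappear from the feed.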


> Do you honestly believe the world is better off with this, than if facebook did something else?

Obviously the world is better off without insane political nonsense. Just like it's better off, obviously, without gambling or alcohol, and a whole lot of other stuff... I wish these things could be done away with somehow.

I'm trying to avoid the normative question of whether it's good or bad. The people who are most vociferously arguing it's bad are the institutions (like Vice) that stand to lose economically from changes in the Newsfeed recommender.

> the only way to get clicks... only way to properly do collaborative filtering

Facebook has probably had hundreds of highly educated and outside-the-box thinking people looking at the Newsfeed. Someone there has thought of and probably tried every approach to improve the quality of their recommendations using the metrics they are interested in. They probably thought of every possible metric to be interested in, and pathologically optimized the metrics they're interested in, long before advertising was the primary way they made money.

As an example, Facebook got rid of those automatic game posts. Those posts generated absolutely monumental levels of engagement and powered a huge economic engine (Zynga). Facebook got rid of them. Zynga was pissed, and while nobody shed a tear for them, you could certainly imagine some ViceVille.com talking about how bad it is for free-to-play Facebook game publishers that this automatic game posting thing went away.

Where are all the people talking about how getting rid of automatic game posts puts Facebook users into an entertainment bubble, cutting them off from a vast ecosystem of light entertainment? Some (really blowhard) people could seriously argue that. I don't mean to be a blowhard.

But if it just comes down to, 'Vice is cool, Zynga drools,' well then there's not much to talk about.


> I really don't see it as a forum for ideas or activism, unlike Twitter or reddit. but maybe that's just the bubble I'm in.

Exactly -- and that's the real danger of algorithmic engagement-driven content.

For you, it's picking harmless content because that's what you engage with. But once you start engaging with, say, some crazy cult, Facebook's algo will pick up on that, and start prioritizing content from the crazy cultists. In time, you'll get to the point where you're only seeing content from the cultists, and come to believe that's what everyone thinks, because that's all you'll ever see. YouTube suffers from this too.


> No one, not even Mark Zuckerberg, can control the product he made.

Well, one can control oneself; that's about the best we can do. As others have mentioned, don't use it.

I tried that. After years of not using FB at all, I missed out on some important news that was only posted there.

I decided to get back on FB, on my own terms. I created Bubble [1] and now I can visit FB with most tracking disabled, no ads, I only see posts from my friends, and even then only posts that do not link to news websites or contain political keywords.

My signal/noise ratio is usually about 50%. I love my clean feed. Works on LinkedIn, Twitter, others.
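
To give a sense of what that filtering amounts to, here's a rough sketch of the client-side logic. This is not Bubble's actual implementation; the keyword list, the domain list, and the Post structure are hypothetical stand-ins for illustration:

    from dataclasses import dataclass
    from typing import Optional

    POLITICAL_KEYWORDS = {"election", "senate", "impeach", "tariff"}  # hypothetical list
    NEWS_DOMAINS = {"cnn.com", "foxnews.com", "nytimes.com"}          # hypothetical list

    @dataclass
    class Post:
        author_is_friend: bool
        text: str
        link_domain: Optional[str] = None

    def keep(post):
        """Show only friends' posts with no news links and no political keywords."""
        if not post.author_is_friend:
            return False
        if post.link_domain in NEWS_DOMAINS:
            return False
        words = {w.strip(".,!?").lower() for w in post.text.split()}
        return not (words & POLITICAL_KEYWORDS)

    feed = [Post(True, "Look at our new baby!"),
            Post(True, "The senate vote today...", "cnn.com"),
            Post(False, "Sponsored: act now!")]
    print([p.text for p in feed if keep(p)])  # -> ['Look at our new baby!']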

But the root problem goes deeper -- clearly not everyone cares or has the motivation to filter their own social media feeds. Some people like reading echo-chamber/fake news posts; truth is far down on their list of priorities.

I don't know what the big-picture answer is.

[1] https://getbubblenow.com (currently in beta)


>>But ultimately, it's worth remembering that it's hard to build a good system with bad raw materials. If people are interested in falsehoods and echo chambers, their social media will reflect that.

I wouldn't phrase it quite so negatively, but I think you're on to something here. If people don't enjoy going on Facebook, they won't go on Facebook. What do people enjoy going to Facebook for? Interacting with their Facebook friends, whom they likely friended at least in part due to their shared beliefs.

"Fixing" Facebook in this regard means forcing people to interact more with entities outside their self-selected social group, when the whole reason they got on Facebook in the first place is to interact with their self-selected social group! Facebook would be foolish to fix this problem, it would drive their users away.


> It's _forcing_ increased polarization in politics? Be fair. We all have a choice whether or not to engage with content on Facebook

Yes, forcing, and respectfully, no, we do not have that choice, because Facebook decides what you see, not you. And that is the fundamental issue.

If I'm a conspiracy theorist and I constantly seek out content that validates my beliefs, that's a me problem.

If I'm just an inquisitive person that thinks not everything is what it seems and a platform starts feeding and amplifying conspiracy content because it thinks that's what I want and I become radicalized, that's a platform problem.

> You seem to have no interest in the people who actually _did_ the things

On the contrary - the people who actually did the things should be at the very center of this story, because at issue is essentially the hacking of human behavior.

To be clear, I don't believe Facebook is solely to blame here. What we're seeing is an amplification of human behavior and tribalism that predates the existence of social media. What makes the current climate interesting and problematic is that such tendencies are being exploited and encouraged by algorithms, and the platforms are no longer neutral to the problem.

> We had no insurrection. Many countries are able to do the same. There are a few exceptions.

And we had safe elections prior to 2020 as well. There were no insurrections here either...until January of 2021. I'd argue that we're just starting to see the 2nd- and 3rd-order effects of modern social media, and we can't necessarily just look to the past to draw conclusions about the present.


> It calls up a torrent of hate speech and misinformation with the right hand while trying to clumsily moderate with the left.

Not a Facebook employee (or supporter for that matter), but I'm curious if you consider this an issue of Facebook or of social media in general.

Not saying it's OK for FB because everyone does it, but you generally see the same dynamic of the "torrent of hate speech and misinformation" on Twitter, on Reddit, even on YouTube (personal experience: I have a family member who was radicalized by misinformation on the internet. It was all on YouTube; she had never even used Facebook).

I've noticed that people go a lot harder on Facebook than on other tech companies. I think Facebook's reputation is well deserved, but I do think that reputation should be shared with really all social media in general.


> Facebook used to be popular because it was, at one point, fun. If they're going to remove all the voices they think are "dangerous", what they're left with won't be fun or interesting. Just lukewarm, milquetoast, or worse... lopsided.

Honestly, I hope that Facebook becomes more restrictive and more censorship-oriented. That would make alternative platforms more viable and increase diversity in online media. I'm also 100% opposed to the idea that private companies should be permitted to or expected to function as the town square. That, more than censorship, is the real threat to freedom of speech and freedom of thought. Facebook, Twitter, and YouTube, by the nature of their ranking algorithms and the structure of the service, have a huge influence on what and how ideas spread without explicitly pushing or censoring content.


> Absolutely not. Facebook is the last bastion of people being able to speak their minds. And since I can select who I want to listen to, I like it. As it so happens, everyone who I follow on Facebook speaks their minds by posting pictures of their babies. This is perfectly fine by me. I am sceptical that the paternalists will let me be when they're done dealing with Trump et al.

How is it the last bastion of anything? It is a place where people seem to be able to speak their mind. It's not the last place. The Internet is vast, and the major social networks aren't the only things in existence on it.


> I only want to belong to a social network that values science and public health. Why shouldn't I be allowed to be a user of a website that has those same values?

I think it's a problem of expectations.

Some expect Facebook to just be a social network. Others, like you, expect it to be a safe space where unapproved opinions aren't allowed. Neither kind is bad per se, but I don't think most people expected Facebook to be of the latter kind.


> If it was months in the making on Facebook, why did Facebook not do more to stop it?

Two things:

1) If FB is seen to lean too far either left or right, it will create an opportunity for a competitor to emerge on the other side. FB is huge, so they have a lot to lose in that scenario.

2) They literally profit from the circulation of the very stuff that some would have them censor.

I don't see what is even slightly surprising about the way they've handled this.

Edit: side note - I personally think #1 is likely at some point in the future. People have already gone a fair way towards segregating themselves into various social/political bubbles; why not even more self-segregation? If enough people want that, then either FB will enable it or a competitor will emerge to enable it.


> I’d say this stance is based on the presupposition that a solution implemented imperfectly is in fact good, or at least better than no solution implemented at all.

Yes. :)

> You say yourself that this would require FB to make (naturally subjective) judgements about what is or is not political. The very best case scenario would be that FB ends up marginalizing fringe views.

That sounds good to me!

> But would you even trust FB as an organisation to reach a level of neutrality where only the most fringe views were marginalized? I certainly wouldn’t.

Why not? We have no evidence to suggest they'd be terrible at it.

> The alternative is simply that FB leaves it up to the audience to make their own judgements about the credibility of the content they view, something I’d consider to be a far superior outcome.

Do you really consider the current state of affairs to be superior and/or ideal? Perhaps I would agree with you in a perfect world where more people were naturally distrustful of content, but we've seen that a large segment of the population is not particularly good at separating the wheat from the chaff.

Case in point: https://money.cnn.com/2018/01/26/media/russia-trolls-faceboo...


>Obviously the world is better off without insane political nonsense.

Then we agree? Filter bubbles are bad? The world would be better off if Facebook changed?

>Facebook has probably had hundreds of highly educated and outside-the-box thinking people looking at the Newsfeed. Someone there has thought of and probably tried every approach to improve the quality of their recommendations using the metrics they are interested in. They probably thought of every possible metric to be interested in, and pathologically optimized the metrics they're interested in, long before advertising was the primary way they made money.

I'm aware of how optimized it is, but it's built on bad principles. They use a variety of metrics to measure "engagement", including likes, shares, time spent reading a post, comments, hides, etc. But none of these metrics are perfect, and so optimizing for them leads to bad outcomes.

The only objectively perfect measure would be to actually ask individuals what articles they want to see more of, or less of, after they read them. To my knowledge, they don't do that.
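
To illustrate the distinction, here's a toy contrast between a proxy engagement score and the ask-the-reader measure. The weights and field names are invented for illustration, not Facebook's actual scoring:

    def proxy_score(likes, shares, comments, seconds_read, hides):
        """Engagement-style proxy: every signal is an indirect guess at interest.
        An enraging post can score high on comments and dwell time."""
        return 1.0 * likes + 3.0 * shares + 2.0 * comments + 0.01 * seconds_read - 5.0 * hides

    def explicit_score(want_more, want_less):
        """The 'just ask the reader' measure: net share who said 'show me more'."""
        total = want_more + want_less
        return (want_more - want_less) / total if total else 0.0

    # Outrage bait: furious comment threads and long dwell time score well...
    print(proxy_score(likes=10, shares=40, comments=200, seconds_read=90, hides=5))  # 505.9
    # ...even though readers, asked directly, mostly don't want more of it.
    print(explicit_score(want_more=30, want_less=170))                               # -0.7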

I think your games analogy is perfect. Facebook games generated huge amounts of engagement and ad money, just like clickbait garbage is doing now. But it was having a negative effect on society, with countless people becoming addicted to clicking cows, and some losing vast amounts of money to their addiction. Facebook chose to sacrifice it for the public good (and their image), and I think they did the right thing.


> It appears FB is essentially a right-wing mouth piece, idk if that is a controversial opinion, but the data appears that way, I think.

I mean, mouthpiece can't possibly be the right term. Does anyone think they should forbid Fox News articles from being shared? If you don't want them moderating it away, and they aren't themselves recommending it, their role is pretty minor. The idea of Facebook is that when you post things some people you know see it. How do you show people what they want and avoid results similar to this?


> It doesn't bother me that companies show me what they think I want to see based on a recommendation algorithm.

It isn't a great choice for users to have to decide between debunking someone's Facebook post in the comments, thereby boosting engagement and promoting it to all their friends, or just leaving it unchallenged.

Or for Youtube, do you watch a youtube video of a demagogue to prepare arguments against them, knowing that Youtube will promote it more because you watched it?

We need some kind of way to interact with this stuff while getting the choice to not promote it as a result.
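
One shape such a mechanism could take: attach an opt-out flag to the interaction event itself, so the recommender never sees it. Every name here is invented; as far as I know, no platform exposes anything like this today:

    from dataclasses import dataclass, field

    @dataclass
    class InteractionEvent:
        post_id: str
        kind: str                       # "comment", "watch", "share", ...
        count_for_ranking: bool = True  # the proposed opt-out flag

    @dataclass
    class RankingStore:
        scores: dict = field(default_factory=dict)

        def record(self, event):
            """Only promotion-eligible interactions feed the recommender."""
            if event.count_for_ranking:
                self.scores[event.post_id] = self.scores.get(event.post_id, 0) + 1

    store = RankingStore()
    store.record(InteractionEvent("demagogue-video", "watch", count_for_ranking=False))
    store.record(InteractionEvent("cat-video", "watch"))
    print(store.scores)  # -> {'cat-video': 1}; the debunk-watch boosted nothing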


>It appears FB is essentially a right-wing mouth piece, idk if that is a controversial opinion, but the data appears that way, I think.

Given that the common narrative around Facebook is that it, along with all other social media, is engaged in a leftist conspiracy to purge all conservative and right-wing speech from the internet, I'd say it would be.

