
And I disagree with your explanation. The journalist isn't simply committing a fallacy or misinterpreting an argument. The journalist went out of their way to accuse a political demographic of malicious intent. Out-group dynamics like this are something people actively shape (as demonstrated by the Robbers Cave Experiment). Maybe we ought to classify it under something other than immaturity. But we can't chalk it up to laziness, mental fatigue, or #justPeopleThings as if it were an accident.

> do you realize that you're now going for a very emotional attack on my comment?

Honestly, I don't know what you're trying to prove here. My comment is invalid because I expressed embarrassment? lol?

A few corrections regarding your other threads. LW members call themselves "aspiring rationalists" to remind themselves that they have not yet "outgrown humanity" [0]. You also seem to conflate logic and rationality. When economists talk about rational agents, they mean an agent that employs a decision-making procedure consistent with the von Neumann–Morgenstern utility theorem [1].

[0] http://lesswrong.com/lw/h8/tsuyoku_naritai_i_want_to_become_...

[1] https://en.wikipedia.org/wiki/Von_Neumann%E2%80%93Morgenster...
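For concreteness, the VNM notion of rationality mentioned above amounts to ranking lotteries (probability distributions over outcomes) by expected utility. Here is a minimal Python sketch; the outcome names, utility numbers, and lotteries are my own illustration, not anything from the theorem itself.

```python
# A VNM-rational agent has a utility function over outcomes and
# prefers whichever lottery has the higher expected utility.
# All names and numbers below are invented for illustration.

def expected_utility(lottery, utility):
    """Expected utility of a lottery: sum of p(outcome) * u(outcome)."""
    return sum(p * utility[outcome] for outcome, p in lottery.items())

# Illustrative utility function over three outcomes.
utility = {"win_big": 10.0, "win_small": 4.0, "nothing": 0.0}

# Two lotteries the agent must choose between.
safe   = {"win_small": 1.0}                  # EU = 4.0
gamble = {"win_big": 0.5, "nothing": 0.5}    # EU = 5.0

# The VNM-rational choice is simply the expected-utility maximizer.
best = max([("safe", safe), ("gamble", gamble)],
           key=lambda kv: expected_utility(kv[1], utility))
```

With these made-up numbers the agent picks the gamble, since 0.5 × 10 = 5 exceeds the sure 4; a sufficiently concave (risk-averse) utility function would reverse that preference.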




It's worse than that. The problem is that being truly rational is hard, unpleasant work that few people want to do. If you read an article that makes your political opponents look bad, you can't just feel smugly superior; you have to take into account that you are predisposed to believe convenient-sounding things, so you have to put extra effort into checking the truth of that claim. If you follow the evidence instead of tribal consensus, you will probably end up with some beliefs that your friends and relatives won't like, etc.

This argument is completely disingenuous. Your average person is capable of critical thinking and logical reasoning; those who aren't are generally wards under the care of another person. Normal people just think logically and critically in reference to local optima, and that's not something we can or should try to program out of them.

That quality is also known as adaptability and it's crucial to successful survival and prosperity, for exactly the same reason that it's useful in mathematics: global optima are generally difficult to deduce, if they can be conclusively and authoritatively determined at all.

Saying Side X is "not being logical" or "can't think critically" is virtually always just a cop-out. It says you either a) don't understand or b) don't want to admit the validity of some of their concerns.

Most of the time when the other side's argument is understood, the disagreements are a matter of priority and/or credibility, not nonsensical thinking. And those priorities are usually determined intrinsically; values as such can't really be programmed or taught. They're the result of the years of experience each individual has endured in the real world.

A good example of this is that many engineers are known for a just-the-facts, no-frills approach. This is because engineers tend to prioritize facts and correctness over aesthetic and emotional value. Other people who don't do this aren't objectively wrong -- they just put different weights on the considerations, leading them to different conclusions.

Another example is outlet credibility. Your average Fox News viewer may believe that MSNBC is propaganda secretly dictated by the shadowy figures in the background, and vice versa. If you believe this, the logical conclusion is to dismiss or at least discount the perspective of the propagandist.

You cannot "prove" that one side is propaganda and the other side isn't, because it is impossible to definitively deduce the intentions and motives of other people. Studies showing that MSNBC's reports were more frequently in error are of no value, because you can just respond, "Oh yeah, says who? The same shadowy figures?"

It is important to understand that humans hold a variety of totally non-falsifiable beliefs -- things that cannot be definitively proven one way or the other, even if you try, like the states of mind of the speakers around us. These have to be approached at that deeper, hidden level to be understood, let alone addressed.

All we can do is understand that our own perspective is not the default or de-facto correct one, and that other people are entitled to their own assumptions and unfalsifiable opinions just as we are. They're entitled to their own credibility heuristics and decisions about who is worth trusting. People are free to make their own decisions and conclusions, whether we agree with them or not.

Understanding that is critical to learning that it's OK to disagree with people, without having to pretend that they're insane just to preserve your own ego and self-worth.


Nailed it. I don't understand how smart people on this site can believe such nonsense. Being blinded by ideology is a strong weakness in human beings regardless of intelligence level.

The self-described rationalist community made their mind up about the article long before it was published, or even written.

I think you’re right that the rationalist communities feel like they need to defend themselves and fight back, while refusing to even consider anything that could be viewed as criticism of their community.

The modern self-described rationalist communities feel more like a place for people who have built their personal identities around the idea of being intellectual non-conformists than for people who genuinely want to discuss topics from different perspectives.


Modern people are driven by cognitive biases and emotion. They then attempt to justify these decisions after the fact with logical fallacies.

> The most common is a slow hardening of one’s opinions and perceptions that I consider a form of mental decline. These people are often susceptible to propaganda and conspiracy.

Maybe it's similar to people who always do the same exercise to keep in shape: over time their body optimizes for that exact training, so it becomes less effective. I wouldn't be surprised if our opinions harden because we have thought through the same patterns over and over.

But old people falling into propaganda and conspiracy theories? I have no other explanation for that besides mental decline. Younger people fall for those too, but many of them don't have the best mental condition to begin with.


The author anecdotally notices a great many workplace fallacies, such as confirmation bias, the bandwagon effect, etc. Then, seemingly unaware that these are separate categories, he lumps them together and ascribes the lump to an innate characteristic of "smart" people.

Obviously an organization that succumbs to these fallacies is going to have trouble. Obviously adding "unsmart" people to the mix will only exacerbate it.

But that's not really surprising.

What's interesting is a trend, of which this article is an example, of ascribing the problems of the alleged meritocracy (which are real) not to the system itself but to clear, reasoned, well-informed thinking. It's possible that this strange dislike of reason is triggered by pseudo-intellectuals on the far right cloaking themselves in the language of rationality to argue against global warming, minimum wages, the cigarette-cancer link, etc.

But I'd like to see more conjecture/evidence about what's causing it.


Yet the author's memo was not taken very well. People aren't rational, the media is a page click machine, and all nuance is ultimately boiled down to a headline. What should we expect, that people are perfectly rational spheres?

I'm aware this line of thinking is like "blaming the victim" for non-neurotypical people who don't know where the line is. It's not a great situation all around.


Article about the fallacies of groupthink, and how rationality is largely based on our communities.

I think you are simply unfamiliar with lesswrong.

They write articles like this all the time, and they are usually targeted at biases which afflict lesswrong readers disproportionately. A quick look finds at least 3 articles of this nature in the past 2 weeks [1].

The article also does not advocate for lumping people into the meta-contrarian class and discounting their opinions. It explicitly says not to do that: "meta-contrarianism is a real tendency in over-intelligent people, it doesn't mean they should immediately abandon their beliefs; that would just be meta-meta-contrarianism".

[1] http://lesswrong.com/lw/2ql/error_detection_bias_in_research... http://lesswrong.com/lw/2po/selfimprovement_or_shiny_distrac... http://lesswrong.com/lw/2pw/the_affect_heuristic_sentiment_a...


Can you clarify what you think the central assumption in the article is? After typing the rest of this I think I may have misunderstood you. I think the central assumption is stated in the first sentence: "One oft-underestimated threat to epistemic rationality is getting offended." I don't think it's much of a leap to generalize that to "noticing others getting offended is an indicator that their epistemic rationality isn't up to snuff." But it is inconsistent with a common theme around LW that learning about biases etc. should only be about improving yourself, not about noticing errors in other people. (I say why not both, while just being careful and not over-confident with the latter?)

I should clarify that I don't think someone's "I'm offended" level completely measures their rationality, it's just a useful indicator. (They may have a Blue v. Green issue that just hasn't come up yet, or they may fail hard in the many other ways humans can be irrational, or my "this person is offended" detectors might be screwy.) I agree with the assumption that many, and probably most, seemingly rational people can become quite seemingly irrational when prompted with the right subjects. Or as Tesla put it, "The scientists of today think deeply instead of clearly. One must be sane to think clearly, but one can think deeply and be quite insane." Blue v. Green dynamics, and generally group identification, are pretty embedded in our neurological makeup, and to me only seem connected to "being offended" in the way that such a mind state makes it easier to fall into those (and other usually undesirable) modes of thought.

I think all bridges can be burnt given enough time, but not all of them through trying to offend alone. With close friends who share your mental structure of finding it hard to be offended, let alone difficult to express it without a trace of irony, it takes other methods to destroy that friendship. The act of incessant trolling by itself could work depending on the person, regardless of if that person finds anything said offensive or not, simply because if I spend all my time trolling them, then I'm not worth their time.

I read Gulliver's Travels around the same time as "A Modest Proposal", I agree it's still relevant and insightful. I also think you're right about esr's post having a problem by not defining "racist", and leaving the reader to do it themselves. (There are many definitions out there.) But I do think people who define "racist" as including something similar to "citing statistics that don't paint a pretty picture of all the various races, where such a pretty picture would show that such a simple observable never correlates with anything negative, and giving reasonable advice assuming those statistics are accurate" are in error, regardless of the accuracy of the statistics.


The rational response is

...falling on deaf ears. I have been saying for years that hacker culture needs to develop greater emotional intelligence and meet people where they are instead of lecturing them with arguments that make their eyes glaze over. Politics is not a function of logic.

Your best option, if you live in the UK, is to roll your eyes and vote for the beardy communist despite his obvious faults. Logical arguments are not compelling to people in the grip of an emotional rush. The inability to assess and adjust to people's emotional states is a kind of social stupidity.


The majority of people will assume it's not true because, as humans, we are biased toward optimism and lying to ourselves. We scaffold logic to fit our positive bias, and you're going to see a lot of arguments in this thread attempting to refute this study. Most of those comments will compose different facts, figures, and evidence in a specific way to fit their authors' desired outcome.

Studies prove that the majority of people are like this and I'm one of them.

Therefore I'm going to do what everyone else does: work and drive to work, spend time with my family, spend time with my friends, and not do anything different every day. I guess being self-aware of my own biases makes me slightly different, but in this case people who are self-aware probably just choose to avoid thinking about it. We've got daily problems to deal with, and the environment is still too abstract to consider even when there's a lot of evidence suggesting it's too late.

This is the reality of people. It's predictable. Even people on this thread "claiming" to do something about it likely aren't doing much or doing anything meaningful. What's written in this thread is rationalization to keep doing what they're currently doing.

Only a very small percentage of the population can actually be genuinely panicked by what the evidence suggests. They will be making drastic changes to their own lives and attempting to change the world. People who build their lives around rationality are the ones we classify as extremists.

I'm not saying these "extremists" are smarter or have higher IQ. It's more of a behavioral trait among a small portion of the population. They lack the normal biases people have, and I think this may have a small association with lower IQ as some of them are unable to weigh the rational logic against the consequences of going against the grain of popular opinion/behavior.


A small minority of people are rational enough to forge new truths in the face of conflicting and complicated information, usually in a narrowly focused way. A much larger minority is capable of digesting and productively making use of the work product of the former group, again within a broader but still narrow scope.

The majority is too stupid to make up their own minds and needs to be educated at a young age to accept the work product of prior generations' experts, because they are just too unintelligent to evaluate it for themselves. This is literally most people.

The fact that this is unpleasant doesn't make it untrue.


Right, but my point is that while regurgitation is not intelligence, the act of doing so is not enough to claim the regurgitator itself is not intelligent. Otherwise, you'd condemn a good half of humanity with the same reasoning.

Or more, if we're willing to consider most people's reactions to at least some political topics: they just ignore the context and repeat the dogma they've learned (some more than others). People rarely stop and think about everything.

The problem here is that the LLM has learned that everything is political, and can be responded to the same way.


I'll take a look at this link later and possibly look up that book too. I do want to say a few things, however, that so far every replier seems to be misreading as part of a larger whole.

These people seem to think that humans aren't at fault for the actions of their creations. They seem to think that algorithms are capable of having agency for their actions... This couldn't be further from the case.

As for the content of your reply, thanks for being a little more nuanced. Yes, a self-contained echo chamber is definitely not going to help things. That said, you are effectively saying, "If you proofread your own comments, you risk radicalizing yourself with your own ideas." And beyond that, you are also stating, "Those ideas are just half-baked."

Well, in some people's cases that may be true, but as with most of these arguments, it leaps right past the true actionable agent in the problem and heads straight for blaming some thing that people have little control over. Psychology is one of those things that people have less control over than others would like to believe. I know this because I have had to literally retrain psychologists over stuff like this during the times they figured they were fixing me. HAH, joke's on them.

(It was always something like "Those people don't control your actions," and I would always reply with "When did the chain of actions and consequences begin, in your mind, over this subject matter?" They would usually start it with "me," at which point I would correct them and say, "It starts with the person before me, because my actions are based on the consequences of their actions. So while they may not 'make me' do anything, they certainly present possible options for me to take, and I tend to take the ones they like the least, usually because those are the correct actions.")

Now, from that little bit of a side story, do you see how your opinion of psychology being the answer might be a bit of problem in its own right? Your assumption basically ignores the fact that there is a chain of consequences for every action taken by every single person connected to said subject matter.

So while the person proofreading their comment might be "radicalizing themselves" with their own propaganda, you are making the mistake of thinking that just because they hold an opinion other than yours, it must be bad. This is a common mistake made by... drum roll please... radicalized people. Especially those who are on the internet far more than they should be? I end that with a question mark because it's mostly an observation so far; I can't say for certain that it's the cause of the problem.

And so now we are back to the crux of your point: the half-baked idea that gets reiterated over and over again by being re-read. How half-baked is my reply to you? How half-baked is your reply to me? Was my original comment even half-baked to begin with, or is that just your view of the situation through your own subjective, and thus potentially biased, viewpoint? Get the point?

The problem is many things DyslexicAtheist. But all of them stem from one single source. Humans. This is why I blame them and not the algorithms or the psychology itself. Why?

Because both are creations of humans. Psychology is just an attempt to understand something we didn't create, so we created a method of understanding it. Faulty as it may be, it has its uses. Algorithms are the same thing in a way. We created the fabric of which they exist, but we still don't understand fully how they operate. Or to put it a better way, we don't fully understand yet how to create them to make them operate exactly how we want. And then furthermore from there, we also do know how to create them well enough to make them do exactly what we want from time to time as well; which has its own problems.

Youtube's algorithm is a good example of the creators only having a partial understanding of how their creation works. (If they are to be believed every time they say "we aren't sure how this works")

Twitter's, on the other hand, is a good example of them making an algorithm do exactly what they want it to do, and it does it really well. To horrific results.

Both together are my point, insofar as algorithms and psychology are only part of the overall answer. They are not the be-all and end-all. Humanity is, in regard to this problem. We are the one factor that, if removed, makes all the problems cease to exist. But we can't exactly go removing ourselves... Well, we can... but that has new problems attached to it.

Meanwhile, the people with the half-baked ideas, as you put it, are just stating their opinions. You may call them half-baked, but who are you to judge others' opinions when yours probably aren't that great to begin with either, right?

And that right there is the real source of all the problems: people's egos.

Once you get over your ego, all of this becomes much easier to understand and accept. But getting a human to do that is fucking hard.


I don't think this is fair, and I generally am not a fan of the community either.

I don't think they are trying to impress each other, or are contrarian for the sake of it. I think it's a fair amount of legitimately smart people who don't have much in the way of life experience, may be in part on the autism spectrum, and tend to be engineer/rational types who think systematically more than intuitively. I mean they construct and deal with systems.

This leads to sort of a naivety that tends to look for elegant explanations of a sort and is vulnerable to them. The all-explaining system of thought that in general intellectuals tend to get ensnared by, but is self-contained and self-perpetuating. The idea of the messiness or stickiness or even corrosiveness of life on these systems tends to not be entertained much, as it reduces the power of rationality.

So the rationalists get really vulnerable to these kind of left-field ideas like AI risk, or neo-reaction. These ideas are often rationally elegant in the way many negative systems are; they are a novel way of explaining things. The red pill/manosphere stuff was similar; as a system of thought it's a lot more elegant and explanatory to its audience than the alternatives, and even criticized actual things as well as its own untrue things.

I guess what I'm saying is that the rationalists are more vulnerable to being captured by those contrarian ideas you describe than they are malicious or doing it to flex. It's not really unique to them; academia goes through it, philosophers go through it, artists go through it, etc.

As for the cultish stuff, it's apparent to me that a lot of rationalists are kind of adrift in real life due to the great unpersoning of religion and its replacement with woke or Trumpian politics. That's part of why the term "grey tribe" was coined: neither left nor right. They need meaning as much as anyone.


It seems like your uncharitable reading was the one that was actually intended, which is unfortunate. I guess those are the perils of assuming good faith.

I do think it is good to be precise here, though. The human brain really seems to like viewing groups of people as hiveminds.


It is an evolutionary trade off due to how human beings instinctively handle cognitive dissonance.

It is very amusing, and somewhat depressing, how the most intelligent of individuals can be reduced to a frothing reactionary once you know their life history and deduce their biases.

