>Let science do its thing - which is inform and educate. Let policies be made by those with more full understanding of the system into which changes need to be effected.
But what about when scientists collaborate with the policy-makers to shape public perception about how to inform and educate people?
One prime example is last year's long and detailed study on how there was no "gay gene". In effect, homosexual activity and inclinations could not be shown to be the result of any specific gene expression, after many trials and searches for one.
The scientists, worried about how this result would look to the public, worked with LGBTQ advocacy groups to shape how the paper would be released to the media and explained to the public[1].
Now, a sizable portion of HN's demographic will feel this is all well and good, but how far should this go? The uncomfortable truth here is that people are not "born gay" as we have been told so often. And that has been a very important talking point in the past two decades of debate about gay rights. Many people were convinced to support gay rights for this reason alone (however we might feel about that, it's true). They very well may feel duped by scientists and no longer "trust the science". How far should scientists go to shield the public from uncomfortable truths? And how much will the public continue to trust them when such veiling of the truth is made known to them?
> Science should be the beginning of a chain starting at knowledge and leading to policy
I sympathise with the goal of separating science from cultural politics. It feels like I'm against the tide here, but I actually think the great majority of the guidelines strengthen this chain by promoting clarity and eliminating cultural assumptions. For example, specifying that the commonly accepted and/or scientific meanings of words like "sex" and "race" are used, and that the language adheres to bias-free practices.
A thought experiment: if you were an editor and received a manuscript which was perfectly fine up until the last line, which said "and as we all know, homosexuals are weak-willed sissies", would you still go ahead and publish if the author completely refused to remove that line?
>Isn't this the definition of how science should work.
There are two problems. The first, and easier, is that #3 doesn't always happen. The data gets slightly cooked, the study gets redone, or explanations are given to explain away the results. How this is done ranges from outright dishonest to quite acceptable.
But the bigger problem is that some hypotheses are never thought up or tested, or are tested only rarely: namely, those that would be political or career suicide, but also those that could hurt funding, or that would have dissuaded a person from ever becoming a researcher. For example, say you wanted to test the hypothesis that gay women were more violent toward their domestic partners than straight men, straight women, or gay men.
Finally, even when results are found, they will be attacked as untrustworthy, because the author must've been biased to even go looking for such results.
> However I also think Science does need oversight and agreement from society on what is acceptable or not, as history has often shown left uncontrolled bad things can happen.
I'm curious as to what you mean by this. Are you talking about avoiding doing or publishing research that could have undesirable social effects? E.g., in a hypothetical world where it had been discovered in 1980 that there was a gene that made African Americans less bright than white Americans, should that research not have been published?
Or is it that you are worried about biased scientists faking results to harm disadvantaged minorities?
>> Look, science doesn’t make everything clear cut 100% if the time. Science is the act of making theories, and looking for evidence to dispute those theories. Sometimes, in some areas we have to be OK with not understanding things. That’s part of science, it’s not it’s limitation. By admitting ignorance in some areas, we open the doors to the pursuit of knowledge.
> I think that's the key takeaway of the article, to be honest.
If that's the key takeaway, why write an article about it? That's how science is supposed to function. It's not controversial.
>> How about instead we suspend our beliefs and wait for data, rather than trusting someone just because? It’s okay to say, we don’t know!
> This is what I think the author is saying, though. It's a political piece--we can't universally suspend our beliefs for the purpose of making policy. But we can try to evaluate what science is a closer approximation to "true" than others.
It would be REALLY helpful if more politicians suspended their beliefs for the purpose of making policy. And if there's an area where scientific consensus has been reached, work your policies around that instead.
>> Because none of those other fields embraced abandoning beliefs in the face of evidence and data. How one could even compare them with science clearly demonstrates and attempt to misconstrue facts.
> You can say this with the benefit of hindsight. If you were, say, an American social scientist in the 1920s, you might be supporting eugenics, as it was quite common at the time. If you dogmatically defended eugenics as being "science," called people who questioned the value of it deniers, and equated it with truth... well, you would be wrong. And if you can acknowledge that it happened in the 1920s (and numerous other times), then it can be happening today. It's about being open-minded to the possibility that science (and its process in practice) has varying levels of value, and being able to rationally assess that is important.
Eugenics wasn't science... it was an attempt to use artificial selection to mold human society. The artificial selection is the science part; its use as justification for eugenics was just abhorrent. As for things like phrenology, etc., those were never accepted as science... just touted as such by pseudo-scientists.
> “We are deeply concerned by the implication that managers are not responsible for homophobia.”
So now you're no longer just responsible for your own prejudice and your own actions, but also responsible for other people's prejudice and other people's actions.
It's also funny that scientists are often quick to criticize non-experts for weighing in on their area of expertise, but seem comfortable claiming authoritative knowledge of the historical facts and ethical arguments in this case, while ignoring the judgment of actual historians.
> The science is indeed pretty clear, even though many do not like that.
It's evidence for how much political influence is in academia today. In my country of origin, Sweden, believing this kind of science is very politically incorrect.
It's stupid. Of course our brains are wired differently; the brain is the most important organ in our bodies. Who really cares if we prefer different things? We should celebrate our differences, not fight them.
It's funny when people try to fight our biology; it's a losing cause and you will most likely just make people unhappy.
>I always cringe slightly when I see a headline like this on HN, imagining it feeding unfair prejudices about protected groups
"Protected" by whom, and from whom? Whose "unfair prejudices" are really at issue here? I always cringe when scientific studies or other bits of objective reality are suppressed or downplayed because of the prejudices of people who believe that groups arbitrarily cobbled together by those who think they know better somehow need to "be protected".
>You be you, but please remember that prejudices and confirmation bias are problems
If someone has a problem with science and objective reality, that is a personal problem they should deal with, not a problem with science and objective reality. Unfortunately there is an ever-growing mentality (especially among younger people) that if objective reality is somehow offensive to one's sensibilities, then it should be ignored or explicitly rejected altogether.
> take "inclusion", there's a lot of work showing that a more diverse, inclusive population leads to better results (in social siences etc.) so it's obvious for me that people will use it more
I imagine it would be difficult getting a paper published that shows otherwise since diversity, equity, and inclusion have seemingly become tantamount to unquestionable religious dogma.
For example, you have the Biden administration publishing this blatantly contradictory guidance:
> The scientific integrity principles and best practices identified in the report aim to ensure that science is conducted, managed, communicated, and used in ways that preserve its accuracy and objectivity and protect it from suppression, manipulation, and inappropriate influence—including political interference.
>
> […]
>
> Scientific integrity policies should be modernized to address important, emergent issues of our time. They must advance diversity, equity, inclusion, and accessibility
Then you have Nature with this “ethics guidance” that seems to suggest it would be considered unethical to publish a study that challenges the present narrative of diversity, equity, and inclusion:
> Academic content that […] promotes privileged, exclusionary perspectives raises ethics concerns that may require revisions or supersede the value of publication.
> Science has for too long been complicit in perpetuating structural inequalities and discrimination in society. With this guidance, we take a step towards countering this.
> Consideration of risks and benefits (above and beyond any institutional ethics review) underlies the editorial process of all forms of scholarly communication in our publications. Editors consider harms that might result from the publication of a piece of scholarly communication, may seek external guidance on such potential risks of harm as part of the editorial process, and in cases of substantial risk of harm that outweighs any potential benefits, may decline publication (or correct, retract, remove or otherwise amend already published content).
> Regardless of content type (research, review or opinion) and, for research, regardless of whether a research project was reviewed and approved by appropriate ethics specialists, editors may raise with the authors concerns regarding potentially sexist, misogynistic, and/or anti-LGBTQ+ assumptions, implications or speech in their submission; engage external ethics experts to provide input on such issues as part of the peer review process; or request modifications to (or correct or otherwise amend post-publication), and in severe cases refuse publication of (or retract post-publication) sexist, misogynistic, and/or anti-LGBTQ+ content
> "I believe this is a public perception problem."
I see it differently. The problem isn't the public. The public has actually done a commendable job sensing what's been going on. It's not fair - however unintentional - to tie the public to the problem. It's bad enough that Science keeps denouncing the public when it's Science that has mucked up its own reputation. The public is a mirror.
Science, much like journalism, has been drinking too much of its own Kool Aid. It has forgotten it can't just talk, but it needs to walk that talk, as well.
Science would also be wise to show some humility and admit when it's gotten something less than perfect. Yes, that is part of the process. Does Science understand that? Instead, it just sweeps those cases aside and keeps professing its eternal greatness, and how we should bow our heads, etc.
Bill Nye? Neil deGrasse Tyson? These are the faces of Science? Methinks they got the wrong end. They are doing more harm than good.
> If scientists are trying to warn the public about the dangers of climate change are they trying to manipulate public opinion, educate the public, or convince the public?
Depending on the person, the methods used, and the level of integrity maintained, some combination of all three.
If you suppress legitimate criticism and intentionally distort facts, you are engaging in trickery.
If you correct misinformation and do your best to present an accurate representation of your understanding, you are educating.
Generally, scientists tend to do a pretty good job of focusing on education, but the dynamics of the discussion around the information they share tend to cloud that distinction.
The problem is that many groups have decided that trickery is more convincing than education, and that this justifies compromising ethics and integrity. (While other groups seem to have had no integrity to start with.) As a result, discussion of the distinction between education and trickery, and accusations of trickery, often drown out the actual attempts at education.
> Logically, you'd think that science and information sharing would make us all more liberal and taboo-proof, but I'm not sure that is happening.
I think you're right - but I also think you're underestimating the lack of science-minded individuals among those who are less liberal (such as in the US).
> This is especially on the Left, who are too ready and willing to spark that ruin, viewing legal systems and power only as a tool to control their opposition, who has valid stances on myriad topics. On the Right, they complain a lot, but at least they aren't using any means necessary to actively change the fundamental course of people's lives
I think by saying this, you invalidated all your other points, because you made it political and showed an agenda, which gives the impression you're arguing in bad faith.
If you look at it politically, it is just as true, and historically much more so, I feel, of the American right, which has normally had the church and the puritans on its side, and has always fought to get sex, LGBTQ people, minorities, and even science removed from popular channels of discourse.
Also, as someone who follows the science, I'm not seeing 100k+ researchers with published data refuting the science. Doctors are not all scientists. Most practitioners have their own belief systems, often shaped by their own daily biases. And in the medical science community, it turns out, those things are being investigated, but they are still far from being unambiguous and evidence-backed. The policy makers are taking risks one way or another, because they're having to make decisions with limited scientific analysis, data, and experiments.
When you think of scientific consensus, what do you think it means? It is a confidence score on the current state of data and experiment: an assessment of what is most likely true given what we know today. You can take bets against it, because it deals in probabilities and not certainties; science, unlike religion, is always probabilistic. But if you're a policy maker, taking a bet against the current odds seems like a gamble you shouldn't be making.
Thus from my stance, this is all political, and science is doing the right thing, and most policy makers that follow scientific consensus are just playing it safe with the odds.
The people who are trying to discredit science and the consensus are taking political bets: they want to discredit things to gain power. In the off chance the consensus is wrong, they win big politically; in the far more likely case that they're wrong, most people won't notice and it won't make the news.
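That framing of policy as a bet against the odds can be sketched as a toy expected-value comparison. Every probability and payoff below is invented purely to illustrate the point; nothing in the thread supplies real numbers.

```python
# Toy expected-value sketch of "betting against the consensus".
# All probabilities and payoffs are made up for illustration only.

p_right = 0.9  # assumed chance the scientific consensus is correct

# Hypothetical payoffs to a policy maker (arbitrary units):
follow_if_right, follow_if_wrong = 1.0, -1.0   # modest, safe outcomes
defy_if_right, defy_if_wrong = -2.0, 5.0       # big political win only if consensus fails

ev_follow = p_right * follow_if_right + (1 - p_right) * follow_if_wrong
ev_defy = p_right * defy_if_right + (1 - p_right) * defy_if_wrong

print(f"follow: {ev_follow:+.2f}, defy: {ev_defy:+.2f}")
```

With these made-up numbers the output is `follow: +0.80, defy: -1.30`: the safe bet dominates, and defying the consensus only pays off in the unlikely branch, which is exactly the gamble being described.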
> So read the science. Listen to the science. But read up further, and make educated decisions. Don't just listen to "experts" blindly.
This is not feasible advice. I can't read studies and correctly interpret and summarize them in every area of science which could affect my day-to-day decisions. That's insane.
We need to work on improving the trust of our scientific institutions, so that we can continue living our lives and focusing our efforts on our specializations. This may involve changing the institutions themselves to fix legitimate issues (like the funding fiasco), addressing misunderstandings by the public that also contribute to mistrust, etc.
There's no reason to throw the baby out with the bathwater. As with many issues of our day, the challenge will be in disciplined focus on the issues themselves and what changes we should make to address them, instead of surrendering to tribal bickering.
> Laymen might think that's a bad thing. But it's not... it's exactly how science is supposed to work. The problem is, people need to be told an absolute, but science can't do that for the most part.
Scientists have my full support to stumble around in the dark and figure things out. But it's not the laymen who desperately need certainty. It's the scientists themselves when they deign to make policy demands based on these half-baked theories du jour.
> The general public are not scientists, yet they distrust institutions
Of course. They'll lose trust the soonest when they see political statements coming out of the WHO, etc. And they should. An organization that can't stay neutral can't do good work in a politicized climate.
> Few can look at data and know what it doesn’t answer, or what questions should be asked but aren’t being
No, this is actually one of the easiest things to teach. "If you were paid to sneak in and ruin this team's research, what simple bias could you add? If it was the other way around, how would you protect against this?"
Teaching people to recognize a poorly controlled study is not difficult. But it's not PC these days, because people are supposed to shut up when told that all scientists agree! We did this to ourselves a decade ago by tolerating the climate-change debacle's attacks on dissent; now everything is fought through fake consensus.
> Correlation is not causation, but that escapes almost all of us
When it's not, it almost always means there's an unseen factor, so that wonderfully dismissive statement is usually just a red herring.
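That "unseen factor" is the classic confounder: a hidden common cause can make two otherwise unrelated quantities correlate strongly. A minimal simulation, with variables and numbers invented only for illustration:

```python
# A hidden common cause (temperature) drives both variables;
# neither causes the other, yet they correlate strongly.
import random

random.seed(0)
n = 10_000
temperature = [random.gauss(20, 5) for _ in range(n)]  # the unseen factor
ice_cream_sales = [t + random.gauss(0, 2) for t in temperature]
drownings = [t + random.gauss(0, 2) for t in temperature]

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

print(round(pearson(ice_cream_sales, drownings), 2))  # strong positive correlation
```

Under these parameters the correlation is about 25/29 ≈ 0.86 in expectation, even though the two series never influence each other; controlling for temperature would make it vanish.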
> The people who can (or do) communicate with the public are ...
Lying. They're lying. They're saying things they can't know as if they're fact. The WHO isn't saying that a lab leak seems less likely than other things, they're saying it didn't happen. Unless they can point to what did happen then logically they're not able to make that claim.
People might be scientifically illiterate, but luckily spotting lies is much easier and gets you mostly halfway there, able to spot when you're being manipulated even if you can't tell what the truth is. The real problem the media is having is that people aren't quite as dumb as expected.
> Even better vaguely word it so it resonates with main stream media
This explanation doesn’t resonate with me. Research most often has incremental results that need to be carefully qualified. Isn’t the far bigger problem that mainstream media takes subtle research results and “simplifies” them for the public by adding certainty and often mis-interpreting the results completely?
Political agendas have recently been systematically trying to erode trust in science. (Because science and truth do threaten some politicians.) The idea that science can’t be trusted as the high-level summary is exactly what some people want, and it seems to be working. But what is the alternative? We have nothing better. The point of science is to try to protect against motivation and agenda, and it does work sometimes. Even when people are motivated, when the methods are reproducible and the results are peer-reviewed, that does help filter out some of the badness. And if it’s not enough: what should we do to improve it?
> It’s potentially damaging to say “most of science is wrong” and just stop there. That’s a misleading framing in my opinion. In order to fix the funding problems, society as a whole needs to have trust in science, to believe that the majority of people doing science are politically impartial and also not wasting money or lining their own pockets, to believe that scientific progress is human progress.
Science is losing the battle for public trust because it wants to simultaneously be an infallible source of truth and this messy, chaotic discipline where we tumble towards an approximate answer. It gets defined as one or the other when it's convenient.
In the first breath:
Oh, X% of all published papers are wrong? No big deal, that's just how science works. Can we have another 100 billion of taxpayer dollars please?
In the second breath:
The Science Says vaccines are safe and effective. Take it or get fired.
> Science is basically the most censored field out there.
Science isn't censored. It's the opposite. It is open and it is tested. You are allowed to claim/hypothesize whatever you want. And you and others are allowed to test it.
There was a time when science was censored. Such as when people started to hypothesize that the earth revolved around the sun. Or when Germany started censoring "Jewish science".
You are conflating "testing one's claims" with censoring one's claims. Science doesn't censor.
[1] https://www.nature.com/articles/d41586-019-02585-6