
Completely agree. I'm simply pointing to someone who is actually tracking censorship activities. It would be nice to see an inclusive data set so that we humans could make our own decisions.



The perspective of the specific people who might be censored seems pretty valuable, even if the Open Rights Group delivers a clearer argument.

Yes more censorship is what we need.

We need more censorship not less. Good call.

Is "cracking down on coordinated bs" not also censorship?

How about letting people simply evaluate the information for themselves?


It's a hard problem, but I'd argue it's one where effort and research have been effectively moribund since around the time everyone dropped POP3 for their email.

My entire problem with censorship, philosophically, is some unwanted third party making demands on my information input. The goals of moderation need not be at odds with my goal to determine for myself what I wish to read. Categorization and shared filter lists could go a long way here.
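The "categorization and shared filter lists" idea above can be sketched in a few lines. This is a hypothetical illustration, not any real platform's API: posts carry topic tags, publishers maintain filter lists, and each user chooses which lists to subscribe to, so filtering happens on the reader's side rather than by a third party.

```python
# Hypothetical sketch of user-chosen shared filter lists. All names
# (FILTER_LISTS, visible, the tags) are illustrative assumptions.

FILTER_LISTS = {
    # Each publisher-maintained list maps a name to the topic tags
    # that its voluntary subscribers want hidden from their own feed.
    "no-spam": {"spam", "scam"},
    "no-politics": {"politics"},
}

def visible(post_tags, subscriptions, lists=FILTER_LISTS):
    """Return True if none of the post's tags hit a subscribed list."""
    blocked = set()
    for name in subscriptions:
        blocked |= lists.get(name, set())
    return not (set(post_tags) & blocked)

posts = [
    {"text": "Buy cheap pills!", "tags": ["spam"]},
    {"text": "Election roundup", "tags": ["politics", "news"]},
    {"text": "New kernel release", "tags": ["tech"]},
]

# A reader subscribed only to "no-spam" still sees political posts;
# nothing is removed for anyone who didn't opt in.
feed = [p["text"] for p in posts if visible(p["tags"], ["no-spam"])]
print(feed)  # ['Election roundup', 'New kernel release']
```

The point of the sketch is that moderation metadata (the tags and lists) is shared, while the decision to filter stays with the individual reader.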


What kind of alternative would you like to see? How do you propose to evaluate if it's preferable to censorship?

The system is working great except there isn't enough censorship?

It's simple: many people are fine with censorship as long as they're in charge of deciding what gets censored.

I agree, and this probably isn't good; but I want accuracy in what's being said when it comes to censorship. We shouldn't win against censorship through hyperbole, but by the unfiltered reality being so horrible that people wouldn't want it anyway.

Exactly! I’m being downvoted, but I’m sincerely suggesting that we train a full spectrum of ideologically biased censorship engines and then let people pick which ones they want to use.

It’s no different from parental filters.


You're making the mistaken assumption that the people who need to think critically have all of the information, or even the right information, to do so. Most people make judgements and come to conclusions based on incomplete data. They don't have time to exhaustively investigate every topic and verify that all of their sources are flawless. You're asking for perfection, and that's impossible.

This is not an argument for censorship, by the way; it's an argument for systematizing trustworthy sources of information. Someone will care enough to get it right and fair, so we should promote them and ignore the others.


I think we need more censorship quite frankly.

It seems dangerous to be so condoning of censorship. Let people make up their own minds about the validity/importance of the content. The more information that is available the better.

I wish they'd release the whole list of banned topics so people could decide for themselves what was suppressed and why, instead of having it drip-fed to them.

My opinion is based on the fact that I just saw a room full of people cheer for the censorship of politicians.

I'm not of the opinion that reduced visibility and annotations will solve the problem. I just don't trust people like the above to know where the slope is, and so I don't trust them with tools for censorship.


I want accountability, not censorship.

I accept the basis of your argument - that point scoring is unhelpful.

However, I feel you're missing the point when discussing censorship.

If censorship is to be justly applied, there must be some objective measure of what is offensive - which then must be applied equally regardless of political alignment. This is definitely not happening at the moment.

Personally, I think a better perspective is to reduce and/or avoid censorship altogether, especially as far as internet infrastructure is concerned. Corporate entities shouldn't be enforcing the views of their stakeholders on the individual.

Free speech is free speech, even if I don't like what you're saying.


maybe we just need a better system to measure 'trusted speech'?

Censorship is a tool, but when all you have is one tool... well...

We need better tools, not more censorship!


On a practical basis, I would agree, as somebody who fits a number of protected categories and has had to work around this.

Ethically, this is why we have organizations like the ACLU, the ADA, and many others to serve as the enforcement with teeth. It's an uphill battle. I would disagree in the sense that pushing back against censorship - particularly of minorities - has many battlegrounds, not just enforcement with teeth: greater social awareness and acceptance, education (the history of blacklisting sexual minorities, for example), and grassroots activism such as simply supporting authors at risk of being deplatformed. There's no sense in ignoring a tool that's available.

