
> I think that if a site is bad enough to warrant being taken down, it should be taken down by the authorities that host it.

That is essentially saying that nothing is bad enough to warrant being taken down ever.

People shouldn't have to tour every possible jurisdiction on earth to have something taken down in the jurisdiction they live in.




> It’s for this reason I see this as a failing of the legal system. If a website is so abhorrent as to be illegal, then you should be able to open an investigation, and investigators should be able to get warrants to seize the site hosting structure and take it down.

They have been investigated a lot over the years, the operator works with law-enforcement to provide the information they request, and the moderators are pretty quick at removing any illegal content. One can argue that the police concluding that there's nothing illegal going on is because of their incompetence, but it's more likely that the US holds the 1A sacred and has no legal reason to take it down.


> but I mean you need some censureship else you get child porn up there.

Nobody here objects to websites taking down illegal content. We're complaining about when they take down content that's perfectly legal.


> No content should ever be taken down automatically just because a bunch of random people report it.

Serious question, why not?


> If it isn't illegal or incitement of violence, it doesn't need to be removed.

But it can be removed, because it's one of many millions of privately owned websites on the internet... This is the great thing about the internet: anyone can create a website and run it however they want, and users are free to visit whichever sites they like the most. This is a win-win for everyone except those who want to control what site owners do with their own websites.


> just double down removing offending content and defend from the lawsuits

This is unsustainable, though: there is just too much content to filter. There is also the fact that, lacking such protections, the site will be targeted by bad-faith actors who intend to harm it by posting lots of legally problematic content.


>> There's just no way for a human to know what's in violation and what isn't.

People can host their own stuff. Simple. Don't host content you don't know the origin of and can't document you've got the right to do so.


>If we ask these small companies to take on content removal obligations, we should not expect nuanced decision-making or robust appeal processes. We should expect legal and important sites from across the political spectrum to go down because someone complained about them.

This is a ridiculous statement. Domain registrars are already required (by ICANN) to receive, investigate, and respond to abuse complaints.


> removing such content could possibly be a violation of the First Amendment

No, it couldn't. Websites have the right to remove any content from their websites for any reason. They are not bound by said law.


> It’s not that Google should be forced to carry stuff, they should be forced to not discriminate because they don’t like it. ISIS propaganda is illegal and should be taken down for that reason.

Bright-line rules like that worry me. That is, a lot of stuff is subjective -- the line between "legal" and "illegal" isn't so clear or immutable as one might naively guess. So if something more binary -- like host-or-remove -- is tied to such a fuzzy, dynamic determinant, it'd seem to give rise to all sorts of problems.

For example, say we forced big companies to host all legal content, but remove all illegal content, and then we want to know if something controversial is legal (e.g., taxes on Bitcoin back when it was newer). Then someone could post two images: one telling people to pay taxes on Bitcoin, and another telling people to not pay taxes on Bitcoin. Then the hosting-company would have to remove exactly one of those. By contrast, a hosting-company could normally just remove stuff they're unsure about because they're not required to host legal content, sparing them the burden of having to properly determine the legality of everything.

Basically, the problem is that we'd be stripping hosting-companies of their freedom to operate in safe-waters, forcing them into murky areas and then opening them up to punishment whenever they fail to correctly navigate those murky waters.


> This is to comply with copyright law.

If that's true, it will be a better resource for humanity if it charts a way to be hosted in a jurisdiction where copyright law doesn't require this kind of censorship.


> A website is a public property.

No, it's not. It may be in public view, but that's a different issue.


> child porn, terrorist videos, and copyright violation

One of these is not like the others.

> "such things shouldn't be allowed on the Internet"

This is why people created things like Tor in the first place. The internet is not a country. Nobody should get to decide what is and isn't allowed on the internet.


> I have no interest in fighting with my hosting provider in addition to everything else

This is really the issue here. The hosting provider should err on the side of its customers as opposed to random third parties. There are a lot of edge cases on what can or should be legally hosted. The hosting provider may have the legal right to take down controversial content, but it's an absolutely cowardly thing to do. The free speech-friendly approach would be to side with its customers UNTIL the third party has obtained a court order that the content be taken down, rather than insist that their own customers get a court order insisting that the content stay up.


> Because if it has that right it should have the full responsibility for everything it publishes. With no leeway

I don't understand your reasoning. When you say "full responsibility" I assume you mean legal responsibility, as in, a website that removes posts by users should take full legal responsibility if users post illegal content. But I don't see the connection between the two ideas: the illegal content was still posted by a user, so what about removing posts makes the website legally responsible for the actions of its users? This would also make the site legally impossible to operate; it sounds like you're suggesting any site that removes user content should be pushed into shutting down. Am I misunderstanding?


> You would have to go ever each site scraped and read their terms of service / agreements.

Someone putting up a bunch of words doesn't bind you to a contract with respect to scraping or anything else. (At most they can give up a right. They can put something in the public domain, for example, by giving up their copyright.)

To give some extreme examples: What if the site says you owe them a million dollars if you even look at the site? What if someone put a sign on their lawn that says anyone stepping on the grass can be legally shot? Are you OK with those just because they said so?


>I think many people aren't going to stress too much about breaking a site's T&C because a) nobody ever reads them and b) they're practically unenforceable in many places.

Tell that to Aaron Swartz.


>...shouldn't take every measure possible in preventing its platform from hosting some of the most abusive and criminal content on the planet.

If you want to genuinely prevent the most abusive and criminal content on the planet, then, as the comment you're replying to suggests, the only sure way is to not allow it to be hosted, connected to, or shared anywhere across the internet. But since you can't ensure that, we default to the internet being the medium that makes it possible, so...


> "I don't agree with you -> I take down your site". It's not a civilized reaction.

Yeah, it's just not right to engage in SOPA-like activity.


>there's nothing giving you the right to visit a website

Except for the fact that it is being purposefully made available to the general public.

