
A minimal level of protection against lawsuits like the one that just killed Omegle.



Thanks for the info. Interesting. I can see the merit in having broad legal protection to deter malicious activity, but that does seem a little too broad.

If the goal is to avoid capricious removal of content then this is great. If the goal is to avoid the platform being used for harm then this seems insufficient. The law cannot move fast enough to stay current with social trends.

The same one that allows them to sue people for scraping their website.

Tech laws are absurd.


Just eliminate the various liability shields that have been enacted for third-party content. In certain areas (e.g. the personals website crackdown) we are moving in this direction already.

If you make a search engine vulnerable to a libel lawsuit because, for example, their search results make it look like Joe So-and-So was arrested for DUI (when it was actually Joe So-and-Sew or some such thing), they'll just stop indexing that stuff entirely.

Best to avoid creating new laws where old legal concepts will work fine.


Since most if not all of these platforms are based in countries more friendly towards infringement it’s really not going to do much of anything other than be another law that gets selectively applied and abused.

It's a pretty essential law for maintaining the internet as we know it. If providers could be sued for content provided by others that passes through their services, then we can't really have an open internet. ISPs would be afraid to allow anyone to connect because they could be sued for what their customers do. Search engines would be afraid to index sites because they could be sued for any misinformation on them. GitHub would be afraid to host projects because it could be sued if any of them violated copyright. It would be a very different internet.

I wish they would address real problems like arbitrary enforcement of TOS that damages users with no recourse. There’s no regulation for what happens when Google or Facebook or whatever deletes or blocks an account and provides no support.

Child abuse is serious, but not that common and this law will do little to change that. The lack of a UCC-style law for big tech platforms affects way more people.

This seems like BS that will squelch small players that can't afford to comply, and consolidate more power into a few large firms.


Jesus, that's so repressive. And I'm sure the laws are pretty subjective as well, or at least not something the average chat admin is fully up to speed with. Essentially the outcome of this will be to kill chats or send them underground to Tor or something.

I am not sure why it ever survived. Every other piece of that law was dismantled. At the very minimum, it needs to be reformed so that Big Tech companies that take advantage of these protections lose them upon evidence of clear abuse, bias, or refusal to allow opposing viewpoints.

The problem here is that it isn't attempting to prevent harm. The whole purpose for it is that it gives the government more power over people online. Harm prevention is just a front.

Why would this necessarily mean they're getting rid of all legal immunity? I think we have plenty of examples outside of the digital world where legal immunity is maintained.

The digital world currently has an excessively powerful version of this -- even when they're not neutral, they're still immune, which definitely needs to be fixed.


More like "stop a bill from being passed into law which may shut us down, along with the rest of the internet as you know it" - do you have any idea how much labor it would take to run a search engine, video site, or social networking site if you were liable for the legal status of all user-created content? Practically every site you use would go under, including the one we're on right now.

Not sure how that would stop misuse and abuse on the internet.

It's not like the bad guys are suddenly going to respect the law.


While I want the web to be maximally accessible to all individuals, it seems like these lawsuits have had a chilling effect on things like OpenCourseWare and open access to university courses.

The purpose of the law is clear though: it recognizes the need of platforms to moderate problematic content and thus provides immunity when platforms moderate such content - effectively, it's an incentive for platforms to provide moderation.

Even the US has legal mandates to take down certain content in any case - CSAM, IP violations (DMCA), and terrorist content come to mind.


I think we have to be careful about making companies liable for everything their users upload. While companies like Google and Facebook have the computational capacity to scan everything uploaded to them, what about startups?

It shouldn't mean companies aren't liable for failing to remove content deemed illegal, but making it illegal for any single thing to slip through the cracks seems harsh.


Your client could stop visiting those sites, because they would disappear anyway if this law were repealed.

It probably means changing laws to allow users to hold internet platforms liable for harms caused. All parties can then have their day(s) in court. That seems fair to me.

I really can't be bothered to defend platforms like Reddit or Facebook from directives or laws, but what I do care about and hold dear is the user experience, which will suffer greatly if the directives pass as proposed. In the end, it's the users who will bear the brunt of legislation like this by losing access and tools to communicate. That is the crux of this issue: legislation rarely does what its primary intention is, and ends up harming the public the most. After all, laws are very blunt objects for dealing with soft matters, because they are enforced by fines and penalties, and the weakest members, usually individuals, are least prepared to defend their rights. Companies can hire lawyers or even shift focus to something else. Users will lose everything.
