
In what sense is Facebook not a publisher? Their algorithm acts as an editor, choosing what to show me. If they had a simple chronological feed, then the platform argument would make sense.

If the NYT created a service where the articles I see were selected algorithmically, would they suddenly not be a publisher?




The "publisher" versus "platform" distinction is 100% a made-up distinction to motivate bad §230 takes.

What §230 does, very simply, is say that websites hosting user-generated content are not liable for that content, even if they moderate it. It was passed in response to a pair of court decisions that concluded that a website that moderated content (for example, weeding out profanity or pornography) was liable for all content posted, while a website that provided no moderation whatsoever wasn't liable.


Thanks for the clarification.

I finally looked up the actual text of §230 and it says this:

> No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.

So I guess the NYT would be responsible for articles they generate but I suppose they would get a pass for anything they re-publish (like from a wire service).


> So I guess the NYT would be responsible for articles they generate but I suppose they would get a pass for anything they re-publish (like from a wire service).

Exactly. Though, interestingly, the second part of your statement is only true for the online edition. For NYT-on-paper, they're liable for all of it. The same goes for the comments section: online, it's covered by s.230; offline, the 'letters to the editor' section in print is the responsibility of the paper.


Because s.230 expressly provides for them not to be treated as one. The worry at the time was that information services making editorial decisions (taking down harmful content, in particular) would be treated as publishers, and so liable for what was left up. That creates an obvious moral hazard problem, encouraging bulletin boards and web hosts to refuse to even look at what's being posted, to avoid liability. So s.230 was added to the Communications Decency Act to make clear that the legal responsibility would fall only on those originally providing the information.

This situation isn't mirrored outside the US, FWIW. IIRC England & Wales will impose liability for libels etc., but only if the host had actual or constructive knowledge of the content of the post and chose to let it stay up. That introduces quite a lot of legal uncertainty and a bias towards deleting controversial material but may be better overall. I don't really know.

