
707 points patd | 5 comments
Traster ◴[] No.23322571[source]
I think this is going to be a discussion thread that is almost inevitably going to be a shitshow, but anyway:

There are people who advocate the idea that private companies should be compelled to distribute hate speech, dangerously incorrect information, and harassment, on the theory that free speech should be applied universally rather than just to government. I don't agree; I think it's a vast over-reach, and it's almost impossible to have both perfect free speech on these platforms and actually run them as a viable business.

But let's lay that aside: the people who make that argument claim to hold an even stronger dedication to free speech. Surely it's clear here that having the actual head of the US government threaten to shut down private companies over how they choose to manage their platforms is a far more disturbing and direct threat against free speech, even in the narrowest sense.

kgin ◴[] No.23328982[source]
I think it's even more concerning than that.

Threatening to shut down private companies -- not for limiting speech, not for refusing to distribute speech -- but for exercising their own right to free speech alongside the free speech of others (in this case the president).

There is no right to unchallenged or un-responded-to speech, regardless of how you interpret the right to free speech.

mc32 ◴[] No.23329735[source]
Attaching a disclaimer to the speech of another, though, is not straightforward. Will they get into the business of fact-checking everyone over a certain number of followers? Will they do it impartially worldwide? How can they even be impartial worldwide, given contradictory points of view that are valid from both sides? Cyprus? What's the take there?
eanzenberg ◴[] No.23330344[source]
As they move into a "publisher" role, they will be liable in court.
root_axis ◴[] No.23330801[source]
You're wrong. Stop spreading misinformation.

> "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider." (47 U.S.C. § 230)

1. belorn ◴[] No.23331591[source]
https://en.wikipedia.org/wiki/Section_230_of_the_Communicati...

To sum up: if the platform becomes the "information content provider", defined as "any person or entity that is responsible, in whole or in part, for the creation or development of information", then they lose the protection. The statute also carves out exceptions for federal criminal liability and intellectual property claims.

Moderation alone can amount to creation or development of information, as has been shown in copyright cases. Cutting (deciding what to show and what not to show), re-arranging, or changing the context can create a new original work, which would make the moderator an information content provider for it. At the same time, doing any of those things does not automatically make the moderator a creator of an original work.

As lawyers like to say, it all depends on the details of the specific case. To take an extreme example outside of this Twitter discussion: taking a video interview and cutting it to create a new narrative would make the editor responsible for that whole new version.

2. joshuamorton ◴[] No.23334390[source]
> can exclusively be moderation

But not for the purposes of Section 230. And there's significant precedent to this effect; you'd need a Supreme Court ruling to change it.

3. belorn ◴[] No.23335817[source]
Feel free to link to the Supreme Court ruling whose precedent establishes that creating new derivative works does not make the author an information content provider.

To take a fictional Twitter example: blocking a user from a website is unlikely to create a derivative work. But removing a post in the middle of a Twitter chain that makes up a story could change the narrative and content of that story, and if done intentionally would create a derivative work. The user could then sue Twitter for copyright infringement, and, if the new story is defamatory, under liability laws. We could, for example, imagine a rape story where the post that included the word "Stop" was removed; the author would then have a legitimate legal claim against the moderation.

It all depends on context, intent, and the details of the specific case. The tools of moderation do not define what is legal and what is not.

4. joshuamorton ◴[] No.23338828{3}[source]
It comes down to intent. If the intent of moderation is "taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected", section 230 provides immunity. "Otherwise objectionable" is very, very broad.
5. belorn ◴[] No.23347799{4}[source]
To that I 100% agree. If the intent of the moderation is only to restrict access to or availability of material for those reasons, then that is likely not a derivative work.