
707 points patd | 5 comments
falcolas No.23322696
Free speech is not just an American constitutional right; many countries throughout the world consider free speech to be a human right.

So, yeah, many of us get a bit worked up when people are kicked off platforms, because they are being silenced, sometimes to the point of being shut out of the modern internet entirely (when even their domain names are refused DNS service).

Hate speech and lies are terrible, but they’re not the only thing being silenced.

replies(5): >>23322837 #>>23322861 #>>23322910 #>>23327959 #>>23329690 #
Traster No.23322910
Okay, so I think there's some nuance there, and a pragmatic line to draw. I don't think someone has a right to say anything on Twitter; I just don't think it's Twitter's role to be neutral. But there's a line where we go from a product that's curated and moderated (something like Twitter) to something that is truly infrastructure. The DNS example is a good one: I don't think a DNS provider should be able to refuse service based on the content being served, because the role of DNS is simply to resolve a name to an address. What's served at that address is immaterial. I think we can draw a bright line between those two kinds of things, although I'm sure it's harder than that when we actually try to design a law.
replies(4): >>23323827 #>>23324451 #>>23327978 #>>23328614 #
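
A small aside on the "resolve a name to an address" point: that mapping really is all a lookup does, which is why the content question never enters the picture at that layer. A minimal sketch in Python using only the standard library (the domain below is just a placeholder, and getaddrinfo simply returns whatever the system's configured resolver reports):

    import socket

    def resolve(name):
        """Return the IP addresses a name resolves to.

        The lookup sees only the name; the content served at the
        resulting addresses plays no part in it.
        """
        results = socket.getaddrinfo(name, None, proto=socket.IPPROTO_TCP)
        return sorted({info[4][0] for info in results})

    print(resolve("example.com"))  # placeholder domain
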
RandomTisk No.23323827
Then Twitter has to lose its protections as a 'carrier' and become a publisher, with all the regulation that goes along with that.
replies(1): >>23324345 #
1. Traster No.23324345
No, they don't. People seem to have this idea that either you are liable for nothing and control nothing, or you are liable for everything and control everything. The point of these platforms is that whilst they allow users to post under limited conditions, they don't exercise any pre-publication editorial control. That is a material difference from a publisher. They also aren't totally agnostic to content (like a DNS service is). This attempt to hold user-generated content to the same standard as news organisations is clearly ridiculous, and I don't know why people keep trying to apply it. It's a great way of ensuring that no regulation will ever be applied, since the suggested level of regulation completely destroys the business model of several hundred-billion-dollar businesses.
replies(1): >>23324581 #
2. falcolas No.23324581
This concept is already enshrined in law as "safe harbor". So long as a service provider doesn't do its own curation, it is not held responsible for the content that is posted. However, if it does curate, then it is responsible.

Applying this to Twitter, Facebook et al. is not that big of a leap.

> completely destroys the business model of several hundred billion dollar businesses

They are not entitled to their business model, especially not at the price of trampling upon something broadly considered to be an inherent human right.

replies(1): >>23328193 #
3. three_seagrass No.23328193
>So long as a service provider doesn't do their own curation, they are not held responsible for the content that is posted.

Except they are held responsible if they don't curate. Look at laws like SESTA to see how platforms that don't self-curate content sexualizing minors are held legally liable.

I'm not saying SESTA is bad; I'm saying this idea that platforms need to be hands-off about curation to maintain safe harbor protection is not true.

replies(1): >>23329339 #
4. falcolas No.23329339
You’re conflating removing illegal content with removing legal content that someone doesn’t like.
replies(1): >>23330891 #
5. three_seagrass No.23330891
Which is exactly the point: platforms that do not self-curate some types of user-generated content are not protected by safe harbor laws.

Your idea that safe-harbor laws only protect platforms that don't self-curate is absurd precisely because there is illegal content for which the platforms themselves, rather than their users, can be held liable.