
707 points by patd | 12 comments
Traster ◴[] No.23322571[source]
I think this discussion thread is almost inevitably going to be a shitshow, but anyway:

There are people who advocate the idea that private companies should be compelled to distribute hate speech, dangerously false information, and harassment, on the theory that free speech should apply universally rather than just to government. I don't agree; I think it's a vast overreach, and it's almost unachievable to have both perfect free speech on these platforms and actually run them as viable businesses.

But let's lay that aside: the people who make this argument claim to adhere to an even stronger dedication to free speech. Surely it's clear that having the actual head of the US government threaten to shut down private companies for how they choose to manage their platforms is a far more disturbing and direct threat to free speech, even in the narrowest sense.

replies(42): >>23322601 #>>23322660 #>>23322889 #>>23322983 #>>23323095 #>>23323271 #>>23325355 #>>23327443 #>>23327459 #>>23327625 #>>23327899 #>>23327986 #>>23328982 #>>23329094 #>>23329143 #>>23329230 #>>23329237 #>>23329375 #>>23329616 #>>23329658 #>>23329911 #>>23330257 #>>23330267 #>>23330422 #>>23330438 #>>23330441 #>>23331115 #>>23331430 #>>23331436 #>>23331462 #>>23331469 #>>23331944 #>>23332090 #>>23332213 #>>23332505 #>>23332858 #>>23332905 #>>23332934 #>>23332983 #>>23333360 #>>23341099 #>>23346876 #
1. hadtodoit ◴[] No.23329094[source]
If companies are going to self-moderate their platforms, then they should not receive any kind of legal protection from liability for user-generated content. I wholly believe companies have every right to dictate what is on their platforms, but they cannot have it both ways: if you can afford to moderate content you disagree with, you can afford to moderate illegal content as well.

If I own a store and someone injures themselves on the premises, I am held liable for that. I did not force that person to enter the store, but the benefits of having a store outweighed the risks. Why should internet companies receive special treatment? They should be 100% liable for what happens on their "premises" if they are going to take the risk of allowing user-generated content.

replies(6): >>23329175 #>>23329190 #>>23329219 #>>23329418 #>>23330218 #>>23349070 #
2. tootie ◴[] No.23329175[source]
Do they have protection right now? Platforms are already held responsible for illegal activities and are subject to requests from law enforcement and copyright holders. They're generally given a chance to respond to a request, to challenge it through established channels, and to pursue appeals. But they would eventually be culpable if they weren't compliant.
replies(1): >>23329261 #
3. azinman2 ◴[] No.23329190[source]
Because the scale makes this nearly impossible. Or rather, extremely expensive to the point where only the biggest of companies can do so, and at the cost of real-time information.
replies(1): >>23329325 #
4. bredren ◴[] No.23329219[source]
This presumes equal weight of all content. Some content gets far more attention and thus must face a higher degree of scrutiny. This is the only way to curate at scale.

Apple does this with the App Store, where it is possible to get away with breaking App Store rules if an app is not downloaded very often. It is not worth Apple's time and energy to challenge apps that no one is downloading in the first place.

On Twitter, degree also has to matter with regard to illegal content. How illegal and reprehensible is it? How often is the tweet being requested?

replies(1): >>23329410 #
5. hadtodoit ◴[] No.23329261[source]
Sounds like you're referring to the DMCA, whereas I was referring to heinous crimes like drug/sex/child trafficking. They do have protection right now in either case.
replies(1): >>23329465 #
6. hadtodoit ◴[] No.23329325[source]
How often are people bootstrapping a social media site? That's not something you roll out on a tight budget. Most websites do not allow user-generated content at all, so this won't have nearly as big an effect on the sector as you think.

Whether they are paying people or writing automated systems to remove content they disagree with, these companies argued for this legal protection on the grounds of protecting free speech. Now that they want to restrict speech, they don't deserve those same protections.

7. hadtodoit ◴[] No.23329410[source]
Twitter already has some automated method of determining whether a tweet is NSFW, and it is accurate enough that I didn't even realize they allowed that content. They can figure out how to filter illegal content as well.
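
I obviously don't know Twitter's internals, but the shape of such a filter is simple: score each post and hide anything over a threshold. A toy Python sketch, with made-up terms and weights (a real system would presumably use a trained classifier rather than a word list):

    # Toy threshold-based content filter -- not Twitter's actual system,
    # just the general shape of one. Terms and weights below are made up.
    FLAGGED_TERMS = {"badterm": 0.6, "worseterm": 0.9}
    THRESHOLD = 0.8

    def score(text: str) -> float:
        # Sum the weights of flagged terms in the post, capped at 1.0.
        words = text.lower().split()
        return min(1.0, sum(FLAGGED_TERMS.get(w, 0.0) for w in words))

    def should_hide(text: str) -> bool:
        return score(text) >= THRESHOLD

    print(should_hide("a perfectly innocuous tweet"))    # False
    print(should_hide("badterm and worseterm at once"))  # True

The hard part is choosing the terms and weights (or training the classifier), not the plumbing.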
replies(1): >>23331783 #
8. RubberSoul ◴[] No.23329418[source]
Store owners, at least in the US, are not 100% liable for injuries on their property. Their liability depends on several factors, which include the reasonableness of their behavior and the behavior of the visitor.
9. tootie ◴[] No.23329465{3}[source]
I'm referring to both. But in both cases, the content hosts don't get punished instantly. They are served with notice of offending content and given a chance to comply. The host and the creator both have avenues of appeal. At least, in the US they do. It varies country to country.
10. beart ◴[] No.23330218[source]
If you take that to the extreme, then someone running a forum for young kids would not be allowed to remove pornographic material, lest they be held liable for all other inappropriate content that gets posted.
11. bredren ◴[] No.23331783{3}[source]
I believe this is the basis for conservative opinion on this. The trouble is, even offline there is no universal 'filter' for illegality.

Law enforcement must also work at scale, and focus on illegal behavior that is having the most impact.

When a court finds this power has been used improperly, as with the arrest of Stormy Daniels in Columbus, Ohio, there are penalties.

For something like this to stand, I believe conservatives will have to produce major examples of conservative bias. Unfortunately, the tweets in question so far will not be great evidence of that.

12. tzs ◴[] No.23349070[source]
> If companies are going to self-moderate their platforms then they should not receive any kind of legal protection from user-generated content. I wholly believe companies have every right to dictate what is on their platform but they cannot have it both ways. If you can afford to moderate content you disagree with, you can do so for illegal content as well.

So if I run a chess forum and disallow posts that are not related to chess, your belief is that if one of my users posts a libelous statement about another user's alleged conduct during a chess game at a tournament in their city, I should be on the hook for the first user's post?

If I can afford to spend maybe 20 minutes a day reviewing all posts that keyword-based scanning suggests might not be about chess, should I have been able to fly to the city that tournament was in and conduct an investigation into whether what the user said was true before allowing the post to stay up on my forum?
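
For concreteness, the kind of cheap scan I have in mind is nothing fancier than the following; the chess word list and the flagging rule are made up for illustration:

    # Toy off-topic flagger for a chess forum: queue any post that
    # contains none of a small chess vocabulary for human review.
    # Purely illustrative; the word list and rule are made up.
    import re

    CHESS_WORDS = {"chess", "opening", "endgame", "knight", "bishop",
                   "rook", "pawn", "checkmate", "castling", "elo"}

    def looks_off_topic(post: str) -> bool:
        words = set(re.findall(r"[a-z']+", post.lower()))
        return not (words & CHESS_WORDS)

    posts = ["The Najdorf is a sharp opening.",
             "He cheated at the tournament last week."]
    review_queue = [p for p in posts if looks_off_topic(p)]
    print(review_queue)  # only the second post lands in the review pile

The scan is the cheap part; nothing in it can tell me whether the accusation in that second post is true.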