
560 points whatsupdog | 4 comments
Fricken ◴[] No.45173525[source]
[flagged]
replies(3): >>45173713 #>>45175697 #>>45179471 #
fluoridation ◴[] No.45173713[source]
Why are you dancing around the issue instead of answering the question? What was the point of making that comment? "Giving a history lesson" is simply bullshit. If you want to say that you're pro-censorship then just say it.
replies(1): >>45173792 #
Fricken[dead post] ◴[] No.45173792[source]
[flagged]
fluoridation ◴[] No.45173866[source]
If the price of free speech is that racists have free speech, then I'm willing to pay it.
replies(1): >>45173906 #
1. fluoridation ◴[] No.45174039[source]
Interesting. So do you think if Facebook hadn't existed the obvious racial tensions also wouldn't have existed?
replies(1): >>45175118 #
2. fluoridation ◴[] No.45175236{3}[source]
Not to the point of one group wanting to enact genocide on another, so that's completely irrelevant, and you know it. If Facebook hadn't existed, would the genocide still have happened? Since you seem to be in a deflecting mood, I'm going to answer for you: yes. If you have one group that hates another, removing a technological means of communication solves nothing.
replies(1): >>45176076 #
3. Fricken ◴[] No.45176076{4}[source]
The Rohingya claim they've been living in the region for over a millennium, and presumably they were able to go the whole way without getting genocided, up until Facebook arrived, that is.

>For years, Facebook, now called Meta Platforms Inc., pushed the narrative that it was a neutral platform in Myanmar that was misused by malicious people, and that despite its efforts to remove violent and hateful material, it unfortunately fell short. That narrative echoes its response to the role it has played in other conflicts around the world, whether the 2020 election in the U.S. or hate speech in India.

>But a new and comprehensive report by Amnesty International states that Facebook’s preferred narrative is false. The platform, Amnesty says, wasn’t merely a passive site with insufficient content moderation. Instead, Meta’s algorithms “proactively amplified and promoted content” on Facebook, which incited violent hatred against the Rohingya beginning as early as 2012.

>Despite years of warnings, Amnesty found, the company not only failed to remove violent hate speech and disinformation against the Rohingya, it actively spread and amplified it until it culminated in the 2017 massacre.

https://www.pbs.org/newshour/world/amnesty-report-finds-face...

Here's the report:

https://www.amnesty.org/en/documents/asa16/5933/2022/en/

replies(1): >>45176499 #
4. fluoridation ◴[] No.45176499{5}[source]
None of that refutes what I said.