
45 points gmays | 1 comment
throwup238 ◴[] No.41916343[source]
> Sarcasm, cultural context and subtle forms of hate speech often slip through the cracks of even the most sophisticated algorithms.

I don't know how this problem can be solved automatically without something that looks a lot like AGI and can monitor the whole internet to learn the evolving cultural context. AI moderation feels like self driving cars all over again: the happy path of detecting and censoring a dick pic - or self driving in perfect California weather - is relatively easy but automating the last 20% or so of content seems impossibly out of reach.

The "subtle forms of hate speech" is especially hard and nebulous, as dog whistles in niche communities change adversarialy to get past moderation. In the most subtle of cases, there are a lot of judgement calls to make. Then each instance of these AGIs would have to be run in and tailored to local jurisdictions and cultures because that is its own can of worms. I just don't see tech replacing humans in this unfortunate role, only augmenting their abilities.

> The glossy veneer of the tech industry conceals a raw, human reality that spans the globe. From the outskirts of Nairobi to the crowded apartments of Manila, from Syrian refugee communities in Lebanon to the immigrant communities in Germany and the call centers of Casablanca, a vast network of unseen workers power our digital world.

This part never really changed. Mechanical Turk is almost 20 years old at this point, and call center outsourcing is hardly new. What's new is just how much human-generated garbage we force these workers to sift through on our behalf. I wish there were a way to force these training-data and moderation companies to provide proper mental health care.

replies(8): >>41916410 #>>41916493 #>>41916524 #>>41916596 #>>41916819 #>>41917288 #>>41917660 #>>41917936 #
hcurtiss ◴[] No.41916819[source]
I think there's a genuine conversation to be had about whether there even is such a thing as "hate speech." There's certainly "offensive speech," but if that's what we're going to try to eliminate, then it seems we'll have a bad time as the offense is definitionally subjective.
replies(5): >>41916885 #>>41916918 #>>41917083 #>>41917089 #>>41917466 #
danans ◴[] No.41917466[source]
> I think there's a genuine conversation to be had about whether there even is such a thing as "hate speech."

It may be fuzzy at the far edges, but any speech that calls for the elimination of, marginalizes, dehumanizes, or denies the human or civil rights of a group of people is right at the heart of the meaning of hate speech.

That definition still leaves huge amounts of space for satire, comedy, political and other forms of protected speech, even "offensive speech".

replies(2): >>41917666 #>>41918399 #
skeeter2020 ◴[] No.41918399[source]
>> the elimination, marginalizes, dehumanizes or denies human or civil rights

but you've already lumped together a huge range of behaviours and impacts. Elimination? OK, we can probably broadly define that, but I just heard news reports with quotes of Israelis calling for the elimination of Hamas, and Iran calling for the elimination of Israel. How do we handle that? Marginalized? As defined by whom? What about marginalizing undesirable behaviours or speech? What does "dehumanize" mean? Whose definition of human or civil rights?

replies(1): >>41919489 #
danans ◴[] No.41919489[source]
> Elimination? OK, we can probably broadly define that, but I just heard news reports with quotes of Israelis calling for the elimination of Hamas, and Iran the elimination of Israel. How do we handle that? marginalized? as defined by who?

I'd call both of them hate speech without qualification. But between countries, there's no legal system that would rule on speech (only actions, like the ICJ tries to adjudicate).

> What about marginalizing undesirable behaviours or speech?

What's an example of undesirable behavior undertaken by a group that would warrant marginalizing them as a group? I'm having a hard time coming up with one. Calling out a group of racists or bigots (based on their words) for what they are isn't marginalization.

> What does "dehumanize" mean?

This has a very straightforward definition:

https://www.merriam-webster.com/dictionary/dehumanize

> Whose definition of human or civil rights?

In the US context, this is also well defined:

https://www.findlaw.com/civilrights/civil-rights-overview/wh...