45 points gmays | 43 comments
1. throwup238 No.41916343
> Sarcasm, cultural context and subtle forms of hate speech often slip through the cracks of even the most sophisticated algorithms.

I don't know how this problem can be solved automatically without something that looks a lot like AGI and can monitor the whole internet to learn the evolving cultural context. AI moderation feels like self driving cars all over again: the happy path of detecting and censoring a dick pic - or self driving in perfect California weather - is relatively easy but automating the last 20% or so of content seems impossibly out of reach.

The "subtle forms of hate speech" is especially hard and nebulous, as dog whistles in niche communities change adversarialy to get past moderation. In the most subtle of cases, there are a lot of judgement calls to make. Then each instance of these AGIs would have to be run in and tailored to local jurisdictions and cultures because that is its own can of worms. I just don't see tech replacing humans in this unfortunate role, only augmenting their abilities.

> The glossy veneer of the tech industry conceals a raw, human reality that spans the globe. From the outskirts of Nairobi to the crowded apartments of Manila, from Syrian refugee communities in Lebanon to the immigrant communities in Germany and the call centers of Casablanca, a vast network of unseen workers power our digital world.

This part never really changed. Mechanical Turk is almost 20 years old at this point, and call center outsourcing is hardly new. What's new is just how much human-generated garbage we force them to sift through on our behalf. I wish there was a way to force these training data and moderation companies to provide proper mental health care.

replies(8): >>41916410 #>>41916493 #>>41916524 #>>41916596 #>>41916819 #>>41917288 #>>41917660 #>>41917936 #
2. datadrivenangel No.41916410
There's also the issue of things that are true and mean/hateful.

If my GP says that I'm overweight, which is associated with negative health outcomes, that's factual. If someone on twitter calls me a fatso, that's mean/hateful.

3. drivebyhooting No.41916493
What is a dog whistle? Is it just an opinion people disagree with and so rather than engage with it they assume malice or ill intent?

I really don’t get it.

replies(4): >>41916512 #>>41916517 #>>41916570 #>>41917697 #
4. No.41916512
5. kevingadd No.41916517
https://en.wikipedia.org/wiki/Dog_whistle_(politics)

To simplify: a literal dog whistle makes a sound pitched too high for humans to hear, but dogs can hear it.

So it's speech that the speaker's ingroup recognizes as meaning something other than what the literal interpretation would mean. It's coded speech, usually for racist, sexist or even violent purposes.

An adjacent concept is giving orders without giving orders, i.e. https://en.wikipedia.org/wiki/Will_no_one_rid_me_of_this_tur...

6. whiplash451 No.41916524
The difference between adult material detection and self driving is that the former is fundamentally adversarial.

Humans will spend a lot of energy hiding porn on the internet, while self-driving might benefit from a virtuous circle: once enough Waymos are out there, people will adapt and learn to drive/bike/walk alongside them. We have a fundamentally good reason to cooperate.

I am not a self-driving fanatic but I do believe that a lot of edge cases might go away as we adapt to them.

replies(1): >>41917128 #
7. drivebyhooting No.41916570
Yes I too can google. And in fact I did. Evidently questioning the validity of dog whistle is a dog whistle itself.

Meanwhile a small consolation is that https://slatestarcodex.com/2016/06/17/against-dog-whistles/ agrees with me. So I’m in decent company.

replies(2): >>41916822 #>>41917419 #
8. hn_throwaway_99 No.41916596
> In the most subtle of cases, there are a lot of judgement calls to make.

IMO there is the even more important point that, beyond being a "judgement call", humans are far from agreement on what the "right answer" is here. It is inherently an impossible problem to solve, especially at the edge cases.

Just look at the current debate in the US. There are tons of people screeching from the right that large online social networks and platforms censor conservative views, and similarly there are tons of people screeching from the left about misinformation and hate speech. In many cases they are talking about the exact same instances. It is quite literally a no-win situation.

9. hcurtiss No.41916819
I think there's a genuine conversation to be had about whether there even is such a thing as "hate speech." There's certainly "offensive speech," but if that's what we're going to try to eliminate, then it seems we'll have a bad time as the offense is definitionally subjective.
replies(5): >>41916885 #>>41916918 #>>41917083 #>>41917089 #>>41917466 #
10. Lerc No.41916822
Ever watch the West Wing? The first episode should do it.

One of the signs of dog whistle use is when a term keeps showing up in situations that raise the probability of its use beyond any credible level of coincidence.

replies(1): >>41917040 #
11. yifanl No.41916885
Is the claim that there is some special property that makes it impossible to convey hate, as opposed to any other type of idea, through text?

That seems extremely wrong, especially in this context, given that LLMs make no attempt to formalize "ideas", they're only interested in syntax.

replies(1): >>41917277 #
12. szundi No.41916918
There is hate speech, like when someone tells people that other people are not human and must be eliminated. It happened a lot historically, and it is happening now in wars you read about.
replies(1): >>41917406 #
13. drivebyhooting No.41917040
I don’t watch TV.

To my mind, this dog whistle moniker is more of a tool for suppressing dissenting views than identifying covert bigotry.

Apparently all the critical thinking has already been done off stage and now only those whom we agree with are tolerated. The others are shunned as racists or worse.

replies(3): >>41917346 #>>41917497 #>>41918027 #
14. o11c No.41917083
I'm not sure "offensive" is actually subjective. Rather, I dare say it's morally obligatory to be offensive at times, but different communities put the line in different places.

Stating the position "torture is bad" is enough to get you banned from some places (because it's offensive to people who believe that it's okay as long as the victims are less-than-human).

15. mewpmewp2 No.41917089
There is a definition for hate speech though.
replies(2): >>41917142 #>>41918427 #
16. nradov No.41917128
Animals, small children, and random objects dropped on the road will never "adapt" to self-driving. Good enough solutions will eventually be found for those scenarios but it's exactly those millions of different edge cases which make the problem so hard. A step ladder that falls off a work truck (like I saw on the freeway yesterday) isn't exactly "adversarial" but it will sure ruin your car if you run over it.
replies(1): >>41917451 #
17. reginald78 No.41917142
Actually, I think the problem is there are many definitions of hate speech.
replies(1): >>41917213 #
18. mewpmewp2 No.41917213
I think there's only one main definition, which is clear in spirit, though it's of course possible that people may interpret it differently.
replies(1): >>41917439 #
19. mewpmewp2 No.41917277
Maybe the name for the "hate speech" is poorly chosen, since it's not necessarily about "hate".
replies(1): >>41917572 #
20. tomjen3 No.41917288
By definition, a dog whistle appears not to mean anything specific to anyone but the target group. So even human moderators can't moderate it.
21. neaden No.41917346
This is a world where Donald Trump, the man who, among other things, calls his political opponent mentally disabled, is a serious front runner for the highest office in the nation. I have no idea how you can look at contemporary America and say people aren't allowed to say offensive things.
22. epicureanideal No.41917406
But when "hate speech" becomes censorable and a crime, then people are incentivized to interpret as broadly as possible their opponents' statements and claim they should be interpreted as dehumanizing or encouraging violence.

This can be done from both sides. Examples:

Not sufficiently (for whoever) enforcing immigration laws? "Trying to eliminate the majority population, gradual ethnic cleansing".

Talking about deporting illegal immigrants? "The first step on the road to murdering people they don't want in the country."

And if the local judiciary or law enforcement is aligned with the interests of one side or the other, they can stretch the anti hate speech laws to use the legal system against their opponents.

replies(1): >>41918417 #
23. arp242 No.41917419
> Evidently questioning the validity of dog whistle is a dog whistle itself.

I don't see anyone saying that.

24. jacobr1 No.41917439
That may be so, but the degenerate (and, as this thread suggests, common) case is to expand the notion to any offensive speech that is disliked by the offended person. That is much more subjective and hard to define. The fact that we have some (better) definitions doesn't really help. The desire to censor speech is widespread, for different reasons, many conflicting. And the fact that there might be a rough academic consensus on where to draw the lines (at least theoretically, if not practically) isn't good enough in practice to actually define clear rules.
replies(1): >>41919422 #
25. shadowgovt No.41917451
Animals, small children, and random objects dropped on the road don't adapt to human driving either; they aren't generally considered the core concern space. If it is physically possible for a self-driving car to do better than a human in those contexts, it will, but the project isn't designed around doing better than a human in such corner cases. Doing worse than a human is not acceptable, though.
26. danans No.41917466
> I think there's a genuine conversation to be had about whether there even is such a thing as "hate speech."

It may be fuzzy at the far edges, but any speech that calls for the elimination of, marginalizes, dehumanizes, or denies the human or civil rights of a group of people is right at the heart of the meaning of hate speech.

That definition still leaves huge amounts of space for satire, comedy, political and other forms of protected speech, even "offensive speech".

replies(2): >>41917666 #>>41918399 #
27. shadowgovt No.41917497
> Apparently all the critical thinking has already been done off stage

In general, yes: there is a long history of conversation on various topics, actions that have caused trust levels to be preset among various groups, and meta-symbols constructed atop that information. Those new to the conversation may be unaware of the context.

> and now only those whom we agree with are tolerated

I'm not sure who "we" is in that context. In the US, currently, the polity is very divided because several key events have, in a sense, caused a "mask off" moment in the mainstream of both political parties, making it difficult for anyone to believe either one is willing to share power.

(as a side note: rhetorical questions don't usually convey well through text media. If you didn't literally mean "I really don't get it" when you said you didn't get it, making clear you are being rhetorical could be considered polite).

replies(1): >>41917634 #
28. yifanl No.41917572
I mean, what's the claim then? That there's no such thing as an illegal idea? You can't assign a semantic value to a legal system.
29. drivebyhooting No.41917634
“Those new to the conversation”: I find this hilariously un-inclusive. Do you expect young adults to come out of school with the correct doctrine in place, so that no further thinking or discourse is necessary?

Perhaps the nation’s division is evidence of a lack of genuine sharing of ideas? Where would one go to have an intellectual discussion in safety? The workplace? Obviously not. An online forum? Downvotes, brigading, and a general lack of tolerance.

Small wonder that I’m not being persuaded and neither are you.

replies(1): >>41917743 #
30. HPsquared No.41917660
This adversarial dynamic could be called the "two words problem", after the woman in Russia who was arrested, courtesy of human moderators, for holding a sign that said "two words" (an indirect reference to the phrase "no war").
31. samatman No.41917666
> the elimination

Yep, that's bad alright.

> marginalizes, dehumanizes

This is the part which means anything that authorities or other powerful groups need it to.

32. samatman No.41917697
Dog whistling is when a politician from the outgroup says something, and the ingroup wants him to have said something else, so they say "dog whistle" and put the offending meaning in his mouth as a replacement for what was actually said.
33. shadowgovt No.41917743
> Do you expect young adults to come out of school with the correct doctrine in place so that no further thinking or discourse is necessary?

Definitely not. I do expect them to listen before speaking out. It was a hard lesson I myself had to learn when I was one of those young adults coming out of school. Sometimes, conventional wisdom is just accrued prejudice. Sometimes it is accrued experience and people are as they are for a reason. It's probably best to have enough information to know before staking a position openly and pushing other people off their own.

> Where would one go to have an intellectual discussion in safety?

Traditionally? The bar. I'm not even kidding. This is the kind of thing people discuss face-to-face most effectively. We do less of that these days.

34. ben_w No.41917936
I get the point, and agree that the opponents are in an adversarial relationship with the AI, but LLMs are already an AI that "can monitor the whole internet to learn the evolving cultural context".

Consider, for example, that ChatGPT wasn't specifically designed to be good at programming Commodore 64 BASIC, which is a niche within a niche, but it can do that fine even when instructed in Welsh*. If it can do that, then surely it can spot these things too?

> In the most subtle of cases, there are a lot of judgement calls to make.

I agree; while they know a lot, they know it poorly, and make decisions unwisely.

> I wish there was a way to force these training data and moderation companies to provide proper mental health care.

Good news, there is. An old flame used to work in a call center, ended up unionising the place.

Bad news (from the POV of many here): she's literally a communist — and that's not a metaphor for "Democrats", she thinks the Dems are evil neoliberals.

* I've not actually tried running this, because on a related note, can anyone recommend an emulator that will let me paste in text as if I was typing the content of the pasteboard on the keyboard?

https://chatgpt.com/share/6717ff4e-db08-8011-8f2c-a33fa9653a...

35. thefaux No.41918027
I'm not here to change your mind, but dog whistling is a time honored political tradition. Lee Atwater famously explained how it works in a 1981 interview. It gives you plausible deniability: you wrap sentiments that would be offensive to some of your supporters if said directly in a more indirect and/or abstract form, such that those you're trying to reach will fill in the parts you left out.

A relatively small share of people openly identify as racist, but many, if not most, people hold at least some racist views, since these are the cultural waters we swim in. Dog whistling lets you have it both ways. When called out, the offender can always say: that's not what I meant, or I was just joking. Then they can accuse the others of deliberately misconstruing their statements. How the listener responds is largely a function of their prior beliefs. Again, most people don't want to think of themselves as racist, so they will be generous to the dog whistler, since admitting there was racism (or whatever ism) in the statement of someone they support would implicate them. And the people it was intended for will believe that the dog whistler is denying it not because he doesn't believe it but because he needs to do so politically.

36. skeeter2020 No.41918399
>> the elimination, marginalizes, dehumanizes or denies human or civil rights

But you've already lumped together a huge range of behaviours and impacts. Elimination? OK, we can probably broadly define that, but I just heard news reports with quotes of Israelis calling for the elimination of Hamas, and Iran the elimination of Israel. How do we handle that? Marginalized? As defined by whom? What about marginalizing undesirable behaviours or speech? What does "dehumanize" mean? Whose definition of human or civil rights?

replies(1): >>41919489 #
37. skeeter2020 No.41918417
> This can be done from both sides.

You are seeing this EXACT thing in the Middle East right now.

38. baggy_trough No.41918427
What do you think the definition is?
replies(1): >>41919197 #
39. mewpmewp2 No.41919197
Any speech (or, well, any communication) that intends to incite violence or general harm against a group of people, or a person, with certain characteristics such as age, sex, orientation, race, etc.

E.g. "X race/gender/sexual orientation are bad for the society for reason Y, and therefore they should be treated with Z (a negative consequence)"

So: intending to call for harm because of certain inherent characteristics that a group of people have, characteristics that are not themselves harmful to society.

replies(1): >>41919396 #
40. belorn No.41919396
People will endlessly argue over what "incite violence or general harm" actually means, and they will also endlessly argue over whether something should be considered an implied characteristic of "age, sex, orientation, race, etc".

Currently there is an ongoing court case about a person who criticized and made offensive statements about Muhammad, and whether that should count as inciting violence or general harm against Muslims. One side argues that any negative statement about Muhammad is a veiled statement directed against Muslims as a group; the other side argues that it is criticism of the religion, not of the people who believe in it. People made similar arguments about the Monty Python movie Life of Brian.

When it comes to symbols like flags, people often characterize any action (positive or negative) as a form of hate speech if they dislike it, or as an important symbolic gesture if they like it. Burning flags often gets called hate speech, and forbidding people from waving flags (including general rules against all flags at specific events) has also been called hate speech.

replies(1): >>41919504 #
41. mewpmewp2 No.41919422
The spirit needs to be understood clearly: hate speech uses forms of communication to cause harm to a person or group of people over certain inherent characteristics that are not harmful to society, harm that manifests itself as violence or discrimination.
42. danans No.41919489
> Elimination? OK, we can probably broadly define that, but I just heard news reports with quotes of Israelis calling for the elimination of Hamas, and Iran the elimination of Israel. How do we handle that? marginalized? as defined by who?

I'd call both of them hate speech without qualification. But between countries, there's no legal system that would rule on speech (only actions, like the ICJ tries to adjudicate).

> What about marginalizing undesirable behaviours or speech?

What is an example of undesirable behavior being undertaken by a group that would warrant their marginalization as a group? I'm having a hard time finding one. Calling out a group of racists or bigots (based on their words) for what they are isn't marginalization.

> What does "dehumanize" mean?

This has a very straightforward definition:

https://www.merriam-webster.com/dictionary/dehumanize

> Whose definition of human or civil rights?

In the US context, this is also well defined:

https://www.findlaw.com/civilrights/civil-rights-overview/wh...

43. mewpmewp2 No.41919504
What was the negative statement? To me the logic is that:

1. Criticising a religion != hate speech, or generally making fun of or criticising any sort of religious figure like Muhammad or Jesus != hate speech.

2. Calling out for a group of people of certain religion to have X negative consequences = hate speech.