
168 points | 1wheel | source
byteknight ◴[] No.40429457[source]
This reminds me of how people often communicate to avoid offending others. We tend to soften our opinions or suggestions with phrases like "What if you looked at it this way?" or "You know what I'd do in those situations." By doing this, we subtly dilute the exact emotion or truth we're trying to convey. If we modify our words enough, we might end up with a statement that's completely untruthful. This is similar to how AI models might behave when manipulated to emphasize certain features, leading to responses that are not entirely genuine.
replies(2): >>40429580 #>>40429898 #
nathan_compton ◴[] No.40429580[source]
Counterpoint: "What if you looked at it this way?" communicates both your suggestion AND your sensitivity to the person's feelings, social standing, or whatever else is at stake. Given that humans are not robots but social, psychological animals, such communication is entirely justified and efficient.
replies(2): >>40429636 #>>40429721 #
harshaxnim ◴[] No.40429721[source]
Sadly, "sensitivity" has been overdone. It's a fine line, and corporations would rather cross it for legal/social reasons. Just as too much political correctness hampers society, so does overdone sensitivity in an agent, be it a human or an AI.
replies(1): >>40430039 #
nathan_compton ◴[] No.40430039{3}[source]
That might be the case, but who determines how much is too much, and how? In the case of AI, letting the market decide seems like the right answer.