
425 points karimf | 2 comments
miki123211 No.45656279
> Try asking any of them “Am I speaking in a low voice or a high voice?” in a high-pitched voice, and they won’t be able to tell you.

I wonder how much of that is LLMs being bad at it, and how much is LLMs being (over-)aligned not to do it.

AFAIK, ChatGPT Voice Mode had to have a lot of safeguards put on it to prevent music generation, accent matching (if you sound Indian, it shouldn't also sound Indian), and assuming ethnicity / biasing based on accents.

It doesn't seem that impossible to me that some of these behaviors have been aligned out of these models out of an abundance of caution.

tsol No.45656667
Did they respond differently depending on what race they thought you were? I'm surprised they would even do that, honestly. I thought they were trained on text conversations, which presumably wouldn't have any of that to learn from.
thwarted No.45656985
If it did, it responded based on the accent it picked up on, not race; race and accent are orthogonal, and correlation does not imply causation.
dotancohen No.45659653
Are you denying that race and accent are highly correlated?
thwarted No.45663273
No, I'm saying that it is more meaningful to use what is directly derived (the accent) rather than what is an indirect assumption (the race). There are already issues with people erroneously treating whatever LLMs output as truth; the last thing anyone needs is an LLM claiming someone like Idris Elba is a white Briton because of his accent. We don't need automated phrenology machines, and that's pretty close to what "determined your race from your voice" would be.