
Human (quarter--mile.com)
717 points | surprisetalk | 2 comments
notepad0x90 ◴[] No.43991964[source]
Why are emotions so special? they're just algorithms like any other. Emotions aren't what make humans different than machines. feeling something is similar to an LLM model reacting to a prompt a certain way. Just because chatgpt is trained to not "feel" anything (to avoid controversial output) doesn't mean LLMs can't feel things like we do. self-awareness, self-training, adaptability, original thinking, critical thinking,etc.. are different questions. but I see no reason why machines can't receive input/stimuli and react/output by the same way we do because of how they feel about the input.
replies(5): >>43992068 #>>43992108 #>>43992143 #>>43994759 #>>43995050 #
jplusequalt ◴[] No.43994759[source]
>Why are emotions so special? they're just algorithms like any other

Nobody understands what emotions are, nor can anyone predict which emotion someone will feel in a given situation, or how they'll act under the influence of that emotion. Emotions aren't the mechanism by which humans solve problems; rather, they are often an obstacle to overcome. Emotions also aren't "finite" or "rigorous", as those terms aren't applicable to ephemeral phenomena.

This is the kind of confidently incorrect statement people who work in software make that irks me. Not everything in life has a nice and simple parallel to computer science. Just because a person can abstract well about one subject doesn't mean their tools of abstraction can be applied to all other subjects.

replies(2): >>43995199 #>>43995273 #
bsza ◴[] No.43995273[source]
No one can predict what an LLM will say in a given situation either, except by running it. No one can even predict what a double pendulum will do next. If anything makes emotions exclusive to us at all, it’s certainly not predictability.
replies(1): >>43995437 #
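The double-pendulum point is the standard illustration of deterministic chaos: the system is fully described by simple equations, yet two runs whose initial angles differ by ten nano-radians end up in completely different states after a few seconds. A minimal sketch, using the well-known equal-mass, equal-length equations of motion and RK4 integration (all names and parameters here are illustrative, not from the thread):

```python
import math

G, L1, L2, M1, M2 = 9.81, 1.0, 1.0, 1.0, 1.0  # gravity, lengths, masses

def deriv(s):
    """Angular accelerations for an equal-mass double pendulum."""
    t1, w1, t2, w2 = s          # angles and angular velocities
    d = t1 - t2
    den = 2 * M1 + M2 - M2 * math.cos(2 * d)
    a1 = (-G * (2 * M1 + M2) * math.sin(t1)
          - M2 * G * math.sin(t1 - 2 * t2)
          - 2 * math.sin(d) * M2 * (w2**2 * L2 + w1**2 * L1 * math.cos(d))
          ) / (L1 * den)
    a2 = (2 * math.sin(d) * (w1**2 * L1 * (M1 + M2)
          + G * (M1 + M2) * math.cos(t1)
          + w2**2 * L2 * M2 * math.cos(d))
          ) / (L2 * den)
    return (w1, a1, w2, a2)

def rk4_step(s, dt):
    """One classical Runge-Kutta step."""
    add = lambda s, k, h: tuple(si + h * ki for si, ki in zip(s, k))
    k1 = deriv(s)
    k2 = deriv(add(s, k1, dt / 2))
    k3 = deriv(add(s, k2, dt / 2))
    k4 = deriv(add(s, k3, dt))
    return tuple(si + dt / 6 * (a + 2 * b + 2 * c + e)
                 for si, a, b, c, e in zip(s, k1, k2, k3, k4))

def simulate(theta1, steps=2000, dt=0.01):
    """Integrate from a motionless start at angles (theta1, pi/2)."""
    s = (theta1, 0.0, math.pi / 2, 0.0)
    for _ in range(steps):
        s = rk4_step(s, dt)
    return s

a = simulate(math.pi / 2)             # baseline run
b = simulate(math.pi / 2 + 1e-8)      # perturbed by 10 nano-radians
print(f"separation after 20 s: {abs(a[0] - b[0]):.3f} rad")
```

Early on the two trajectories are indistinguishable; by the end of the run the tiny perturbation has been amplified by many orders of magnitude, which is exactly the sense in which "no one can predict what it will do next" without actually running it.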
jplusequalt ◴[] No.43995437[source]
Okay, semantics aside, my comment was getting at something else: I'm arguing against a certain kind of reductive viewpoint software peeps tend to indulge in.
replies(1): >>43995633 #
bsza ◴[] No.43995633[source]
I agree with that sentiment, I just don’t see OP making that kind of reductionist claim. They said emotions are algorithms, not that the algorithm is simple or even deterministic.