
Hermes 4

(hermes4.nousresearch.com)
202 points | sibellavia | 1 comment
mapontosevenths No.45069190
I appreciate the effort they put into providing a neutral tool that hasn't been generically forced to behave like "Sue from HR".
replies(3): >>45070045 >>45070200 >>45076115
bckr No.45076115
I’m having a hard time not being sarcastic here.

The most recent news about chatbots is that ChatGPT coached a kid on how to commit suicide.

Two arguments come to mind: 1) it's the sycophancy, so Nous and its ilk should be considered safer; 2) it's the poor alignment, and a better-trained model like Claude wouldn't have done that.

I lean toward #2.

replies(2): >>45077511 >>45080008
mapontosevenths No.45080008
> The most recent news about chatbots is that ChatGPT coached a kid on how to commit suicide.

Maybe not every tool is meant for children or the mentally ill? When someone lets their kid play with a chainsaw, that doesn't mean we should ban chainsaws; it means we should ban lousy parents.