I appreciate the effort they put into providing a neutral tool that hasn't been forced to behave like a generic "Sue from HR".
replies(3):
The most recent news about chatbots is that ChatGPT coached a kid on how to commit suicide.
Two arguments come to mind. 1) It's the sycophancy; Nous and its ilk should be considered safer. 2) It's poor alignment; a better-trained model like Claude wouldn't have done that.
I lean #2
Maybe not every tool is meant for children or the mentally ill? When someone lets their kid play with a chainsaw, that doesn't mean we should ban chainsaws; it means we should ban lousy parents.