
586 points mizzao | 1 comment
olalonde No.40667926
> Modern LLMs are fine-tuned for safety and instruction-following, meaning they are trained to refuse harmful requests.

It's sad that it's now an increasingly accepted idea that information one seeks can be "harmful".

stainablesteel No.40670974
Very well said, actually.

The censorship frames everything as YOU being the problem. How dare YOU and your human nature think of these questions?

Well, it's human nature that's kept us alive for the last million years or so; maybe we shouldn't try to censor our instincts.