> Modern LLMs are fine-tuned for safety and instruction-following, meaning they are trained to refuse harmful requests.
It's sad that the idea that information one seeks can be "harmful" is now increasingly accepted.
replies(5):
The censorship frames everything as if YOU are the problem. How dare YOU and your human nature think of these questions?
Well, it's human nature that's kept us alive for the last million years or so; maybe we shouldn't try to censor our instincts.