> Modern LLMs are fine-tuned for safety and instruction-following, meaning they are trained to refuse harmful requests.
It's sad that it's now an increasingly accepted idea that information one seeks can be "harmful".