> Modern LLMs are fine-tuned for safety and instruction-following, meaning they are trained to refuse harmful requests.
It's sad that it's now an increasingly accepted idea that information one seeks can be "harmful".
replies(5):
> It's sad that it's now an increasingly accepted idea that information one seeks can be "harmful".
It is indeed a problem that LLMs can instill a false sense of trust because they will confidently hallucinate. I see it as an education problem. You know and I know that LLMs can hallucinate and should not be trusted blindly. The rest of the population needs to be educated on this fact as well.