If I gave you a gun without a safety, would you be the one to blame when it went off because you weren’t careful enough?
The problem with this analogy is that it makes no sense.
LLMs aren’t guns.
The problem with using them is that a human still has to review the output for accuracy. That gets tiresome, because the whole point of the LLM is to save you the time and effort of doing the work yourself. So naturally people tend to stop checking and assume the output is correct, “because the LLM is so good.”
Then you get false citations and bogus claims everywhere.