Nice.
This is different from human hallucinations, where someone makes something up because of something wrong with the mind rather than some underlying issue with the brain's architecture.
> In the field of artificial intelligence (AI), a hallucination or artificial hallucination (also called confabulation,[1] or delusion)[2] is a response generated by AI that contains false or misleading information presented as fact.[3][4]
You say
> This is different from human hallucinations, where someone makes something up because of something wrong with the mind rather than some underlying issue with the brain's architecture.
For consistency you might as well say everything the human mind does is hallucination. It's the same sort of claim, and that claim at least has the virtue of having been taken seriously by people like Descartes.
https://en.wikipedia.org/wiki/Hallucination_(artificial_inte...
It's possible LLMs are lying, but my guess is that they really just can't tell the difference between what's true and what they've made up.