223 points by benkaiser | 1 comment
MrQuincle No.42542757
"I've often found myself uneasy when LLMs (Large Language Models) are asked to quote the Bible. While they can provide insightful discussions about faith, their tendency to hallucinate responses raises concerns when dealing with scripture, which we regard as the inspired Word of God."

Interesting. In my very religious upbringing I wasn't allowed to read fairy tales. The danger was that I wouldn't be able to tell which stories truly happened and which didn't.

This might make an interesting variant of the Turing test: can you make the AI believe in your religion? There's probably a sci-fi book written about it.

replies(3): >>42542806 #>>42547437 #>>42554257 #
1. aptsurdist No.42542806
To be fair, the Bible's authors also seem to have hallucinated the word of God, at least in the cases where authors contradict one another.