
277 points simianwords | 1 comment | source
1. charcircuit No.45149197 [source]
They shouldn't frame hallucination as a solvable problem if they want a useful model (saying "I don't know" to every question is not useful). The training data may be wrong or out of date, and even doing a web search can surface a common misconception instead of the actual answer.
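A toy sketch of the tradeoff this comment alludes to (mine, not the commenter's; the eval set, policy names, and numbers are all hypothetical): a policy that always abstains scores zero hallucinations but answers nothing, while a policy that trusts imperfect sources is maximally useful but inherits whatever errors the sources contain.

```python
# Hypothetical illustration: hallucination rate vs. usefulness for two
# trivial answering policies on a made-up QA eval.

def score(policy, questions):
    """Return (hallucination_rate, usefulness) for a toy QA eval."""
    answered = wrong = 0
    for q in questions:
        answer = policy(q)
        if answer is None:          # the model abstained ("I don't know")
            continue
        answered += 1
        if answer != q["truth"]:    # wrong or stale source -> hallucination
            wrong += 1
    hallucination_rate = wrong / answered if answered else 0.0
    usefulness = answered / len(questions)
    return hallucination_rate, usefulness

# Hypothetical eval set: one question's available "source" is out of date.
questions = [
    {"q": "capital of Australia", "truth": "Canberra", "source": "Canberra"},
    {"q": "tallest mountain",     "truth": "Everest",  "source": "Everest"},
    {"q": "is Pluto a planet?",   "truth": "no",       "source": "yes"},
]

always_abstain = lambda q: None         # never hallucinates, never helps
trust_sources  = lambda q: q["source"]  # helpful, but inherits bad data

print(score(always_abstain, questions))  # (0.0, 0.0)
print(score(trust_sources, questions))   # (~0.33, 1.0)
```

Under this framing, driving hallucinations to zero by abstaining more just trades one axis for the other; it doesn't fix the bad or stale data that causes the wrong answers in the first place.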