
1244 points adrianh | 1 comment
1. adamgordonbell No.44491570
We (others at the company, not me) hit this problem, though not with ChatGPT but with our own AI chatbot doing RAG on our docs. It occasionally hallucinated a flag that didn't exist. We treated that as product feedback: maybe that exact flag wasn't needed, but something was missing, and the LLM hallucinated what it saw as an intuitive option.
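A minimal sketch of the kind of check this implies (all flag names and the helper are hypothetical, not the company's actual pipeline): extract flag-like tokens from the chatbot's answer, diff them against the documented flags, and log anything unknown as a candidate feature request rather than just a RAG bug.

    import re

    # Documented flags for the CLI (hypothetical examples)
    KNOWN_FLAGS = {"--verbose", "--output", "--config"}

    FLAG_PATTERN = re.compile(r"--[a-z][a-z0-9-]*")

    def find_hallucinated_flags(answer: str) -> set[str]:
        """Return flags the chatbot mentioned that aren't in the docs."""
        mentioned = set(FLAG_PATTERN.findall(answer))
        return mentioned - KNOWN_FLAGS

    # A hit means the model "expected" this flag to exist --
    # which may be a signal about what users expect too.
    answer = "Run the tool with --dry-run to preview changes."
    for flag in sorted(find_hallucinated_flags(answer)):
        print(f"Possible missing feature: {flag}")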