Can you actually make the LLM more reliable, though?
As far as I know, hallucinations are inherent to LLMs and will never be completely eliminated. If I book a flight, I want 100.0% reliability, not 99% (and we are still far from even that today).
People have to take LLMs for what they are: good bullshitters, great at translating text or rephrasing words, but not designed to think or to replace a secretary. They're merely a secretarial tool.