I'm trying to figure out how to safely and effectively add hallucination-free LLM generation to medical care at my company: https://www.wyndly.com/
I've been playing with RAG grounded in my co-founder's medical expertise.
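For anyone curious what I mean by that, here's a minimal sketch of the grounding idea: retrieve the most relevant snippets from a vetted corpus, then force the model to answer only from them. The term-overlap scoring below is a toy stand-in for a real embedding/vector store, and the allergy snippets are made-up placeholders, not our actual content.

```python
import re
from collections import Counter

def tokenize(text):
    return re.findall(r"[a-z]+", text.lower())

def retrieve(query, docs, k=1):
    """Rank docs by simple term overlap with the query.

    A real system would use embeddings + a vector store; this toy
    scorer just counts shared word occurrences.
    """
    q = Counter(tokenize(query))
    scored = sorted(
        docs,
        key=lambda d: sum((Counter(tokenize(d)) & q).values()),
        reverse=True,
    )
    return scored[:k]

def build_prompt(query, docs):
    """Build a prompt that restricts the model to the retrieved sources."""
    context = "\n".join(f"- {d}" for d in retrieve(query, docs))
    return (
        "Answer ONLY from the sources below. "
        "If they don't cover the question, say you don't know.\n"
        f"Sources:\n{context}\n\nQuestion: {query}"
    )

# Hypothetical snippets standing in for physician-reviewed content.
corpus = [
    "Sublingual immunotherapy exposes patients to small allergen doses to build tolerance.",
    "Antihistamines relieve allergy symptoms but do not change the underlying allergy.",
]

prompt = build_prompt("How does sublingual immunotherapy work?", corpus)
```

The "answer only from the sources, otherwise say you don't know" instruction is the part doing the anti-hallucination work; the retrieval step just decides which vetted text the model is allowed to lean on.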