
323 points by steerlabs | 2 comments
jqpabc123:
We are trying to fix probability with more probability. That is a losing game.

Thanks for pointing out the elephant in the room with LLMs.

The basic design is non-deterministic. Trying to extract "facts" or "truth" or "accuracy" is an exercise in futility.

steerlabs:
Exactly. We treat them like databases, but they are hallucination machines.

My thesis isn't that we can stop the hallucination (non-determinism), but that we can bound it.

If we wrap the generation in hard assertions (e.g., assert response.price > 0), we turn 'probability' into 'manageable software engineering.' The generation remains probabilistic, but the acceptance criterion becomes binary and deterministic.
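
To make that concrete, here is a minimal sketch of the retry-until-valid pattern, assuming a JSON response with a price field; generate_quote and validated_quote are hypothetical stand-ins for a real LLM client and schema, not code from any particular library:

    import json

    def generate_quote(prompt: str) -> str:
        """Hypothetical stand-in for an LLM call; swap in a real client."""
        return '{"price": 19.99}'  # canned output so the sketch runs

    def validated_quote(prompt: str, max_attempts: int = 3) -> dict:
        """Retry the probabilistic step until deterministic checks pass."""
        for _ in range(max_attempts):
            raw = generate_quote(prompt)
            try:
                quote = json.loads(raw)
                # Hard assertions: binary, deterministic acceptance criteria.
                assert isinstance(quote.get("price"), (int, float)), "price missing"
                assert quote["price"] > 0, "price must be positive"
                return quote  # accepted
            except (json.JSONDecodeError, AssertionError):
                continue  # reject this sample and regenerate
        raise ValueError(f"no output passed validation in {max_attempts} attempts")

The acceptance logic never touches probabilities: a sample either satisfies every assertion or it is discarded and regenerated, which is what makes failures observable and bounded.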

scotty79:
> We treat them like databases, but they are hallucination machines.

Which is kind of crazy because we don't even treat people as databases. Or at least we shouldn't.

Maybe it's one of those things that will disappear from culture one funeral at a time.

hrimfaxi:
We demand more reliability from our creations than from each other.