xnx ◴[] No.44527256[source]
> It’s worth noting that LLMs are non-deterministic,

This is probably better phrased as "LLMs may not provide consistent answers due to changing data and built-in randomness."
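A minimal sketch of the "built-in randomness" part, using made-up logits for three tokens: temperature=0 collapses sampling to argmax, so the same input always yields the same token, while any positive temperature samples the softmax and is only repeatable if you pin the RNG seed.

    import numpy as np

    def pick_token(logits, temperature, rng):
        # Toy decoder step: temperature=0 -> argmax, otherwise sample the softmax.
        if temperature == 0:
            return int(np.argmax(logits))               # greedy: same input, same token
        scaled = logits / temperature
        probs = np.exp(scaled - scaled.max())
        probs /= probs.sum()
        return int(rng.choice(len(logits), p=probs))    # stochastic: depends on RNG state

    logits = np.array([2.0, 1.9, 0.5])                  # hypothetical scores for three tokens

    print(pick_token(logits, 0.0, np.random.default_rng()))    # always 0
    print(pick_token(logits, 1.0, np.random.default_rng(42)))  # fixed seed: reproducible
    print(pick_token(logits, 1.0, np.random.default_rng()))    # unseeded: varies run to run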

Barring rare(?) GPU race conditions, LLMs produce the same output given the same inputs.
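One concrete mechanism behind that caveat: floating-point addition is not associative, so a GPU reduction that happens to run in a different order (or with a different batch composition) can produce slightly different logits, occasionally flipping a near-tie argmax. A quick sketch with made-up float32 values:

    import numpy as np

    # Floating-point addition is not associative, so summation order matters.
    a, b, c = np.float32(1e8), np.float32(-1e8), np.float32(0.001)
    print((a + b) + c)   # 0.001
    print(a + (b + c))   # 0.0 -- c is swallowed by the large magnitude of b

    # Same numbers, different reduction order: the totals usually disagree slightly.
    rng = np.random.default_rng(0)
    x = rng.standard_normal(1_000_000).astype(np.float32)
    print(np.add.reduce(x), np.add.reduce(rng.permutation(x)))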

replies(7): >>44527264 #>>44527395 #>>44527458 #>>44528870 #>>44530104 #>>44533038 #>>44536027 #
simonw ◴[] No.44527395[source]
I don't think those race conditions are rare. None of the big hosted LLMs provide a temperature=0 plus fixed seed feature which they guarantee won't return different results, despite clear demand for that from developers.
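The closest thing on offer today is OpenAI's seed parameter plus the system_fingerprint field, which the docs describe as best-effort reproducibility rather than a guarantee. A sketch assuming the v1 Python SDK, an example model name, and a throwaway prompt:

    from openai import OpenAI

    client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

    resp = client.chat.completions.create(
        model="gpt-4o-mini",                   # example model; swap in whatever you use
        messages=[{"role": "user", "content": "Name one prime number."}],
        temperature=0,                         # as close to greedy as the API exposes
        seed=1234,                             # documented as best-effort, not guaranteed
    )

    print(resp.choices[0].message.content)
    print(resp.system_fingerprint)  # if this changes between calls, outputs may differ too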
replies(3): >>44527634 #>>44529574 #>>44529823 #
xnx ◴[] No.44527634[source]
Fair. I dislike "non-deterministic" as a blanket descriptor for all LLMs, since it implies some kind of magic or quantum effect.
replies(4): >>44527956 #>>44528597 #>>44528690 #>>44529070 #
EdiX ◴[] No.44529070[source]
Interesting, but in general it does not imply that. For example: https://en.wikipedia.org/wiki/Nondeterministic_finite_automa...
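"Nondeterministic" there is a formal term of art: the machine has a set of possible next states rather than exactly one, and acceptance just means some path works. No randomness or magic is involved, and it can be simulated by perfectly ordinary deterministic code. A toy sketch with a made-up NFA over {0, 1} that accepts strings ending in "01":

    # delta maps (state, symbol) -> set of possible next states; the set-valued
    # transition is all "nondeterministic" means here.
    delta = {
        ("q0", "0"): {"q0", "q1"},
        ("q0", "1"): {"q0"},
        ("q1", "1"): {"q2"},
    }
    start, accepting = {"q0"}, {"q2"}

    def accepts(word):
        states = set(start)
        for symbol in word:
            states = set().union(*(delta.get((s, symbol), set()) for s in states))
        return bool(states & accepting)

    print(accepts("1101"))  # True
    print(accepts("110"))   # False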