
724 points by simonw | 4 comments
xnx
> It’s worth noting that LLMs are non-deterministic,

This is probably better phrased as "LLMs may not provide consistent answers due to changing data and built-in randomness."

Barring rare(?) GPU race conditions, LLMs produce the same output given the same inputs.

troupo
> Barring rare(?) GPU race conditions, LLMs produce the same output given the same inputs.

Are these LLMs in the room with us?

Not a single LLM available as a SaaS is deterministic.

As for other models: I've only run ollama locally, and it, too, gave different answers to the same question five minutes apart.

Edit/update: not a single LLM available as a SaaS produces deterministic output, especially when used from a UI. Pointing out that you could probably run a tightly controlled model in a tightly controlled environment to achieve deterministic output is entirely irrelevant when describing the output of Grok in situations where the user has no control over it.

eightysixfour
The models themselves are mathematically deterministic. We add randomness during the sampling phase, which you can turn off when running the models locally.
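
A minimal sketch of that sampling step (made-up logits, not a real model): at temperature 0 it collapses to an argmax, so repeated calls always agree; at any higher temperature it draws from the softmax distribution, so they may not.

    import math, random

    def sample(logits, temperature):
        # The forward pass would produce these logits deterministically;
        # this function is the only place randomness enters.
        if temperature == 0:
            # Greedy decoding: always pick the highest-scoring token.
            return max(range(len(logits)), key=lambda i: logits[i])
        scaled = [l / temperature for l in logits]
        m = max(scaled)  # subtract the max for numerical stability
        weights = [math.exp(s - m) for s in scaled]
        # random.choices accepts unnormalized weights.
        return random.choices(range(len(logits)), weights=weights)[0]

    logits = [2.0, 1.0, 0.5]      # made-up scores for a 3-token vocabulary
    print(sample(logits, 0))      # always 0: deterministic
    print(sample(logits, 1.0))    # varies from run to run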

The SaaS APIs are sometimes nondeterministic due to caching strategies and load balancing between experts on MoE models. However, if you took that model and executed it in a single-user environment, it could also run deterministically.
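
For the local case, a sketch of what turning it off can look like against an ollama server (assuming it's running on the default port; the model name and seed are illustrative, and floating-point/kernel quirks can still leak in):

    import json, urllib.request

    def generate(prompt):
        payload = {
            "model": "llama3",      # illustrative; any model you've pulled
            "prompt": prompt,
            "stream": False,
            "options": {
                "temperature": 0,   # disable sampling randomness
                "seed": 42,         # pin any remaining RNG
            },
        }
        req = urllib.request.Request(
            "http://localhost:11434/api/generate",
            data=json.dumps(payload).encode(),
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            return json.load(resp)["response"]

    # Repeated calls should now return the same text.
    print(generate("Why is the sky blue?") == generate("Why is the sky blue?"))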

troupo
> However, if you took that model and executed it in single user environment,

Again, are those environments in the room with us?

In the context of the article, is the model executed in such an environment? Do we even know anything about the environment, the randomness, the sampling, or anything in between, or have any control over them (see e.g. https://news.ycombinator.com/item?id=44528930)?

mathiaspoint
It's very poor communication. They absolutely do not have to be non-deterministic.
troupo
The output of all these systems, as used by people through the UI rather than the API, is non-deterministic.
troupo
I would also assume that in the vast majority of cases people don't set the temperature to zero, even with API calls.
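
For what it's worth, setting it is only a couple of parameters, e.g. with OpenAI's Python client (the model name is illustrative; even then the docs only promise best-effort determinism, and system_fingerprint exists to tell you when the serving stack changed underneath you):

    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model name
        messages=[{"role": "user", "content": "Why is the sky blue?"}],
        temperature=0,        # minimize sampling randomness
        seed=42,              # request reproducible sampling (best-effort)
    )
    print(resp.choices[0].message.content)
    print(resp.system_fingerprint)  # changes when the backend changes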

And even if you do set it to zero, you never know what changes to the layers and layers of wrappers and system prompts you'll run into on any given day, resulting in "on this day we crash for a certain input, and on other days we don't": https://www.techdirt.com/2024/12/03/the-curious-case-of-chat...