
132 points harel | 1 comment
flykespice No.45397008
I'm a real dummy when it comes to LLMs, but wouldn't a confined model (no internet access) eventually just loop, repeating itself on each consecutive run? Or is there enough entropy for it to produce endless creativity?
replies(3): >>45397107 >>45397116 >>45397913
1. parsimo2010 No.45397116
Loops can happen, but you can turn up the temperature setting.

High temperature settings basically make an LLM choose tokens that aren't always the highest-probability ones, so it has a chance of breaking out of a loop and is less likely to fall into one in the first place. The downside is that most models become less coherent, but that's probably not an issue for an art project.
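To make the mechanism concrete, here's a minimal sketch of temperature sampling over a model's next-token logits. The function name and the logit values are made up for illustration; real inference stacks apply the same idea to vocabularies of tens of thousands of tokens.

```python
import math
import random

def sample_with_temperature(logits, temperature=1.0, rng=random):
    """Pick one token index from raw logits, scaled by temperature.

    T < 1 sharpens the distribution (more deterministic, loop-prone);
    T > 1 flattens it (more variety, less coherent).
    """
    scaled = [l / temperature for l in logits]
    # Softmax with max subtracted for numerical stability.
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Draw one index according to the resulting probabilities.
    r = rng.random()
    cumulative = 0.0
    for i, p in enumerate(probs):
        cumulative += p
        if r < cumulative:
            return i
    return len(probs) - 1
```

At low temperature the top logit wins almost every draw, so a model stuck in a repetitive pattern keeps reinforcing it; at high temperature lower-ranked tokens get picked often enough to knock it out of the loop.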