ryukoposting:
A footnote in the GPT-5 announcement: you can now give OpenAI's API a context-free grammar that the LLM's output must follow. One way of thinking about this feature is as a user-defined world model. You could tell the model "the sky is" => "blue", for example.
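Concretely, it looks something like the sketch below, if I'm reading the announcement right. GPT-5's custom tools take a grammar in Lark syntax; the exact field names here are my reading of the docs, not gospel.

    # Hedged sketch: constrain GPT-5's output with a Lark CFG attached
    # to a custom tool. Field names follow my reading of the announcement;
    # check the current API reference before relying on them.
    from openai import OpenAI

    client = OpenAI()
    resp = client.responses.create(
        model="gpt-5",
        input="What color is the sky?",
        tools=[{
            "type": "custom",
            "name": "sky_answer",
            "description": "Answer in the constrained form.",
            "format": {
                "type": "grammar",
                "syntax": "lark",
                # Encodes the "the sky is" => "blue" rule from above.
                "definition": 'start: "the sky is " COLOR\nCOLOR: "blue"',
            },
        }],
    )
    print(resp.output)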

Obviously you can't actually use this feature as a true world model. There's just too much stuff you have to codify, and basing such a system on tokens is inherently limiting.

The basic principle sounds like what we're looking for, though: a strict automaton or rule set that steers the model's output reliably and provably. Perhaps a similar kind of thing that operates on neurons rather than tokens? Hmm.
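The token-level version of that strict automaton is at least easy to sketch: at every decoding step, mask the logits of any token the rule set forbids, then choose from what's left. Toy example, with a made-up vocabulary and rule:

    # Toy constrained decoding: at each step, mask the logits of tokens
    # the rule set forbids, then choose from what survives. Vocabulary
    # and rule are made up for illustration.
    import math

    VOCAB = ["the", "sky", "is", "blue", "green", "."]

    def allowed_next(prefix):
        # Hypothetical rule: after "the sky is", only "blue" may follow.
        if prefix[-3:] == ["the", "sky", "is"]:
            return {"blue"}
        return set(VOCAB)

    def constrained_step(logits, prefix):
        allowed = allowed_next(prefix)
        masked = [x if tok in allowed else -math.inf
                  for tok, x in zip(VOCAB, logits)]
        # Greedy pick keeps the sketch short; real decoders renormalize
        # the surviving probabilities and sample.
        return VOCAB[max(range(len(VOCAB)), key=masked.__getitem__)]

    print(constrained_step([0.1, 0.2, 0.3, 0.4, 2.5, 0.0],
                           ["the", "sky", "is"]))  # prints "blue"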

ijk:
Oh, OpenAI finally added it? Structured generation has been available in things like llama.cpp and Instructor for a while; I was wondering when they'd get around to it.
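llama.cpp's version is GBNF grammars. A minimal sketch through the llama-cpp-python bindings, with the model path and grammar as placeholders:

    # Minimal sketch of llama.cpp's GBNF-constrained sampling via the
    # llama-cpp-python bindings. Model path and grammar are placeholders.
    from llama_cpp import Llama, LlamaGrammar

    grammar = LlamaGrammar.from_string(r'''
    root ::= verb " " target
    verb ::= "look" | "move" | "take"
    target ::= [a-z]+
    ''')

    llm = Llama(model_path="model.gguf")
    out = llm("You see a locked door. What do you do?",
              grammar=grammar, max_tokens=16)
    print(out["choices"][0]["text"])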

In the examples I've seen, it's not something you can define an entire world model in, but you can certainly constrain the immediate action space so the model does something sensible.
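For instance, with Instructor you can pin the action space to a Pydantic schema; the schema and model name below are just illustrative.

    # Illustrative: force the model to choose among three verbs by
    # validating its output against a Pydantic schema via Instructor.
    from typing import Literal

    import instructor
    from openai import OpenAI
    from pydantic import BaseModel

    class Action(BaseModel):
        verb: Literal["look", "move", "take"]
        target: str

    client = instructor.from_openai(OpenAI())
    action = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        response_model=Action,
        messages=[{"role": "user", "content": "You see a locked door."}],
    )
    print(action.verb, action.target)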