That's quite fun; I wish it had more information about which model took that path and the inference/sampling parameters.
Thanks! Quick overview: Paths are deterministic, not LLM-generated. I use OpenAI text-embedding-3-large to build a word graph with K-nearest neighbors, then BFS finds the shortest path. No sampling involved. The explanations shown in-game are generated afterward by GPT-5 to explain the semantic jumps. Planning to write up the full architecture in a blog post - will share here when it's ready.
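The pipeline above can be sketched in a few lines. This is a toy illustration, not the game's actual code: the 2-D vectors, the word list, `k=2`, cosine similarity, and the undirected-edge choice are all assumptions for the example; the real system uses text-embedding-3-large vectors over a much larger vocabulary.

```python
from collections import deque

# Toy stand-ins for real embedding vectors (illustrative only;
# the game uses OpenAI text-embedding-3-large, which is high-dimensional).
embeddings = {
    "cat":   (0.9, 0.1),
    "dog":   (0.8, 0.2),
    "wolf":  (0.7, 0.4),
    "moon":  (0.1, 0.9),
    "night": (0.2, 0.8),
}

def cosine(a, b):
    # Cosine similarity between two vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(y * y for y in b) ** 0.5
    return dot / (na * nb)

def knn_graph(emb, k=2):
    # Link each word to its k nearest neighbors by cosine similarity.
    # Edges are made undirected here (an assumption) so BFS can
    # traverse them in either direction.
    graph = {w: [] for w in emb}
    for w, v in emb.items():
        sims = sorted(
            ((cosine(v, u), other) for other, u in emb.items() if other != w),
            reverse=True,
        )
        for _, other in sims[:k]:
            if other not in graph[w]:
                graph[w].append(other)
            if w not in graph[other]:
                graph[other].append(w)
    return graph

def bfs_path(graph, start, goal):
    # Standard BFS: first time we pop the goal, the path is shortest.
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in graph[path[-1]]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # no path in this graph

graph = knn_graph(embeddings, k=2)
print(bfs_path(graph, "cat", "moon"))  # -> ['cat', 'wolf', 'moon'] with these toy vectors
```

Because the graph and the search are both deterministic, the same word pair always yields the same path; only the post-hoc explanations come from an LLM.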