
222 points by futurisold | 1 comment
krackers No.44401923
One question, OP: how does cost work for this? Do you pay the LLM inference cost (quite literally, if using an external API) every time you run a line that involves natural-language computation? E.g. what happens if you call a "symbolic" function in a loop?
replies(2): >>44403284 >>44404403
1. demarq No.44404403
This will need a cache of some sort
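A minimal sketch of the kind of cache demarq means, assuming calls are memoized on the exact prompt string; llm_complete and the other names here are hypothetical stand-ins, not the project's actual API:

    import functools

    # Hypothetical backend call, stubbed so the sketch runs without an API key.
    # In practice this is where the paid (or local) inference would happen.
    def llm_complete(prompt: str) -> str:
        print(f"[inference] {prompt!r}")  # stands in for one API round-trip
        return f"answer to: {prompt}"

    # Memoize on the prompt, so identical calls in a loop pay for inference only once.
    @functools.lru_cache(maxsize=4096)
    def cached_complete(prompt: str) -> str:
        return llm_complete(prompt)

    if __name__ == "__main__":
        for _ in range(10):
            cached_complete("classify: 'the movie was great'")  # 1 API call, 9 cache hits

An exact-match key only helps when the loop repeats the same prompt; anything that varies per iteration still triggers a call, which is why some setups layer a semantic (embedding-based) cache on top.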