
69 points by zora_goron | 1 comment
1. bulubulu No.41914727
Nice, neat tool! I wonder which LLM is running on the backend, and is there any way to run it locally/self-hosted?