242 points | simonebrunozzi | 1 comment
sowbug ◴[] No.46236648[source]
Something like this would be perfect for a local LLM assistant.
1. tbeseda ◴[] No.46237124[source]
Agreed. I'm working on a small GUI that just appends to a local .ndjson file. A user just posts with a text box into a feed. Like a one person chat or tweeting into the void. And a local LLM picks apart metadata, storing just enough to index where answers to future questions will be. Then you can use slash commands to get at the analysis like "/tasks last month" or "/summarize work today" etc.