
1122 points felixrieseberg | 2 comments
1. alkh No.43906263
Great job! Having Ollama support would be useful as well [1]!

[1] https://github.com/ollama/ollama
replies(1): >>43911294
2. totetsu No.43911294
I thought of this immediately too. I already have Ollama set up to run LLM tasks locally; I don't want to duplicate that, but it would be fun to try this front end.
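
For what it's worth, pointing a front end at an existing Ollama install is mostly just HTTP: by default Ollama serves a local API on port 11434, and /api/generate is its documented completion route. A minimal TypeScript sketch (the model name is only an example; assumes a stock local install):

  // Minimal sketch, assuming a stock Ollama install listening on its
  // default port (11434). "llama3" is just an example model name; use
  // whatever you've already pulled with `ollama pull`.
  async function generate(prompt: string): Promise<string> {
    const res = await fetch("http://localhost:11434/api/generate", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      // stream: false returns a single JSON object instead of a stream
      body: JSON.stringify({ model: "llama3", prompt, stream: false }),
    });
    if (!res.ok) throw new Error(`Ollama request failed: ${res.status}`);
    const data = await res.json();
    return data.response; // the generated text
  }

  generate("Hello from a front end reusing my local Ollama")
    .then(console.log)
    .catch(console.error);

So a front end that already speaks an OpenAI-style or custom local backend shouldn't need much more than a configurable base URL to reuse the models I've already downloaded.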