This is horrifying. Please, go on. :D
How do I give it a base URL for API calls so I can point it at my ollama server?
replies(1):
llm install llm-ollama
and then you can use whatever model you like, anything llm has installed. See https://llm.datasette.io/en/stable/plugins/installing-plugin... for plugin install info.

Here is a sample session. You can't see it here, but it is very slow on my CPU-only, non-Apple machine (each response took about 30 seconds) :)
>>> from gremllm import Gremllm
>>> counter = Gremllm("counter", model="gemma3n:e2b")
>>> counter.value = 5
>>> counter.increment()
>>> counter.value
None
>>> counter.value
None
>>> counter.value
None
>>> counter.what_is_your_total()
6
6
... also I don't know why it kept saying my value is None :). The "6" appears twice because one must have come from a print inside the call and the other is the return value echoed by the REPL.
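(You can reproduce that print-plus-return doubling in a plain REPL, no gremllm involved; the function name here is just made up for illustration:)

>>> def what_is_your_total():
...     print(6)   # printed by the call itself
...     return 6   # echoed a second time by the REPL
...
>>> what_is_your_total()
6
6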
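And to answer the base URL question directly: I believe llm-ollama picks up the standard OLLAMA_HOST environment variable, so pointing it at a remote server should look roughly like this (the hostname below is made up, and if the plugin doesn't honor that variable, check its README):

$ llm install llm-ollama
$ export OLLAMA_HOST=http://my-ollama-box:11434   # hypothetical address of your Ollama server
$ python
>>> from gremllm import Gremllm
>>> counter = Gremllm("counter", model="gemma3n:e2b")   # any model your Ollama server has pulled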