
Gremllm

(github.com)
121 points by andreabergia | 1 comment
taneq No.44470265
This is horrifying. Please, go on. :D

How do I give it a base URL for API calls so I can point it at my ollama server?

awwaiid No.44474986
It uses the llm library, so plugin and model management go through that. Say you've already installed Ollama and pulled the `gemma3n:e2b` model. Then use the llm CLI to add the Ollama plugin:

  llm install llm-ollama
and then you can use whatever model you like, i.e. anything llm has installed. See https://llm.datasette.io/en/stable/plugins/installing-plugin... for plugin install info.
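For completeness, the whole setup might look something like this (a sketch assuming the standard `ollama` and `llm` CLIs are on your PATH; the smoke-test prompt is just an illustration, not part of gremllm):

```shell
# fetch the model locally with ollama
ollama pull gemma3n:e2b

# register the ollama plugin with llm
llm install llm-ollama

# the ollama models should now appear in llm's model list
llm models

# quick smoke test through llm before wiring it into gremllm
llm -m gemma3n:e2b "say hi"
```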

Here is a sample session. You can't see it, but it is very slow on my CPU-only, non-Apple machine (each response took about 30 seconds) :)

  >>> from gremllm import Gremllm
  >>> counter = Gremllm("counter", model="gemma3n:e2b")
  >>> counter.value = 5
  >>> counter.increment()
  >>> counter.value
  None
  >>> counter.value
  None
  >>> counter.value
  None
  >>> counter.what_is_your_total() 
  6
  6
... also, I don't know why it kept saying the value was None :). The "6" appears twice because one was presumably a print and the other is the return value echoed by the REPL.
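The session above works because Python lets a class intercept attribute access, so undefined methods like `increment()` can be invented at call time. Here is a toy sketch of that mechanism, with no LLM involved (the class and its behavior are hypothetical illustrations, not gremllm's actual implementation, which asks a model what each access should do):

```python
class DynamicObject:
    """Toy object whose attributes and methods are resolved dynamically."""

    def __init__(self, name):
        # Use object.__setattr__ to avoid triggering our own __setattr__.
        object.__setattr__(self, "_state", {})
        object.__setattr__(self, "_name", name)

    def __setattr__(self, key, value):
        # All assignments (e.g. counter.value = 5) land in the state dict.
        self._state[key] = value

    def __getattr__(self, key):
        # Called only when normal attribute lookup fails.
        if key in self._state:
            return self._state[key]

        def method(*args, **kwargs):
            # A real implementation would ask an LLM to decide what this
            # undefined method means; this toy just bumps "value" if present.
            if "value" in self._state:
                self._state["value"] += 1
                return self._state["value"]
            return None

        return method


counter = DynamicObject("counter")
counter.value = 5
counter.increment()     # never defined, synthesized on the fly
print(counter.value)    # 6
```

With an LLM in the loop, the body of `method` is decided by the model, which is also why results can be slow and occasionally inconsistent (like the `None` values above).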