
An LLM is a lossy encyclopedia

(simonwillison.net)
509 points by tosh | 1 comment

(the referenced HN thread starts at https://news.ycombinator.com/item?id=45060519)
1. entropie No.45102882
I have an Nvidia Jetson Orin Nano running llama.cpp/Ollama. Gemma3:4b / Gemma3-4b-it is awesome, reasonably fast (even with vision; I think it's around 15 t/s), and all that on a board the size of a Raspberry Pi.
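
Roughly what that setup looks like, as a quick sketch (assuming Ollama is already installed on the Jetson; the gemma3:4b tag and the --verbose flag for timing stats are the parts I'd double-check):

    # pull the 4B Gemma 3 model, then run it with timing stats
    ollama pull gemma3:4b
    ollama run gemma3:4b --verbose "Explain what a lossy encyclopedia is"
    # for vision, Ollama picks up image paths included in the prompt
    ollama run gemma3:4b "Describe this image: ./photo.jpg"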

Simon's llm CLI tool is on every one of my machines and I use it daily.
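
For anyone who hasn't tried it, a minimal sketch of pointing llm at a local Ollama model (the llm-ollama plugin exists, but the exact model id it exposes for Gemma here is an assumption about this setup):

    pip install llm
    llm install llm-ollama    # plugin that exposes local Ollama models to llm
    llm models                # the local Gemma should now show up in the list
    llm -m gemma3:4b "Summarize the lossy encyclopedia analogy"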