
350 points | kashifr | 3 comments
danielhanchen (No.44504715)
I fixed some chat template issues for llama.cpp and other inference engines! To run it, do:

./llama.cpp/llama-cli -hf unsloth/SmolLM3-3B-GGUF:Q4_K_XL --jinja -ngl 99
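For readers unfamiliar with the flags, here is an annotated version of the same invocation (a sketch assuming a local llama.cpp build; the binary path depends on where you built it):

```shell
# -hf      : fetch this quantized GGUF from Hugging Face (cached locally after the first run)
# --jinja  : apply the model's embedded Jinja chat template when formatting the conversation
# -ngl 99  : offload up to 99 transformer layers to the GPU (effectively all of them for a 3B model)
./llama.cpp/llama-cli -hf unsloth/SmolLM3-3B-GGUF:Q4_K_XL --jinja -ngl 99
```

Dropping `-ngl` (or setting it to 0) runs fully on CPU, which is slower but works without a GPU build.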

replies(2): >>44505656, >>44507813
1. segmondy (No.44505656)
doing the good work, thanks daniel!
replies(1): >>44505857
2. danielhanchen (No.44505857)
Thank you!
replies(1): >>44509678
3. v5v3 (No.44509678)
Thanks