
345 points | kashifr | 2 comments
danielhanchen:
I fixed some chat template issues for llama.cpp and other inference engines! To run the model, do:

./llama.cpp/llama-cli -hf unsloth/SmolLM3-3B-GGUF:Q4_K_XL --jinja -ngl 99
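For context, here is the same invocation with each flag annotated. The flag meanings below are my reading of the llama.cpp CLI, not something stated in the comment itself:

    # -hf       pull the GGUF model straight from the Hugging Face repo,
    #           here the Q4_K_XL quantization of unsloth/SmolLM3-3B-GGUF
    # --jinja   use the model's embedded Jinja chat template
    #           (where the template fixes mentioned above would apply)
    # -ngl 99   offload up to 99 layers to the GPU; lower this (or omit it)
    #           to run partly or fully on CPU
    ./llama.cpp/llama-cli -hf unsloth/SmolLM3-3B-GGUF:Q4_K_XL --jinja -ngl 99

This requires a llama.cpp build recent enough to have the `-hf` download shorthand and `--jinja` support.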

segmondy:
Doing the good work, thanks Daniel!
danielhanchen:
Thank you!
v5v3:
Thanks