
168 points by selvan | 1 comment
1. om8 No.44462998
I have a demo that runs llama3-{1,3,8}B in the browser on the CPU. It could be integrated with this project in the future to make it fully local:

https://galqiwi.github.io/aqlm-rs
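For context, here is a minimal sketch of how an in-browser, CPU-only inference engine like this might be wired into a page. The `AqlmModel` class, its methods, and the model path are illustrative assumptions, not the actual aqlm-rs API; the point is only the shape of the integration (load a WASM-backed model once, then stream tokens fully client-side).

```typescript
// Hypothetical sketch -- the real aqlm-rs API is not shown in the thread.
// Assumes a WASM build exposing an `AqlmModel` wrapper with `load` and
// `generate`; these names and the model URL are placeholders.

interface GenerateOptions {
  maxTokens: number;
  onToken: (token: string) => void; // streaming callback for each decoded token
}

// Placeholder declaration for the assumed WASM-backed model wrapper.
declare class AqlmModel {
  static load(modelUrl: string): Promise<AqlmModel>;
  generate(prompt: string, opts: GenerateOptions): Promise<string>;
}

async function runLocalDemo(): Promise<void> {
  // Weights are fetched once (and can be cached by the browser);
  // after that, inference runs entirely client-side on the CPU.
  const model = await AqlmModel.load("/models/llama3-1b-aqlm.bin"); // hypothetical path
  await model.generate("Explain AQLM quantization in one sentence.", {
    maxTokens: 128,
    onToken: (t) => document.body.append(t), // stream output into the page
  });
}

runLocalDemo().catch(console.error);
```

In a real integration you would likely run the model inside a Web Worker so token generation does not block the UI thread; the streaming callback pattern above carries over unchanged.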