
168 points selvan | 2 comments
meander_water:
When I saw the title, I thought this was about running models in the browser. IMO that's way more interesting, and you can do it with transformers.js and ONNX Runtime. You don't even need a GPU.

https://huggingface.co/spaces/webml-community/llama-3.2-webg...
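For context, here is roughly what that looks like. This is a minimal sketch with transformers.js; the package import, model ID, and options follow the v3 Hugging Face docs and are assumptions, not taken from the linked demo:

```js
// Minimal sketch: in-browser text generation with transformers.js.
// Package name and model ID are assumptions based on the v3 docs.
import { pipeline } from "@huggingface/transformers";

// device: "webgpu" requests the GPU backend; without it,
// transformers.js defaults to the WASM (CPU) backend.
const generator = await pipeline(
  "text-generation",
  "onnx-community/Llama-3.2-1B-Instruct",
  { device: "webgpu" }
);

// Plain string prompt; output is an array of { generated_text } objects.
const output = await generator("WebGPU lets browsers", {
  max_new_tokens: 32,
});
console.log(output[0].generated_text);
```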

1. salviati:
I think you _do_ need a GPU, but an integrated one works; there's no need for a discrete one.

I can't run it on Linux since WebGPU is not working for me...
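A quick way to see whether the browser exposes WebGPU at all (a minimal sketch using the standard navigator.gpu entry points):

```js
// Feature check for WebGPU support in the current browser.
async function checkWebGPU() {
  if (!navigator.gpu) {
    // navigator.gpu is absent where WebGPU is unsupported or disabled.
    console.log("WebGPU is not exposed by this browser.");
    return;
  }
  const adapter = await navigator.gpu.requestAdapter();
  console.log(adapter ? "WebGPU adapter found." : "No suitable GPU adapter.");
}
checkWebGPU();
```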

2. pjmlp:
It is not yet available in any browser on desktop Linux; the only exceptions are Android/Linux and ChromeOS.