
The era of open voice assistants

(www.home-assistant.io)
879 points by _Microft | 1 comment
Havoc ◴[] No.42469547[source]
Had to laugh a bit at the caveat about powerful hardware. Was bracing myself for GPU and then it says N100 lol
replies(1): >>42469595 #
moooo99 ◴[] No.42469595[source]
I mean, many people are hosting their Home Assistant on a Raspberry Pi, so comparatively it is relatively powerful :D
replies(1): >>42470788 #
geerlingguy ◴[] No.42470788[source]
And the CM5 is nearly equivalent for the small models you can run. Latency is nearly the same, though you can get a little more fancy if you have an N100 system with more RAM and "unlocked" thermals (many N100 systems cap the power draw because they don't have the thermal capacity to run the chip at max turbo).
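
If you want to check whether a given box ships with a capped power limit, here is a minimal sketch using the Linux intel_rapl powercap driver; it assumes the driver is loaded and that intel-rapl:0 is the CPU package domain, which isn't guaranteed on every system.

```python
# Read the firmware-set package power limits (PL1/PL2) via Linux powercap.
# Assumes the intel_rapl driver is loaded and intel-rapl:0 is the package.
from pathlib import Path

RAPL = Path("/sys/class/powercap/intel-rapl:0")

def read(name: str) -> str:
    return (RAPL / name).read_text().strip()

print("domain:", read("name"))  # usually "package-0"
for i in (0, 1):
    label = read(f"constraint_{i}_name")  # typically long_term / short_term
    watts = int(read(f"constraint_{i}_power_limit_uw")) / 1_000_000
    print(f"{label}: {watts:.1f} W")
```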
replies(1): >>42471342 #
moffkalast ◴[] No.42471342{3}[source]
If we're being fair, you can more like walk models, not run them :)

A 125H box may be three times the price of an N100 box, but the power draw is about the same (6 W idle, 28 W max, with turbo off anyway), and with the Arc iGPU the prompt processing is in the hundreds of tokens per second, so near-instant replies to longer queries are doable.
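
To put that in perspective, here is a back-of-envelope sketch of reply latency as prompt length over prompt-processing speed plus generation time; the throughput figures are illustrative assumptions, not benchmarks of either box.

```python
# Rough reply-latency model: time to ingest the prompt plus time to
# generate the answer. Throughput numbers are illustrative assumptions.
def reply_latency(prompt_tokens: int, reply_tokens: int,
                  pp_tok_per_s: float, gen_tok_per_s: float) -> float:
    """Seconds until the full reply has been generated."""
    return prompt_tokens / pp_tok_per_s + reply_tokens / gen_tok_per_s

# 600-token prompt, 40-token reply at 8 tok/s generation:
print(reply_latency(600, 40, pp_tok_per_s=300, gen_tok_per_s=8))  # ~7 s
print(reply_latency(600, 40, pp_tok_per_s=30, gen_tok_per_s=8))   # ~25 s
```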