
The era of open voice assistants

(www.home-assistant.io)
878 points by _Microft | 1 comment
1. gigel82 (No.42473397)
What is a good GPU to put in a home server that can run the TTS / STT and the local LLM required to make this shine?

A 3090 is too expensive and power-hungry. Maybe a 3060 12GB? Is there anything in the "workstation" lineup that is more efficient, especially since I don't need the video outs?
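
As a rough back-of-the-envelope sketch (the figures below are assumptions, not measurements), a 12 GB card in the 3060 class can plausibly hold a ~4-bit-quantized 8B LLM alongside a Whisper-class STT model and a small TTS model; the helper names and per-model estimates here are illustrative only:

```python
# Rough VRAM budgeting sketch for a local voice pipeline (LLM + STT + TTS).
# All sizes are coarse estimates: quantized LLM weights plus a fixed allowance
# for KV cache/runtime, a Whisper-class STT model, and a small TTS model.

def llm_vram_gb(n_params_billions: float, bits_per_weight: float = 4.5,
                overhead_gb: float = 1.5) -> float:
    """Estimate VRAM (GB) for a quantized LLM: weights + KV cache/runtime overhead."""
    weights_gb = n_params_billions * 1e9 * (bits_per_weight / 8) / 1e9
    return weights_gb + overhead_gb

def pipeline_fits(card_vram_gb: float, llm_params_b: float,
                  stt_gb: float = 1.5, tts_gb: float = 0.5) -> bool:
    """Check whether LLM + STT + TTS fit together on a single card."""
    total = llm_vram_gb(llm_params_b) + stt_gb + tts_gb
    print(f"Estimated total: {total:.1f} GB of {card_vram_gb} GB")
    return total <= card_vram_gb

# Example: a 12 GB card with an 8B model at ~4-bit quantization.
pipeline_fits(card_vram_gb=12, llm_params_b=8)
```

Under these assumed numbers the estimate comes to roughly 8 GB, which would leave headroom on a 12 GB card; a larger model or longer context would change the picture, so treat this as a sizing heuristic rather than a recommendation.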