
623 points by magicalhippo | 1 comment
1. adam_arthur No.42627043
Finally!

First product that competes directly on price with Macs for local inference of large LLMs (thanks to its large unified RAM). And it likely outperforms them substantially.
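For context on why RAM is the gating factor here, a rough back-of-the-envelope (my own numbers, not from the comment): a model's weight footprint is roughly parameter count times bytes per parameter, so large models only fit in machines with lots of unified memory.

```python
def weight_footprint_gb(params_billion: float, bits_per_param: float) -> float:
    """Approximate weight memory in GB (ignores KV cache and runtime overhead)."""
    total_bytes = params_billion * 1e9 * bits_per_param / 8
    return total_bytes / 1e9

# A 70B-parameter model at common quantization levels:
for bits in (16, 8, 4):
    print(f"{bits}-bit: ~{weight_footprint_gb(70, bits):.0f} GB")
# → 16-bit: ~140 GB
# → 8-bit: ~70 GB
# → 4-bit: ~35 GB
```

Even 4-bit quantization of a 70B model needs ~35 GB just for weights, which is why high-RAM Macs (and competitors) are attractive for local inference.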

I'll definitely upgrade my home LLM server if the specs bear out.