577 points simonw | 1 comment
joelthelion No.44724227
Apart from using a Mac, what can you use for inference with reasonable performance? Is a Mac the only realistic option at the moment?
1. AlexeyBrin No.44724398
A gaming PC with an NVIDIA 4090/5090 will be more than adequate for running local models.

Where a Mac may beat the above is on the memory side: if a model requires more than the 24/32 GB of GPU memory those cards offer, you are usually better off with a Mac with 64/128 GB of RAM. On a Mac the memory is unified between CPU and GPU, so the GPU can load larger models.
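The trade-off above comes down to simple arithmetic: weight memory is roughly parameter count times bytes per weight, plus some headroom for the KV cache and activations. A minimal sketch (the 1.2x overhead factor is an assumed ballpark, not a measured figure):

```python
def approx_model_memory_gb(n_params_billions: float,
                           bits_per_weight: float,
                           overhead: float = 1.2) -> float:
    """Rough RAM/VRAM estimate for serving a model's weights.

    overhead multiplies the raw weight size to leave room for the
    KV cache and activations (1.2 is an assumption, not a benchmark).
    """
    bytes_per_weight = bits_per_weight / 8
    raw_gb = n_params_billions * 1e9 * bytes_per_weight / 1e9
    return raw_gb * overhead

# A 70B-parameter model at 4-bit quantization: ~35 GB of raw weights,
# ~42 GB with headroom -- over a 4090's 24 GB, but within reach of a
# 64 GB unified-memory Mac.
print(round(approx_model_memory_gb(70, 4), 1))
```

By the same estimate, an 8B model at 4 bits needs only ~5 GB, which is why smaller models run comfortably on a single consumer GPU.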