
160 points by redohmy | 1 comment
hypeatei No.46009445
Well, patience as a consumer might pay off in the next year or so when the music stops and hyperscalers are forced to dump their inventories.

There still isn't a clear path to profitability for any of these AI products and the capital expenditure has been enormous.

replies(4): >>46009810 #>>46009834 #>>46009854 #>>46011177 #
PaulKeeble No.46009810
It's a bit of a shame these AI GPUs don't have DisplayPort/HDMI output ports, because with that much VRAM they would make nice, cheap, and powerful gaming graphics cards.

Will just have to settle for insanely cheap second hand DDR5 and NVMe drives I guess.

replies(2): >>46010387 #>>46010886 #
sowbug No.46010387
I wouldn't mind my own offline Gemini or ChatGPT 5. But even if the hardware and model were free, I don't know how I'd afford the electricity.
replies(2): >>46011035 #>>46011409 #
jdprgm No.46011035
A single machine for personal inference on models of this size isn't going to idle at a power draw high enough for electricity to become a problem. For personal use it wouldn't be under load often, and if you somehow do keep it under heavy load, presumably it's doing something valuable enough to easily justify the electricity.
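The electricity argument above can be sketched as a quick back-of-envelope estimate. This is a minimal sketch: the idle/load wattages, the daily utilization, and the $/kWh rate are all assumed illustrative figures, not measurements of any real machine.

```python
# Back-of-envelope monthly electricity cost for a personal inference box.
# Every constant below is an assumption for illustration only.

IDLE_WATTS = 150         # assumed idle draw of a multi-GPU workstation
LOAD_WATTS = 1500        # assumed draw under full inference load
PRICE_PER_KWH = 0.15     # assumed residential rate, $/kWh
HOURS_PER_MONTH = 730    # average hours in a month

def monthly_cost(load_hours_per_day: float) -> float:
    """Estimate monthly electricity cost in dollars, given daily hours under load."""
    load_hours = load_hours_per_day * 30
    idle_hours = HOURS_PER_MONTH - load_hours
    kwh = (idle_hours * IDLE_WATTS + load_hours * LOAD_WATTS) / 1000
    return kwh * PRICE_PER_KWH

# Light personal use: a couple of hours of inference a day.
print(f"${monthly_cost(2):.2f}/month")
```

With these assumed numbers, light personal use lands in the tens of dollars per month, which is the comment's point: idle draw dominates the bill only if it is large, and heavy load implies the machine is earning its keep.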