
192 points by redohmy | 1 comment
hypeatei ◴[] No.46009445[source]
Well, patience as a consumer might pay off in the next year or so when the music stops and hyperscalers are forced to dump their inventories.

There still isn't a clear path to profitability for any of these AI products and the capital expenditure has been enormous.

replies(4): >>46009810 #>>46009834 #>>46009854 #>>46011177 #
PaulKeeble ◴[] No.46009810[source]
It's a bit of a shame these AI GPUs don't actually have DisplayPort/HDMI output ports, because with all that VRAM they would have made really good, cheap, and powerful gaming cards.

Will just have to settle for insanely cheap second-hand DDR5 and NVMe drives, I guess.

replies(2): >>46010387 #>>46010886 #
sowbug ◴[] No.46010387[source]
I wouldn't mind my own offline Gemini or ChatGPT 5. But even if the hardware and model were free, I don't know how I'd afford the electricity.
replies(2): >>46011035 #>>46011409 #
mitthrowaway2 ◴[] No.46011409[source]
If you can't afford the electricity to run the model on free hardware, you'd certainly never be able to afford the subscription to the same product as a service!

But anyway, the trick is to run it in the winter and keep your house warm.
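The electricity worry above is easy to sanity-check with back-of-the-envelope arithmetic. A minimal sketch, where the ~700 W sustained draw (roughly a datacenter-class accelerator under load) and the $0.15/kWh rate are illustrative assumptions, not figures from the thread:

```python
# Rough electricity-cost sketch for running a local-model GPU at home.
# Assumed numbers (for illustration only): ~700 W sustained draw,
# $0.15/kWh residential electricity rate, 30-day month.
WATTS = 700
PRICE_PER_KWH = 0.15

def monthly_cost(hours_per_day: float) -> float:
    """Electricity cost in dollars for a 30-day month of daily use."""
    kwh = WATTS / 1000 * hours_per_day * 30  # energy used over the month
    return kwh * PRICE_PER_KWH

# 8 hours a day: 0.7 kW * 8 h * 30 days = 168 kWh -> $25.20/month
print(f"${monthly_cost(8):.2f}")
```

Under those assumptions, heavy daily use lands in the tens of dollars a month, which is also roughly the space-heater output the winter-heating joke relies on.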