
602 points by emrah | 10 comments
1. porphyra ◴[] No.43745158[source]
It is funny that Microsoft has been peddling "AI PCs" and Apple has been peddling "made for Apple Intelligence" for a while now, when in fact usable local models are only barely starting to be a thing, and even then only on extremely high-end consumer GPUs like the 3090.
replies(4): >>43745683 #>>43746489 #>>43746510 #>>43746780 #
2. ivape ◴[] No.43745683[source]
This is why the "AI hardware cycle is hype" crowd is so wrong. We're not even close; we're basically at the ColecoVision/Atari stage of hardware here. It's going to be quite a thing when everyone gets a SNES/Genesis.
3. icedrift ◴[] No.43746489[source]
Capable local models have been usable on Macs for a while now thanks to their unified memory.
4. NorwegianDude ◴[] No.43746510[source]
A 3090 is not an extremely high-end GPU. It's a consumer GPU launched in 2020, and in both price and compute it's around mid-range for a consumer card these days.

The high end consumer card from Nvidia is the RTX 5090, and the professional version of the card is the RTX PRO 6000.

replies(2): >>43746662 #>>43746696 #
5. zapnuk ◴[] No.43746662[source]
A 3090 still costs €1800. That's not mid-range by a long shot.

The 5070 or 5070 Ti are mid-range. They cost €650/€900.

replies(2): >>43746973 #>>43747297 #
6. dragonwriter ◴[] No.43746696[source]
For model usability as a binary yes/no, pretty much the only dimension that matters is VRAM, and at 24GB the 3090 is still high end for a consumer NVidia GPU. Yes, the 5090 (and only the 5090) is above it, at 32GB, but 24GB is way ahead of the mid-range.
replies(1): >>43747062 #
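The VRAM-as-gating-dimension point can be made concrete with some back-of-envelope math. A rough sketch (the 20% overhead factor for KV cache and runtime buffers is an assumption; real footprints vary by runtime and context length):

```python
# Rough VRAM estimate: weights dominate, so bytes ≈ params × bits/8,
# plus an assumed ~20% overhead for KV cache and runtime buffers.
def fits_in_vram(params_b: float, bits_per_param: int, vram_gb: float,
                 overhead: float = 1.2) -> bool:
    weight_gb = params_b * bits_per_param / 8  # 1B params at 8-bit ≈ 1 GB
    return weight_gb * overhead <= vram_gb

# A 27B model at 4-bit quantization on a 24 GB 3090: weights ≈ 13.5 GB
print(fits_in_vram(27, 4, 24))    # True
# The same model at FP16 needs ≈ 54 GB, beyond any consumer card:
print(fits_in_vram(27, 16, 24))   # False
```

This is why the jump from a 12–16 GB mid-range card to 24 GB is a binary usability threshold for whole model classes, not an incremental speedup.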
7. dragonwriter ◴[] No.43746780[source]
AI PCs aren't about running the kind of models that take a 3090-class GPU, or even running on GPU at all, but systems where the local end is running something like Phi-3.5-vision-instruct, on system RAM using a CPU with an integrated NPU, which is why the AI PC requirements specify an NPU, a certain amount of processing capacity, and a minimum amount of DDR5/LPDDR5 system RAM.
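As a rough illustration of why a small model like this fits the AI PC spec, the same back-of-envelope arithmetic works for system RAM (the ~4.2B parameter count, 4-bit weights, and ~30% overhead for the vision encoder and KV cache are assumptions; exact figures depend on the runtime):

```python
# Back-of-envelope system-RAM footprint for a small local model,
# using assumed figures: ~4.2B params (Phi-3.5-vision class), 4-bit
# weights, ~30% extra for vision encoder, KV cache, and buffers.
PARAMS_B = 4.2
BITS = 4
weights_gb = PARAMS_B * BITS / 8      # ≈ 2.1 GB of weights
footprint_gb = weights_gb * 1.3       # ≈ 2.7 GB total
print(round(footprint_gb, 1))         # 2.7
```

A couple of GB out of 16 GB of DDR5 leaves plenty for the OS, which is the point: the AI PC spec targets models an order of magnitude smaller than what needs a 3090-class card.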
8. NorwegianDude ◴[] No.43746973{3}[source]
3090s are no longer produced; that's why new ones are so expensive. At least here, used 3090s are around €650, and an RTX 5070 is around €625.

It's definitely not extremely high end any more; the price (at least here) is the same as the new mid-range consumer cards.

I guess the price can vary by location, but €1800 for a 3090 is crazy; that's more than the new price in 2020.

9. NorwegianDude ◴[] No.43747062{3}[source]
24 GB of VRAM is a large amount of VRAM on a consumer GPU, that I totally agree with you on. But it's definitely not an extremely high end GPU these days. It is suitable, yes, but not high end. The high end alternative for a consumer GPU would be the RTX 5090, but that is only available for €3000 now, while used 3090s are around €650.
10. sentimentscan ◴[] No.43747297{3}[source]
A year ago, I bought a brand-new EVGA hybrid-cooled 3090 Ti for 700 euros. I'm still astonished at how good of a decision it was, especially considering the scarcity of 24GB cards available for a similar price. For pure gaming, many cards perform better, but they mostly come with 12 to 16GB of VRAM.