
    Nvidia won, we all lost

    (blog.sebin-nyshkim.net)
    981 points todsacerdoti | 28 comments
    1. cherioo ◴[] No.44468628[source]
    High-end GPUs have, over the last 5 years, slowly turned from an enthusiast product into a luxury product.

    5 or maybe 10 years ago, a high-end GPU was needed to run games at reasonably eye-candy settings. In 2025, $500 mid-range GPUs are more than enough. Folks can barely tell the difference between High and Ultra settings, DLSS vs FSR, or DLSS FG vs Lossless Scaling. There's just no point competing at the $500 price point any more; Nvidia has largely given up there, ceding it to the AMD-built consoles and integrated graphics like AMD APUs, which offer good value at the low end, mid-range, and high end.

    Maybe the rumored Nvidia PC, or the Switch 2, can bring some resurgence.

    replies(7): >>44468693 #>>44468874 #>>44469261 #>>44469281 #>>44469585 #>>44470770 #>>44473016 #
    2. ohdeargodno ◴[] No.44468693[source]
    Not quite $500, but at $650, the 9070 is an absolute monster that outperforms Nvidia's equivalent cards in everything but ray tracing (which you can only turn on with full DLSS framegen and get a blobby mess anyways)

    AMD is truly making excellent cards, and with a bit of luck UDNA is even better. But they're in the same situation as Nvidia: they could sell 200 GPUs, ship drivers, maintain them, deal with returns and make $100k... Or just sell a single MI300X to a trusted partner that won't make any waves and still make $100k.

    Wafer availability unfortunately rules all, and as it stands, we're lucky neither of them have abandoned their gaming segments for massively profitable AI things.

    replies(2): >>44468760 #>>44468767 #
    3. enraged_camel ◴[] No.44468760[source]
    I have a 2080 that I'm considering upgrading but not sure which 50 series would be the right choice.
    replies(3): >>44469040 #>>44469527 #>>44470671 #
    4. cosmic_cheese ◴[] No.44468767[source]
    Some models of 9070 use the well-proven old style PCI-E power connectors too, which is nice. As far as I'm aware none of the current AIB midrange or high end Nvidia cards do this.
    replies(1): >>44470296 #
    5. dukeyukey ◴[] No.44468874[source]
    I bought a new machine with an RTX 3060 Ti back in 2020 and it's still going strong, no reason to replace it.
    replies(1): >>44470320 #
    6. thway15269037 ◴[] No.44469040{3}[source]
    Grab a used/refurb 3090 then. Probably as legendary a card as the 1080 Ti.
    replies(1): >>44469306 #
    7. gxs ◴[] No.44469261[source]
    I think this is part of an even broader trend

    In their never ending quest to find ways to suck more money out of people, one natural extension is to just turn the thing into a luxury good and that alone seems to justify the markup

    This is why new home construction is expensive - the layout of a home doesn’t change much but it’s trivial to throw on some fancy fixtures and slap the deluxe label on the listing.

    Or take a Toyota, slap some leather seats on it, call it a Lexus and mark up the price 40% (I get that these days there are more meaningful differences but the point stands)

    This and turning everything into subscriptions alone are responsible for 90% of the issues I have as a consumer

    Graphics cards seem to be headed in this direction as well - breaking through that last ceiling for maximum fps is going to be like buying a Bentley (if it isn't already), whereas before it was just opting for the V8

    replies(1): >>44469343 #
    8. Tadpole9181 ◴[] No.44469281[source]
    Just going to focus on this one:

    > DLSS vs FSR, or DLSS FG and Lossless Scaling.

    I've used all of these (at 4K, 120hz, set to "balanced") since they came out, and I just don't understand how people say this.

    FSR is a vaseline-like mess to me, it has its own distinct blurriness. Not as bad as naive upscaling, and I'll use it if no DLSS is available and the game doesn't run well, but it's distracting.

    Lossless is borderline unusable. I don't remember the algorithm's name, but it has a blur similar to FSR. It cannot handle text or UI elements without artifacting (because it's not integrated in the engine, those don't get rendered at native resolution). The frame generation causes almost everything to have a ghost or afterimage - UI elements and the reticle included. It can also reduce your framerate because it's not as optimized. On top of that, the way the program works interferes with HDR pipelines. It is a last resort.

    DLSS (3) is, by a large margin, the best offering. It just works and I can't notice any cons. Older versions did have ghosting, but it's been fixed. And I can retroactively fix older games by just swapping the DLL (there's a tool for this on GitHub, actually). I have not tried DLSS 4.
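The DLL swap mentioned above can be sketched roughly like this (a hedged illustration, not the tool's actual code: everything happens in a throwaway temp dir, and the paths are stand-ins; real games keep nvngx_dlss.dll somewhere under their install folder, and tools like DLSS Swapper automate the process):

```python
# Hypothetical sketch of the DLSS DLL-swap trick. Paths are stand-ins
# built in a temp dir; real install paths vary per title.
import shutil
import tempfile
from pathlib import Path

game_dir = Path(tempfile.mkdtemp())          # stand-in for a game's install dir
target = game_dir / "nvngx_dlss.dll"
target.write_text("old dlss build")          # pretend this is the bundled DLL

new_dll = game_dir / "nvngx_dlss_newer.dll"  # pretend this was downloaded
new_dll.write_text("new dlss build")

backup = target.with_name(target.name + ".bak")
shutil.copy2(target, backup)                 # always keep a backup first
shutil.copy2(new_dll, target)                # swap in the newer version
```

Keeping the `.bak` copy means the game can always be restored to its shipped DLL if the newer one misbehaves.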

    replies(2): >>44469369 #>>44469636 #
    9. k12sosse ◴[] No.44469306{4}[source]
    Just pray that it's a 3090 under that lid when you buy it second hand
    10. bigyabai ◴[] No.44469343[source]
    Nvidia's been doing this for a while now, since at least the Titan cards and technically the SLI/Crossfire craze too. If you sell it, egregiously-compensated tech nerds will show up with a smile and a wallet large enough to put a down-payment on two of them.

    I suppose you could also blame the software side, for adopting compute-intensive ray tracing features or getting lazy with upscaling. But PC gaming has always been a luxury market, at least since "can it run Crysis/DOOM" was a refrain. The homogeneity of a console lineup hasn't ever really existed on PC.

    11. paulbgd ◴[] No.44469369[source]
    I’ve used FSR 4 and DLSS 4; I’d say FSR 4 is a bit ahead of DLSS 3 but behind DLSS 4. No more vaseline smear
    12. magicalhippo ◴[] No.44469527{3}[source]
    I went from a 2080 Ti to a 5070 Ti. Yes it's faster, but for the games I play, not dramatically so. Certainly not what I'm used to doing such a generational leap. The 5070 Ti is noticeably faster at local LLMs, and has a bit more memory which is nice.

    I went with the 5070 Ti since the 5080 didn't seem like a real step up, and the 5090 was just too expensive and wasn't in stock for ages.

    If I had a bit more patience, I would have waited till the next node refresh, or for the 5090. Coming from a 2080, I don't think any of the current 50-series cards besides the 5090 are worth it. And by worth it I mean will give you a big boost in performance.

    13. piperswe ◴[] No.44469585[source]
    10 years ago, $650 would buy you a top-of-the-line gaming GPU (GeForce GTX 980 Ti). Nowadays, $650 might get you a mid-range RX 9070 XT if you miraculously find one near MSRP.
    replies(4): >>44469658 #>>44469670 #>>44469685 #>>44476352 #
    14. cherioo ◴[] No.44469636[source]
    Maybe I exaggerated, but I was dumbfounded reading people’s reactions to Lossless Scaling https://www.reddit.com/r/LinusTechTips/s/wlaoHl6GAS

    Most people either can’t tell the difference, don’t care about the difference, or both. Similar discourse can be found about FSR, frame drop, and frame stutter. I have conceded that most people do not care.

    15. ksec ◴[] No.44469658[source]
    That is $880 in today's terms. And in 2015 Apple was already shipping a 16nm SoC. The GeForce GTX 980 Ti was still on 28nm, two node generations behind.
    16. conception ◴[] No.44469670[source]
    Keeping up with inflation ($650 then is about $880 now), it’d get you a 5070 Ti.
    replies(1): >>44472371 #
    17. wasabi991011 ◴[] No.44469685[source]
    $650 of 2015 USD is around $875 of 2025 USD fwiw
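The inflation math in the comments above can be sketched quickly (the CPI index values here are approximate assumptions for illustration, not official figures):

```python
# Rough CPI-U adjustment of a 2015 GPU price into 2025 dollars.
CPI_2015 = 237.0   # ~annual average CPI-U, 2015 (assumption)
CPI_2025 = 320.0   # ~CPI-U, 2025 (assumption)

price_2015 = 650   # GTX 980 Ti launch price
price_2025 = price_2015 * CPI_2025 / CPI_2015
print(round(price_2025))   # ~878, in line with the ~$875-880 cited above
```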
    18. Henchman21 ◴[] No.44470296{3}[source]
    As I understand it, for the 50-series nvidia requires the 12VHPWR connector
    19. rf15 ◴[] No.44470320[source]
    same, 2080 Super here, I even do AI with it
    20. Rapzid ◴[] No.44470671{3}[source]
    I went from a 3070 to 5070 Ti and it's fantastic. Just finished Cyberpunk Max'd out at 4k with DLSS balanced, 2x frame gen, and reflex 2. Amazing experience.
    21. datagram ◴[] No.44470770[source]
    The fact that we're calling $500 GPUs "midrange" is proof that Nvidia's strategy is working.
    replies(2): >>44471170 #>>44472153 #
    22. WithinReason ◴[] No.44471170[source]
    What strategy? They charge more because manufacturing costs are higher: cost per transistor hasn't changed much since 28nm [0], but chips have more and more transistors. What do you think that does to the price?

    [0]: https://www.semiconductor-digest.com/moores-law-indeed-stopp...
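The argument can be put in toy numbers (the transistor counts are published die specs; the flat cost-per-transistor figure is purely an illustrative assumption):

```python
# If cost per transistor is roughly flat, die cost scales with transistor count.
COST_PER_TRANSISTOR = 1e-9    # dollars; illustrative assumption, not a foundry quote

gm200_transistors = 8.0e9     # GTX 980 Ti (GM200, 28nm, 2015)
gb203_transistors = 45.6e9    # RTX 5070 Ti (GB203, 2025)

cost_2015 = gm200_transistors * COST_PER_TRANSISTOR
cost_2025 = gb203_transistors * COST_PER_TRANSISTOR
print(cost_2025 / cost_2015)  # ~5.7x the silicon cost at flat cost per transistor
```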

    replies(1): >>44473732 #
    23. blueboo ◴[] No.44472153[source]
    I think my TNT2 Ultra was $200. But Nvidia had dozens of competitors back then. 89 when it was founded! Now: AMD…
    24. orphea ◴[] No.44472371{3}[source]
    > 5070 Ti

    Which, performance-wise, is a 60 Ti-class card.
    25. luisgvv ◴[] No.44473016[source]
    Absolutely right, only AAA games get to showcase the true power of GPUs.

    For cheaper guys like me, I'll just give my son indie and low-graphics games, which he enjoys

    26. NooneAtAll3 ◴[] No.44473732{3}[source]
    strategy of marketing an expensive product as a normal one? obviously?

    if your product can't be cheap, your product is a luxury, not a day-to-day one

    replies(1): >>44474028 #
    27. WithinReason ◴[] No.44474028{4}[source]
    It's mid range. The range shifted.
    28. AngryData ◴[] No.44476352[source]
    I don't know how you can consider a 9070 XT a midrange card, it is AMD's second best card in benchmarks and only came out 5 months ago.