
Nvidia won, we all lost

(blog.sebin-nyshkim.net)
977 points by todsacerdoti | 9 comments
1. mcdeltat ◴[] No.44470051[source]
Anyone else getting a bit disillusioned with the whole tech hardware improvements thing? Seems like every year we get less improvement for higher cost, and the use cases become less useful. Like the whole industry is becoming a rent-seeking exercise with diminishing returns. I used to follow hardware improvements and now largely don't, because I realised I (and probably most of us) don't need it.

It's staggering that we are throwing so many resources at marginal improvements for things like gaming, and I say that as someone whose main hobby used to be gaming. Ray tracing, path tracing, DLSS, etc. at a price point of $3000 just for the GPU - who cares, when a 2010 cel-shaded game running on an upmarket toaster gave me the utmost joy? And the AI use cases don't impress me either - it seems like all we do each generation is burn more power to shove more data through and pray for an improvement (collecting sweet $$$ in the meantime).

Another commenter here said it well, there's just so much more you can do with your life than follow along with this drama.

replies(5): >>44470105 #>>44470127 #>>44470261 #>>44471393 #>>44478345 #
2. bamboozled ◴[] No.44470105[source]
I remember when each generation was a serious difference; the jump from PS1 to PS3 was absolutely miraculous and exciting to watch.

It's also funny that no matter how fast the hardware gets, we seem to fill it up with shitty bloated software.

replies(1): >>44472649 #
3. philistine ◴[] No.44470127[source]
Your disillusionment is warranted, but I'll say that on the Mac side the grass has never been greener. The M chips are screamers year after year, the GPUs are getting ok, the ML cores are incredible and actually useful.
replies(2): >>44471707 #>>44478351 #
4. seydor ◴[] No.44470261[source]
Our stock investments are going up, so... what can we do other than shrug?
5. keyringlight ◴[] No.44471393[source]
What stands out to me is that it's not just the hardware side: the software production needed to realize the benefits on offer doesn't seem to be running smoothly either, at least for gaming. I'm not sure nvidia really cares too much, though, since there's no market pressure on them where this weakness lies; if consumer GPUs disappeared tomorrow they'd be fine.

A few months ago Jensen Huang said he sees quantum computing as the next big thing he wants nvidia to be a part of over the next 10-15 years (a timeline similar to GPU compute's), so I don't think consumer GPUs are a priority for anyone. Gaming used to be the main objective, with professional usage as a byproduct; for the past few years that's been reversed, with gaming piggybacking on whatever it shares with compute.

6. mcdeltat ◴[] No.44471707[source]
Good point, we should commend genuinely novel efforts to make baseline computation more efficient, like Apple's, as you say. Particularly in light of recent x86 development, which seems to be "shove as many cores as possible on a die and heat your apartment while your power supply combusts" (meanwhile the software gets less efficient by the day, but that's another thing altogether...). ANY DAY of the week I will take a compute platform that's no-bs, no-bells-and-whistles, simply more efficient, without the manufacturer trying to blow smoke up our asses.
7. mcdeltat ◴[] No.44472649[source]
IMO at some point in the history of software we lost track of hardware capabilities versus software end outcomes. Hardware improved by many orders of magnitude, but overall software quality/usefulness/efficiency did not (yes, this is a hill I will die on). We've ended up with mostly garbage and the occasional legitimately brilliant use of transistors.
9. hot_gril ◴[] No.44478351[source]
Yeah, going from Intel to M1 was a huge improvement, but not in every way. So now they're closing all the other gaps, and it's getting even better.