
Nvidia won, we all lost

(blog.sebin-nyshkim.net)
977 points by todsacerdoti | 7 comments
__turbobrew__ ◴[] No.44468824[source]
> With over 90% of the PC market running on NVIDIA tech, they’re the clear winner of the GPU race. The losers are every single one of us.

I have been rocking AMD GPU ever since the drivers were upstreamed into the linux kernel. No regrets.

I have also realized that there is a lot out there in the world besides video games, and getting all in a huff about it isn’t worth my time or energy. But consumer gotta consoooooom and then cry and outrage when they are exploited instead of just walking away and doing something else.

Same with magic the gathering, the game went to shit and so many people got outraged and in a big huff but they still spend thousands on the hobby. I just stopped playing mtg.

replies(22): >>44468885 #>>44468985 #>>44469036 #>>44469146 #>>44469164 #>>44470357 #>>44470480 #>>44470607 #>>44471458 #>>44471685 #>>44471784 #>>44471811 #>>44472146 #>>44472400 #>>44473527 #>>44473828 #>>44473856 #>>44476633 #>>44485501 #>>44487391 #>>44489487 #>>44493815 #
darkoob12 ◴[] No.44470357[source]
I am not a gamer and don't know why AMD GPUs aren't good enough. It's weird since both Xbox and PlayStation are using AMD GPUs.

I guess there are games that you can only play on PC with Nvidia graphics. That raises the question of why someone would create a game and ignore the large console market.

replies(9): >>44470371 #>>44470393 #>>44470614 #>>44470709 #>>44471156 #>>44472918 #>>44473655 #>>44476308 #>>44479276 #
1. npteljes ◴[] No.44471156[source]
What I experienced is that AI is a nightmare on AMD in Linux. There is a myriad of custom things that one needs to do, and even that just breaks after a while. Happened so much on my current setup (6600 XT) that I don't bother with local AI anymore, because the time investment is just not worth it.

It's not that I can't live like this, I still have the same card, but if I were looking to do anything AI locally with a new card, for sure it wouldn't be an AMD one.
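
For illustration, the kind of minimal sanity check I mean, before even touching any web UI (this assumes a ROCm build of PyTorch; the HSA_OVERRIDE_GFX_VERSION value is what community guides usually suggest for RDNA2 cards like the 6600 XT, so treat it as an assumption, not a guarantee):

    # Sketch: verify the ROCm build of PyTorch can see the card at all.
    # Assumes a ROCm (not CUDA) wheel of torch is installed.
    import os

    # Commonly suggested workaround for RDNA2 cards (e.g. 6600 XT) that ROCm
    # doesn't officially support -- must be set before torch initializes the runtime.
    os.environ.setdefault("HSA_OVERRIDE_GFX_VERSION", "10.3.0")

    import torch

    print("HIP runtime:", torch.version.hip)          # None means this is a CUDA build, not ROCm
    print("GPU visible:", torch.cuda.is_available())  # ROCm devices show up through the cuda API
    if torch.cuda.is_available():
        print("Device:", torch.cuda.get_device_name(0))

If that already fails or flips between driver updates, nothing built on top of it is going to be stable either.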

replies(3): >>44471354 #>>44473307 #>>44473336 #
2. eden-u4 ◴[] No.44471354[source]
I don't have much experience with ROCm for large training runs, but NVIDIA is still shit with matching the driver, the CUDA version, and everything else. The only simplification comes from Ubuntu and other distros that already do the heavy lifting by installing all required components, without much configuration.
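
As a rough sketch of the mismatch I mean (assuming torch and the NVIDIA driver are installed; the nvidia-smi query flags are standard), this is enough to show whether the driver and the CUDA runtime torch was built against actually line up:

    # Sketch: check whether the installed driver and the CUDA build of torch match.
    import subprocess
    import torch

    driver = subprocess.run(
        ["nvidia-smi", "--query-gpu=driver_version", "--format=csv,noheader"],
        capture_output=True, text=True,
    ).stdout.strip()

    print("Driver version:      ", driver)
    print("Torch built for CUDA:", torch.version.cuda)
    print("GPU usable:          ", torch.cuda.is_available())

When the last line is False despite a working driver, it's almost always this version juggling.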
replies(2): >>44471743 #>>44474569 #
3. npteljes ◴[] No.44471743[source]
Oh I'm sure. The thing is that with AMD I have the same luxury, and the wretched thing still doesn't work, or has regressions.
4. phronimos ◴[] No.44473307[source]
Are you referring to AI training, prediction/inference, or both? Could you give some examples for what had to be done and why? Thanks in advance.
replies(1): >>44473522 #
5. FredPret ◴[] No.44473336[source]
I set up a deep learning station probably 5-10 years ago and ran into the exact same issue. After a week of pulling out my hair, I just bought an Nvidia card.
6. npteljes ◴[] No.44473522[source]
Sure! I'm referring to setting up a1111's stable diffusion webui, and setting up Open WebUI.

Wrt/ a1, it worked at one point (a year ago) after 2-3 hours of tinkering, then regressed to not working at all, not even from fresh installs on new, different Linuxes. I tried the main branch and the AMD-specific fork as well.

Wrt/ Open WebUI, it works, but the thing uses my CPU.
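
A crude way to see whether compute is actually landing on the GPU rather than the CPU, assuming a PyTorch-based stack like a1111 (Open WebUI's usual backends may not be PyTorch at all, so this is just a sketch): time the same matmul on both devices.

    # Sketch: rough check that the GPU path works and is actually faster than CPU.
    # Assumes a ROCm or CUDA build of torch; on ROCm the device is still named "cuda".
    import time
    import torch

    def bench(device: str, n: int = 4096) -> float:
        x = torch.randn(n, n, device=device)
        if device != "cpu":
            torch.cuda.synchronize()
        start = time.perf_counter()
        _ = x @ x
        if device != "cpu":
            torch.cuda.synchronize()
        return time.perf_counter() - start

    print("cpu:", bench("cpu"))
    if torch.cuda.is_available():
        bench("cuda")                  # warm-up: first call pays one-time launch overhead
        print("gpu:", bench("cuda"))   # should be far faster if the GPU is really doing the work

If the GPU number isn't dramatically better (or the branch never runs), whatever is serving the model is falling back to CPU, which matches what I'm seeing.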

7. int_19h ◴[] No.44474569[source]
On Ubuntu, in my experience, installing the .deb version of the CUDA toolkit pretty much "just works".