
34 points CHEF-KOCH | 6 comments
1. speedgoose No.44407998
A maximum of 8GB of VRAM in 2025 is quite limiting. You can only run the smallest large language models on those laptops.
replies(2): >>44408124, >>44409041
2. SlowTao No.44408124
Maybe this laptop is not for you then. That's cool.

But if you don't need to run LLMs, these will be fine.

replies(1): >>44408275
3. speedgoose No.44408275
Yes, I guess some could also be fine with a privacy-focused mechanical typewriter.

When Apple and AMD sell very decent laptop hardware (M4 or Ryzen AI Max), I find the Intel/Nvidia combo with only 8GB a bit conservative.

4. marssaxman No.44409041
Your world must be so different from mine! It would never occur to me to care about such a thing.

I was content with 8GB of main RAM in my dev laptop until about a year ago.

replies(1): >>44409267
5. exiguus No.44409267
The VRAM, for example, limits which local LLMs you can run. A rough rule of thumb: 1GB of VRAM per 1B parameters, which holds for 8-bit quantized weights, since each parameter then takes one byte (see the sketch below). There are ways to work around this, but the easiest path to good performance is simply having enough VRAM.
replies(1): >>44410285
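
A minimal back-of-the-envelope version of that rule of thumb, as a Python sketch. The 8-bit weights and the ~20% overhead factor for KV cache and activations are illustrative assumptions, not hard numbers:

    # Rough VRAM estimate for running a local LLM.
    # Assumptions (illustrative): 8-bit quantized weights (1 byte per
    # parameter) plus ~20% overhead for the KV cache and activations.
    def estimate_vram_gb(params_billions: float,
                         bytes_per_param: float = 1.0,
                         overhead: float = 1.2) -> float:
        """Approximate VRAM needed, in GB."""
        return params_billions * bytes_per_param * overhead

    if __name__ == "__main__":
        for size_b in (3, 7, 13, 70):
            print(f"{size_b:>3}B params -> ~{estimate_vram_gb(size_b):.1f}GB VRAM")

By this estimate, an 8GB card tops out around a 7B-parameter model at 8-bit quantization, which is roughly the "smallest large language models" point from the top comment.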
6. marssaxman No.44410285
If running local LLMs were a thing I ever wanted to do, I'm sure that would be something that mattered to me.