
343 points by sillysaurusx | 3 comments
1. yumraj No.35032931
What’s the minimum single GPU that’ll work for the smallest model?
replies(2): >>35033135 >>35033258
2. downvotetruth No.35033135
3060 12GB
3. zargon No.35033258
This Reddit post says the 7B model consumes about 9.7 GB of VRAM (using int8). I'm sure people will very soon add support for using system RAM as swap space, which would let you run it on an 8 GB card, though with a fairly hefty performance penalty.

https://www.reddit.com/r/MachineLearning/comments/11h3p2x/d_...
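For reference, a minimal sketch of what int8 loading with CPU-offload looks like using Hugging Face transformers + accelerate + bitsandbytes; the model id and memory caps here are illustrative assumptions, not something from the thread:

```python
# Sketch: load a 7B model in int8, capping VRAM use and spilling the
# remaining layers to system RAM. Assumes transformers, accelerate,
# and bitsandbytes are installed.
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "decapoda-research/llama-7b-hf"  # illustrative model id, not from the thread

quant_config = BitsAndBytesConfig(
    load_in_8bit=True,                     # int8 quantization via bitsandbytes
    llm_int8_enable_fp32_cpu_offload=True, # allow offloaded layers to live on CPU
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=quant_config,
    device_map="auto",                        # let accelerate place layers on GPU/CPU
    max_memory={0: "8GiB", "cpu": "24GiB"},   # assumed caps: 8 GB card, rest in system RAM
)

prompt = "The capital of France is"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

Layers placed on CPU by `device_map="auto"` get run there too, which is where the hefty performance penalty comes from: every forward pass shuttles activations between system RAM and VRAM.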