
602 points by emrah | 1 comment
wtcactus No.43743666
They keep mentioning the RTX 3090 (with 24 GB VRAM), but the model is only 14.1 GB.

Shouldn’t it fit on a 5060 Ti (16 GB), for instance?

replies(3): >>43743691 #>>43743768 #>>43747505 #
1. oktoberpaard No.43743768
With a 128K context length and an 8-bit KV cache, the 27B model occupies 22 GiB on my system. With a smaller context length, you should be able to fit it on a 16 GiB GPU.
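
The gist is that the 14.1 GB figure covers the weights only; the KV cache grows with context length and sits on top of that. Below is a minimal back-of-the-envelope estimator in Python. The layer/head/dim numbers are illustrative placeholders rather than confirmed Gemma 3 27B values, and Gemma 3 reportedly interleaves local sliding-window layers with global-attention layers, so its actual cache is well below this naive full-attention estimate.

    # Rough KV-cache size estimator (back-of-the-envelope, not exact).
    # The architecture numbers below are illustrative assumptions, not
    # official Gemma 3 27B values -- substitute the real config for a
    # precise figure.

    def kv_cache_bytes(num_layers: int,
                       num_kv_heads: int,
                       head_dim: int,
                       context_len: int,
                       bytes_per_value: int = 1) -> int:
        """Memory for the K and V tensors across all layers.

        The factor of 2 accounts for storing both keys and values.
        bytes_per_value = 1 for an 8-bit cache, 2 for fp16/bf16.
        """
        return 2 * num_layers * num_kv_heads * head_dim * context_len * bytes_per_value

    if __name__ == "__main__":
        GIB = 1024 ** 3

        # Assumed example config for a ~27B GQA model (placeholder values).
        layers, kv_heads, head_dim = 62, 16, 128

        for ctx in (8_192, 32_768, 131_072):
            size = kv_cache_bytes(layers, kv_heads, head_dim, ctx, bytes_per_value=1)
            print(f"context {ctx:>7}: ~{size / GIB:.1f} GiB KV cache (8-bit, full attention)")

Plugging in a smaller context window cuts the cache by several GiB, which is why shrinking the context is the usual way to squeeze the model plus its cache onto a 16 GiB card.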