
1311 points by msoad | 1 comment
brucethemoose2 No.35393393
Does that also mean 6GB VRAM?

And does that include Alpaca models like this? https://huggingface.co/elinas/alpaca-30b-lora-int4

replies(2): >>35393441 >>35393450
1. terafo No.35393441
No (llama.cpp is CPU-only) and no (you would need to requantize the model).
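
For context, a minimal sketch of what requantizing for llama.cpp looked like around that time, assuming you start from the original fp16 LLaMA/Alpaca weights rather than the GPTQ int4 checkpoint linked above (exact script names, paths, and flags have changed across llama.cpp versions, so treat this as illustrative):

    # convert the original fp16 weights to llama.cpp's ggml format
    # (the trailing 1 selected f16 output in the early convert script)
    python3 convert-pth-to-ggml.py models/30B/ 1

    # quantize the f16 ggml file down to 4-bit
    # (the trailing 2 selected the q4_0 quantization type)
    ./quantize models/30B/ggml-model-f16.bin models/30B/ggml-model-q4_0.bin 2

The point is that llama.cpp reads its own ggml quantization formats, so a GPTQ int4 checkpoint isn't directly usable without going through a conversion/quantization step like this.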