426 points | benchmarkist | 1 comment
germanjoey No.42179744
Pretty amazing speed, especially considering this is bf16. But how many racks is this using? They used 4 racks for 70B, so this is, what, at least 24? A whole data center for one model?!
replies(1): >>42182387 #
1. aurareturn No.42182387
Each Cerebras wafer-scale chip has 44 GB of SRAM, and you need roughly 972 GB of memory to run Llama 405B at fp16, so you need at least 23 of them (972 / 44 ≈ 22.1).

I assume they're achieving this speed by keeping everything in SRAM, not HBM.
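
The memory arithmetic in the comment above can be sketched as follows. This is only an estimate, not Cerebras's actual partitioning scheme: the 2 bytes/parameter reflects fp16/bf16 weights, and the `overhead` multiplier is an assumed headroom factor for KV cache and activations (note that 810 GB of weights with ~20% overhead lands near the 972 GB figure quoted in the thread).

```python
import math

def wafers_needed(n_params: float, bytes_per_param: int,
                  sram_per_wafer_gb: float, overhead: float = 1.0) -> int:
    """Minimum wafer count whose combined SRAM holds the model.

    overhead is a hypothetical multiplier for KV cache / activation
    memory; overhead=1.0 counts the weights alone.
    """
    weight_gb = n_params * bytes_per_param / 1e9
    return math.ceil(weight_gb * overhead / sram_per_wafer_gb)

# Llama 3.1 405B at fp16/bf16 (2 bytes per parameter), 44 GB SRAM per wafer:
print(wafers_needed(405e9, 2, 44))                # weights alone (810 GB): 19 wafers
print(wafers_needed(405e9, 2, 44, overhead=1.2))  # with 20% headroom (972 GB): 23 wafers
```

Weights alone (405B × 2 B ≈ 810 GB) would fit in 19 wafers; the 972 GB estimate pushes the minimum to 23.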