
426 points | benchmarkist | 2 comments
1. germanjoey No.42179744
Pretty amazing speed, especially considering this is bf16. But how many racks is this using? They used 4 racks for 70B, so this, what, at least 24? A whole data center for one model?!
replies(1): >>42182387
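
A back-of-envelope sketch of the rack estimate above, in Python. The 4-racks-for-70B figure comes from the comment; linear scaling of rack count with parameter count is an assumption:

    import math

    # Figures from the comment above; linear scaling is an assumption.
    racks_for_70b = 4
    params_70b = 70e9
    params_405b = 405e9

    # Naive linear scaling: racks grow proportionally with parameters.
    racks_405b = math.ceil(racks_for_70b * params_405b / params_70b)
    print(racks_405b)  # 24, matching the "at least 24" estimate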
2. aurareturn No.42182387
Each Cerebras wafer-scale chip has 44 GB of SRAM. You need roughly 972 GB of memory to run Llama 405B at fp16 (405B params × 2 bytes = 810 GB of weights, plus ~20% overhead), so you need at least 23 of these (972 / 44 ≈ 22.1).

I assume they're serving entirely from SRAM, not HBM, to achieve this speed.
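
A minimal sketch of the memory arithmetic above, assuming 2 bytes per parameter at fp16 and ~20% overhead for activations/KV cache (which is what yields the 972 GB figure):

    import math

    params = 405e9           # Llama 405B parameter count
    bytes_per_param = 2      # fp16
    overhead = 1.2           # assumed ~20% on top of raw weights

    mem_gb = params * bytes_per_param * overhead / 1e9
    wafers = math.ceil(mem_gb / 44)   # 44 GB of SRAM per wafer-scale chip

    print(mem_gb, wafers)    # 972.0 GB -> 23 wafers (972 / 44 ≈ 22.1)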