david927 | 261 points

What are you working on? Any new ideas that you're thinking about?
AJRF
I recently made a little tool for people interested in running local LLMs: it checks whether their hardware can fit a given model in GPU memory.

https://canirunthisllm.com/
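
For readers curious what such a check involves, here is a minimal sketch of the kind of VRAM estimate a tool like this might make: the weights at a given quantization plus some overhead for the KV cache and activations, compared against the GPU's available memory. The function names and the 1.2x overhead factor are illustrative assumptions, not the site's actual implementation.

    # Hypothetical sketch (not the site's actual code): estimate the memory a
    # model needs and compare it with the GPU's available VRAM.

    def estimate_model_vram_gb(params_billions, bits_per_weight=16,
                               overhead_factor=1.2):
        """Rough footprint: weights plus ~20% for KV cache and activations."""
        weight_gb = params_billions * bits_per_weight / 8  # GB for weights alone
        return weight_gb * overhead_factor

    def can_run(params_billions, vram_gb, bits_per_weight=4):
        """True if the estimated footprint fits in the given VRAM."""
        return estimate_model_vram_gb(params_billions, bits_per_weight) <= vram_gb

    # e.g. a 7B model quantized to 4 bits on a 12 GB GPU
    print(can_run(7, vram_gb=12, bits_per_weight=4))   # True
    print(can_run(70, vram_gb=12, bits_per_weight=4))  # False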

jakubmazanec
It doesn't work for all GPUs/devices in the Simple tab: "Exception: Failed to calculate information for model. Error: Could not extract VRAM from: System Shared".
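
The error suggests the VRAM parser expects a numeric value and fails on integrated GPUs, which report their memory as "System Shared". A minimal sketch of one possible fallback is below; the parse_vram_gb helper and the "half of system RAM" heuristic are assumptions for illustration, not the site's actual behavior.

    import re

    def parse_vram_gb(raw, system_ram_gb=None):
        """Parse a VRAM string like '8 GB'; fall back for shared-memory GPUs."""
        match = re.search(r"([\d.]+)\s*GB", raw, re.IGNORECASE)
        if match:
            return float(match.group(1))
        if "shared" in raw.lower() and system_ram_gb is not None:
            # Integrated GPUs report 'System Shared'; assume roughly half of
            # system RAM is usable. An illustrative guess, not the site's logic.
            return system_ram_gb / 2
        raise ValueError(f"Could not extract VRAM from: {raw}")

    print(parse_vram_gb("8 GB"))                             # 8.0
    print(parse_vram_gb("System Shared", system_ram_gb=32))  # 16.0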
AJRF
Ah sorry, I will fix that.