Incredible hardware. Love that I can also run local LLMs on mine. https://github.com/Aider-AI/aider/issues/4526
replies(3):
But yeah, if you want to run 600B+ parameter models, you're going to need an insane setup to run them locally.
Anyway, Apple's M-series SoCs have a huge advantage thanks to unified memory: VRAM size == RAM size, so if you buy an M chip with 128+ GB of memory, you can pretty much run SOTA models locally, and the price is significantly lower than dedicated AI GPUs.
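For a rough sense of what fits in that unified memory, a back-of-the-envelope sketch (weight memory only; real runtimes like llama.cpp also need room for the KV cache and runtime overhead, and the function name here is just illustrative):

```python
def model_memory_gb(params_billion: float, bits_per_weight: int) -> float:
    """Approximate weight memory in GB: parameter count * bits per weight / 8."""
    return params_billion * 1e9 * bits_per_weight / 8 / 1e9

# A 70B model quantized to 4 bits needs roughly 35 GB just for weights,
# so it fits comfortably in a 128 GB unified-memory machine:
print(model_memory_gb(70, 4))   # 35.0

# A 600B model even at 4-bit quantization needs ~300 GB --
# beyond any single 128 GB M-series machine:
print(model_memory_gb(600, 4))  # 300.0
```

That's why the 128+ GB configurations comfortably handle quantized ~70B-class models, while the 600B+ class stays out of reach on a single consumer machine.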