Can't you run small LLMs on, like, a MacBook Air M1? Some models are under 1B parameters; they'll be almost useless, but I imagine you could run them on anything from the last 10 years.
But yeah, if you wanna run 600B+ parameter models you're going to need an insane setup to run them locally.
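The back-of-envelope math backs this up: weight memory is roughly parameter count times bytes per weight. A quick sketch (my own helper, ignoring KV cache and runtime overhead):

```python
def model_ram_gb(n_params: float, bits_per_weight: int) -> float:
    """Rough RAM needed just for the weights, in GB (ignores KV cache/overhead)."""
    return n_params * bits_per_weight / 8 / 1e9

# A 0.5B model at 4-bit quantization fits easily in an 8 GB M1 Air:
print(model_ram_gb(0.5e9, 4))   # 0.25 GB
# A 600B model even at 4-bit needs ~300 GB just for weights:
print(model_ram_gb(600e9, 4))   # 300.0 GB
```

So the tiny models run on basically anything, while the 600B+ ones need hundreds of GB of fast memory before you've generated a single token.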