LLaMA.go - open-source framework for LLM inference on regular CPUs [0]
It took me about a month of full-time, hard coding, day and night including weekends, to finally build a solid piece of software that can handle some crazy CPU-bound tensor math workloads.