
577 points | simonw | 1 comment
alankarmisra No.44724123
I see the value in showcasing that LLMs can run locally on laptops — it’s an important milestone, especially given how difficult that was before smaller models became viable.

That said, for something like this, I'd probably get more out of simply finding and downloading an existing implementation on GitHub or the like.

When it comes to specialized, narrow domains like Space Invaders, the training set is likely to be extremely small, and the model's vector space will have limited room to generalize. You'll get code that is more or less identical to the original source, you have to wait for the model to 'type' that code out, and the value added seems very low. I would rather ask it to point me to known Space Invaders implementations in language X on GitHub (or just search there directly).
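The "just search GitHub" route is easy to script against GitHub's public repository-search API. A minimal sketch (the helper name and query terms are my own; unauthenticated requests are subject to GitHub's rate limits):

```python
import urllib.parse
import urllib.request
import json

def github_search_url(language: str) -> str:
    # Build a repository-search query for Space Invaders implementations
    # in the given language, sorted by stars.
    query = urllib.parse.urlencode({
        "q": f"space invaders language:{language}",
        "sort": "stars",
        "per_page": "5",
    })
    return f"https://api.github.com/search/repositories?{query}"

url = github_search_url("python")
print(url)

# Fetching the results requires network access:
# with urllib.request.urlopen(url) as resp:
#     for repo in json.load(resp)["items"]:
#         print(repo["full_name"], repo["stargazers_count"])
```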

Note that ChatGPT gets very nervous when I ask it to clean up the grammar of this comment: it badly wants me to stress that LLMs don't memorize and that overfitting is very unlikely. I believe neither claim.

replies(4): >>44724177 #>>44724233 #>>44724313 #>>44724559 #
1. No.44724233