217 points by amazonhut | 2 comments

Snuggly73 No.45248425
Congrats - there is one very small problem with the LLM: it's reusing the same transformer block, when you want a separate instance of it for each layer.
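
Roughly, in PyTorch terms (illustrative names and dims, not the author's actual code): appending the same block object N times ties the weights across every layer, whereas constructing a fresh instance per layer gives each its own parameters.

    import torch.nn as nn

    class Block(nn.Module):
        """Stand-in for a transformer block (attention + MLP)."""
        def __init__(self, dim):
            super().__init__()
            self.mlp = nn.Linear(dim, dim)

        def forward(self, x):
            return x + self.mlp(x)

    # Bug: one instance reused N times -> all "layers" share weights.
    shared = Block(64)
    tied = nn.Sequential(*[shared for _ in range(4)])

    # Intended: a fresh instance per layer -> independent weights.
    stack = nn.Sequential(*[Block(64) for _ in range(4)])

    # parameters() dedupes shared tensors, so the tied stack reports
    # the parameter count of a single block.
    print(sum(p.numel() for p in tied.parameters()))   # 4160
    print(sum(p.numel() for p in stack.parameters()))  # 16640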

It's a very cool exercise. I did the same with Zig and MLX a while back to get a nice foundation, but I got hooked, kept adding stuff to it, and eventually switched to PyTorch/Transformers.

replies(1): >>45248449 #
1. icemanx No.45248449
Correction: it's a cool exercise if you write it yourself and don't use GPT.
replies(1): >>45248481 #
2. Snuggly73 No.45248481
Well, hopefully the author learned something or at least enjoyed the process :)

(The code looks like it was written by a very junior dev or a non-dev, tbh.)