
295 points | rttti | 1 comment | source
CGMthrowaway ◴[] No.45119871[source]
Honest feedback - I was really excited when I read the opening. However, I did not come away from this with a greater understanding than I already had.

For reference, my initial understanding was fairly basic: I know a) roughly what an embedding is, b) that transformers work by matrix multiplication, and c) that the whole thing is something like a multi-threaded Markov chain generator with the benefit of pre-trained embeddings.
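The "matrix multiplication" picture in (b) can be made concrete with a toy sketch. To be clear, this is purely illustrative and not a real transformer (no attention, no nonlinearities, no training); the vocabulary, dimensions, and numbers are all invented for the example:

```python
# Toy sketch: next-token scoring as an embedding lookup followed by a
# matrix multiplication. Everything here is made up for illustration;
# a real transformer stacks attention and MLP layers between the two steps.
vocab = ["the", "cat", "sat", "mat"]

# One 3-d embedding vector per vocabulary token (invented values).
embeddings = [
    [1.0, 0.0, 0.2],  # the
    [0.1, 1.0, 0.0],  # cat
    [0.0, 0.2, 1.0],  # sat
    [0.5, 0.5, 0.5],  # mat
]

def matvec(matrix, vec):
    """Multiply a matrix (list of rows) by a vector: one dot product per row."""
    return [sum(w * x for w, x in zip(row, vec)) for row in matrix]

def next_token_scores(token):
    """Look up the token's embedding, then matrix-multiply to score every
    vocabulary entry as a candidate next token (tied input/output weights)."""
    h = embeddings[vocab.index(token)]
    return matvec(embeddings, h)

scores = next_token_scores("cat")
best = vocab[max(range(len(scores)), key=scores.__getitem__)]
```

With tied weights and no layers in between, the highest score is just the most similar embedding, which is where the Markov-chain comparison in (c) comes from: each step maps the current context to a distribution over next tokens.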

replies(8): >>45120114 #>>45120200 #>>45122565 #>>45123711 #>>45125243 #>>45128482 #>>45129469 #>>45134872 #
quitit ◴[] No.45125243[source]
I had a similar feeling. I think a little magic was lost by the author trying to be as concise as possible, which is no real fault of their own, as this topic can go down the rabbit hole very quickly.

Instead, I believe this might work better as a guided exercise that a person can work through over a few hours, rather than being spoon-fed it over the 10-minute reading time. Or the steps could be broken up into "interactive" sections that more clearly demarcate the stages.

Regardless, I'm very supportive of people making efforts to simplify this topic; each attempt gives me something that I had either forgotten or neglected.

replies(1): >>45134880 #
1. rttti ◴[] No.45134880[source]
Thanks a lot for your feedback. I like your idea. It matches the pattern that you learn best what you try and experience yourself.