
156 points cpldcpu | 2 comments
1. magicalhippo No.41892062
Fun to see neural nets pushed to such extremes, really enjoyed the post.

> The smallest models had to be trained without data augmentation, as they would not converge otherwise.

Was this also the case for the 2-bit model you ended up with?

replies(1): >>41893238 #
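For context on what the quoted line refers to: data augmentation typically means applying random label-preserving transforms (flips, crops, shifts) to training inputs. A minimal sketch of one such transform, a random horizontal flip, is below; the flip probability and the list-of-rows image format are illustrative assumptions, not details from the post.

```python
import random

# Minimal sketch of data augmentation: a random horizontal flip,
# the kind of transform the post says the smallest models had to skip.
# Image format (list of pixel rows) and p=0.5 are illustrative assumptions.
def augment(image, rng, p=0.5):
    """Flip an image (list of pixel rows) left-right with probability p."""
    if rng.random() < p:
        return [row[::-1] for row in image]
    return image

rng = random.Random(0)
img = [[1, 2, 3],
       [4, 5, 6]]
print(augment(img, rng))
```

Skipping such transforms reduces the effective diversity of the training data, which is why it is usually undesirable unless, as here, the model is too small to fit the augmented distribution.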
2. cpldcpu No.41893238
Yes, as far as I remember the limit was somewhere around 1 kB of total parameter size.
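As a back-of-envelope check on what a ~1 kB budget means at 2 bits per weight (the weight counts below are arithmetic, not figures from the post):

```python
# How many 2-bit weights fit in a ~1 kB total parameter budget.
# Both numbers are illustrative arithmetic, not values from the post.
BITS_PER_WEIGHT = 2
BUDGET_BYTES = 1024  # roughly the convergence limit mentioned above

max_weights = BUDGET_BYTES * 8 // BITS_PER_WEIGHT
print(max_weights)  # 4096
```

So the models near that limit have on the order of a few thousand weights, which helps explain why augmentation made them too hard to fit.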