
156 points | cpldcpu | 1 comment
magicalhippo No.41892062
Fun to see neural nets pushed to such extremes, really enjoyed the post.

> The smallest models had to be trained without data augmentation, as they would not converge otherwise.

Was this also the case for the 2-bit model you ended up with?

replies(1): >>41893238 #
1. cpldcpu No.41893238
Yes, as far as I remember the limit was somewhere around 1 kbyte total parameter size.
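
As a rough sanity check (an assumption on my part, not stated in the thread): if weights are packed at 2 bits each with no per-layer overhead, a 1 kbyte budget corresponds to about 4096 parameters:

```python
# Rough capacity estimate for a 2-bit quantized model under a ~1 KiB budget.
# Assumes weights are packed 4 per byte with no metadata overhead.

BITS_PER_WEIGHT = 2
BUDGET_BYTES = 1024

max_params = BUDGET_BYTES * 8 // BITS_PER_WEIGHT
print(max_params)  # -> 4096 weights fit in 1 KiB at 2 bits each
```

That is a tiny parameter count by any standard, which is consistent with such models struggling to converge when data augmentation adds variance to the training set.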