
235 points by tosh | 1 comment
xanderlewis No.40214349
> Stripped of anything else, neural networks are compositions of differentiable primitives

I’m a sucker for statements like this. It almost feels philosophical, and makes the whole subject so much more comprehensible in only a single sentence.

I think François Chollet says something similar in his book on deep learning: one shouldn’t fall into the trap of anthropomorphising and mystifying models because of the ‘neural’ name; deep learning is simply the application of sequences of operations that are nonlinear (and hence capable of approximating arbitrarily complex functions) but nonetheless differentiable, and so efficiently optimisable by gradient descent.
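
To make that concrete, here’s a toy sketch of my own (nothing from the article): a two-layer network written out as exactly such a composition of differentiable primitives, with the chain rule applied by hand and plain gradient descent doing the optimising. It learns XOR, the classic example of a problem that needs the nonlinearity.

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy data: XOR, which no purely linear model can fit.
    X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
    y = np.array([[0.], [1.], [1.], [0.]])

    # Parameters of two affine primitives.
    W1, b1 = rng.normal(size=(2, 8)), np.zeros(8)
    W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    lr = 0.5
    for step in range(5000):
        # Forward: affine -> tanh -> affine -> sigmoid. Every step is differentiable.
        z1 = X @ W1 + b1
        h = np.tanh(z1)
        z2 = h @ W2 + b2
        p = sigmoid(z2)
        loss = np.mean((p - y) ** 2)

        # Backward: the chain rule walked through the same composition in reverse.
        dp  = 2.0 * (p - y) / len(X)   # d(loss)/d(p)
        dz2 = dp * p * (1.0 - p)       # sigmoid'
        dW2, db2 = h.T @ dz2, dz2.sum(axis=0)
        dh  = dz2 @ W2.T
        dz1 = dh * (1.0 - h ** 2)      # tanh'
        dW1, db1 = X.T @ dz1, dz1.sum(axis=0)

        # Gradient descent. Differentiability of each primitive is what makes
        # this efficient; the nonlinearity (tanh) is what makes XOR learnable at all.
        W1 -= lr * dW1; b1 -= lr * db1
        W2 -= lr * dW2; b2 -= lr * db2

    print(np.round(p.ravel(), 2))  # should head towards [0, 1, 1, 0]

Swap the hand-written backward pass for autograd and scale it up, and you have most of a deep learning framework.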

jonas21 No.40219489
I feel like this statement is both obvious after spending a few minutes working with neural networks and completely useless in helping you build better neural networks.

It's kind of like saying, "Stripped of anything else, works of literature are compositions of words"

Horffupolde No.40222940
Well, I’d argue that could also be a bit enlightening. It’s like taking a moment to appreciate that a forest is composed of individual trees. It takes a certain level of insight to appreciate systems at various depths.