
235 points | tosh | 1 comment
xanderlewis No.40214349
> Stripped of anything else, neural networks are compositions of differentiable primitives

I’m a sucker for statements like this. It almost feels philosophical, and makes the whole subject so much more comprehensible in only a single sentence.

I think François Chollet says something similar in his book on deep learning: one shouldn’t fall into the trap of anthropomorphising and mysticising models based on the ‘neural’ name; deep learning is simply the application of sequences of operations that are nonlinear (and hence capable of encoding arbitrary complexity) but nonetheless differentiable and so efficiently optimisable.
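The "composition of differentiable primitives" view can be made concrete in a few lines. The sketch below is a made-up two-layer example (shapes, data, and step size are arbitrary, not from Chollet's book): it composes a linear map, a tanh, and another linear map, then backpropagates through each primitive by hand via the chain rule.

```python
import numpy as np

# A network written explicitly as a composition of differentiable primitives:
# f(x) = W2 @ tanh(W1 @ x). Each primitive (matrix multiply, tanh) has a
# known derivative, so the chain rule yields the gradient of the loss with
# respect to every parameter.

rng = np.random.default_rng(0)
W1 = rng.normal(size=(3, 2))
W2 = rng.normal(size=(1, 3))
x = np.array([0.5, -1.0])
target = np.array([2.0])

# Forward pass: apply the primitives in sequence.
h = W1 @ x          # linear primitive
a = np.tanh(h)      # nonlinear (but differentiable) primitive
y = W2 @ a          # linear primitive
loss = 0.5 * np.sum((y - target) ** 2)

# Backward pass: apply the chain rule through each primitive in reverse.
dy = y - target                 # dloss/dy
dW2 = np.outer(dy, a)           # dloss/dW2
da = W2.T @ dy                  # dloss/da
dh = da * (1 - a ** 2)          # tanh'(h) = 1 - tanh(h)^2
dW1 = np.outer(dh, x)           # dloss/dW1

# One small gradient-descent step on both parameter matrices.
lr = 0.01
W1 -= lr * dW1
W2 -= lr * dW2
```

Because every primitive is differentiable, the same mechanical chain-rule pass works however many primitives are stacked — which is all "deep" really adds.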

phkahler No.40215628
>> one shouldn’t fall into the trap of anthropomorphising and mysticising models based on the ‘neural’ name

And yet, artificial neural networks ARE an approximation of how biological neurons work. It is worth noting that they came out of neurobiology rather than some math department - at least the forward pass did; I'm not sure who came up with the training algorithms (probably the math folks). Should they be considered mystical? No. But I would also posit that biological neurons are more efficient and probably have better learning algorithms than today's artificial ones.
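The approximation being described is easy to state in code. In this minimal sketch (weights, inputs, and bias are arbitrary illustrative numbers), a single artificial neuron weights its inputs like synaptic strengths, sums them like dendritic integration, and applies a nonlinearity like a firing response:

```python
import math

# One artificial "neuron": the classic crude approximation of a biological one.
# Inputs are scaled by weights (synaptic strengths), summed (dendritic
# integration), and squashed through a sigmoid (a smooth stand-in for a
# firing rate). All values below are made up for illustration.

def neuron(inputs, weights, bias):
    total = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-total))  # sigmoid "firing rate" in (0, 1)

activation = neuron([1.0, 0.0, -1.0], [0.8, -0.2, 0.5], bias=0.1)
```

Everything past this point — stacking such units in layers, training them by gradient descent — is where the biological analogy starts to thin out.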

I'm confused as to why some people seem to shun the biological equivalence of these things. In a recent thread here I learned that physical synaptic weights (in our brains) are at least partly stored in DNA or its methylation. If that isn't fascinating, I'm not sure what is. Or is the point more that intelligence can be reduced to a large number of simple operations, and biology has merely given us one interesting physical implementation?

chriswarbo No.40221474
> And yet, artificial neural networks ARE an approximation of how biological neurons work.

Only if you limit yourself to "sums of weighted inputs, sent through a 1D activation function".

However, the parent said "differentiable primitives": these days people have built networks that contain differentiable ray-tracers, differentiable physics simulations, etc. Those seem like crazy ideas if we limit ourselves to the "neural" analogy, but they are quite natural from a "composition of differentiable primitives" point of view.
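A toy example of such a non-neural differentiable primitive (the scenario and numbers are invented for illustration, not taken from any particular library): a one-line "physics simulation" whose unknown parameter can be fit by gradient descent exactly as a network weight would be.

```python
# A differentiable "physics" primitive: distance fallen under constant
# acceleration, d(t) = 0.5 * g * t^2. Because the simulation is an ordinary
# differentiable function of g, the unknown parameter can be recovered by
# backpropagating a squared error through it.

def simulate(g, t):
    return 0.5 * g * t * t  # distance fallen after time t

observed = simulate(9.81, 2.0)   # pretend "measured" fall distance
g = 5.0                          # initial guess for gravity
lr = 0.05
for _ in range(200):
    pred = simulate(g, 2.0)
    # Analytic gradient of the squared error through the simulation:
    # d/dg [(pred - observed)^2] = 2 * (pred - observed) * 0.5 * t^2
    grad = 2.0 * (pred - observed) * 0.5 * 2.0 * 2.0
    g -= lr * grad
```

Nothing here looks like a neuron, yet it drops straight into any autodiff framework as one more primitive in the composition — which is exactly the generalisation the parent comment is pointing at.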