I'm kinda getting back to the place now where I want to revisit The Organization of Behavior. That's the seminal work by Hebb that introduced Hebbian Learning, and I'm on this big quest now to revisit a lot of old school approaches to learning & neural networks (in something at least approximating chronological order, although I won't be super strict about it) and code up implementations of each. So basically, some sort of Hebbian Learning system, a "McCulloch & Pitts Neuron", a Perceptron with the Perceptron Convergence Algorithm, Selfridge's Pandemonium Architecture, and so on, gradually working my way up to the current SOTA.
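To give a flavor of what those first couple of implementations might look like, here's a minimal sketch (my own toy example, not anything taken from Hebb's book): a McCulloch & Pitts-style threshold unit paired with a plain Hebbian weight update of the form Δw = η·x·y. Real versions would add something like weight decay or normalization to keep the weights from growing without bound, but this captures the core idea.

```python
import numpy as np

def mcculloch_pitts(x, w, threshold):
    """McCulloch & Pitts-style unit: fire (1) if the weighted sum of
    binary inputs meets the threshold, otherwise stay silent (0)."""
    return int(np.dot(w, x) >= threshold)

def hebbian_update(w, x, y, lr=0.1):
    """Plain Hebbian rule: strengthen each weight in proportion to the
    product of presynaptic input x and postsynaptic output y."""
    return w + lr * y * np.asarray(x, dtype=float)

# Toy run: two binary inputs, unit weights, threshold of 2 (an AND gate).
w = np.ones(2)
for x in ([0, 0], [0, 1], [1, 0], [1, 1]):
    y = mcculloch_pitts(x, w, threshold=2)
    w = hebbian_update(w, x, y)
    print(x, y, w)
```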
I'm about to finish up the Minsky & Papert Perceptrons book, and once I finish that I will probably read Volume 2 of the Parallel Distributed Processing series, then go back to Hebb.
FWIW, that Memory book was pretty fascinating. The general subject of human memory is fascinating both taken simply for its own sake and taken as inspiration for approaches to AI. I'm slightly more interested in AI than in human memory qua human memory, but in either case it's fascinating material.