Learning and memory properties in fully connected networks

This paper summarises recent results of theoretical analysis and numerical simulation in fully connected networks of the Little-Hopfield class. The theoretical analysis is based on methods of statistical mechanics as applied to spin-glass problems, and the numerical work involves massively parallel simulations on the ICL Distributed Array Processor (DAP). Specific applications include: (i) exact results for the fraction of nominal vectors which are perfectly stored by the usual Hebbian rule; (ii) a numerical estimate of the position of the second phase transition in the Hopfield model, at which there is effectively total loss of memory capacity; (iii) a numerical study of the nature of the spurious states in the model; (iv) an exploration of the performance of a learning algorithm, including the exact storage of up to 512 (random) nominal vectors in a 512-node model; (v) a theoretical study of the phase transitions in generalizations where the energy function is a monomial in the state vectors.
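To make point (i) concrete, the sketch below builds the Hebbian (outer-product) coupling matrix for a toy Hopfield network and checks which nominal vectors are perfectly stored, i.e. are fixed points of the sign-of-local-field update. This is a minimal illustration under stated assumptions, not the paper's DAP implementation; the function names and the two orthogonal toy patterns are illustrative choices.

```python
import numpy as np

def hebb_weights(patterns):
    """Hebbian outer-product rule, with self-couplings set to zero."""
    N = patterns.shape[1]
    W = patterns.T @ patterns / N
    np.fill_diagonal(W, 0.0)
    return W

def is_perfectly_stored(W, xi):
    """A nominal vector is perfectly stored if every spin is stable:
    sign of the local field W @ xi reproduces xi exactly."""
    return np.array_equal(np.sign(W @ xi), xi)

# Two mutually orthogonal +/-1 nominal vectors on an 8-node network
patterns = np.array([
    [ 1,  1,  1,  1,  1,  1,  1,  1],
    [ 1,  1,  1,  1, -1, -1, -1, -1],
])
W = hebb_weights(patterns)
fraction_stored = sum(is_perfectly_stored(W, xi) for xi in patterns) / len(patterns)
print(fraction_stored)  # → 1.0 for these orthogonal patterns
```

For orthogonal patterns the local field at site i works out to ((N - p) / N) times the stored spin, so both vectors are exact fixed points; with many random patterns, crosstalk between vectors destabilises some spins, which is what the exact fraction in (i) quantifies.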