The capacity of the Hopfield associative memory

Techniques from coding theory are applied to rigorously study the capacity of the Hopfield associative memory. Such a memory stores n-tuples of ±1's. Each component changes according to a hard-limited linear function of all the other components. With symmetric connections between components, a stable state is ultimately reached. By building up the connection matrix as a sum of outer products of m fundamental memories, one hopes to recover a given one of the m memories from an initial n-tuple probe vector lying within Hamming distance n/2 of that fundamental memory. If the m fundamental memories are chosen at random, the maximum asymptotic value of m for which most of the m original memories are exactly recoverable is n/(2 log n). With the added restriction that every one of the m fundamental memories be exactly recoverable, m can be no more than n/(4 log n) asymptotically as n approaches infinity. Extensions are also considered, in particular to the capacity under quantization of the outer-product connection matrix; this quantized-memory capacity problem is closely related to the capacity of the quantized Gaussian channel.
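The storage and retrieval scheme the abstract describes can be sketched in a few lines of Python. This is an illustrative toy, not the paper's analysis: the parameters n = 100 and m = 3 (chosen here so that m sits well below n/(2 log n)), the random seed, and the number of corrupted probe bits are all assumptions made for the demonstration.

```python
import random

random.seed(1)
n, m = 100, 3  # pattern length n, number of fundamental memories m (well below n/(2 log n))
memories = [[random.choice([-1, 1]) for _ in range(n)] for _ in range(m)]

# Sum-of-outer-products connection matrix with zero diagonal:
# T[i][j] = sum over the m memories x of x[i] * x[j], for i != j.
T = [[0 if i == j else sum(x[i] * x[j] for x in memories) for j in range(n)]
     for i in range(n)]

def recall(probe, max_sweeps=20):
    """Repeated hard-limited updates: each component is set to the sign of a
    linear function of all the others. With a symmetric, zero-diagonal T the
    dynamics settle into a stable state, as the abstract notes."""
    x = list(probe)
    for _ in range(max_sweeps):
        changed = False
        for i in range(n):
            s = 1 if sum(T[i][j] * x[j] for j in range(n)) >= 0 else -1
            if s != x[i]:
                x[i] = s
                changed = True
        if not changed:  # reached a stable state
            break
    return x

# Probe with a corrupted copy of the first memory
# (Hamming distance 5, far less than n/2).
probe = list(memories[0])
for i in range(5):
    probe[i] = -probe[i]
recovered = recall(probe)
```

With m this small relative to n, the crosstalk from the other stored memories is weak and `recovered` matches `memories[0]` exactly; pushing m toward and past the n/(2 log n) threshold makes exact recovery increasingly unlikely, which is the regime the paper characterizes.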
