Storage Capacity of the Hopfield Network Associative Memory

The Hopfield model is a well-known dynamic associative-memory model. In this paper, we investigate various aspects of the Hopfield model for associative memory. We conduct a systematic simulation study of several storage algorithms for Hopfield networks and conclude that storage algorithms based on perceptron learning achieve much higher storage capacity than those based on Hebbian learning.
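The two families of storage algorithms compared here can be sketched in a few lines. The following is an illustrative NumPy implementation, not the paper's exact experimental code: `hebbian_weights` is the classical outer-product rule, while `perceptron_weights` applies a row-wise perceptron update until every stored pattern is a fixed point of the network dynamics (the network size, pattern count, and learning rate are arbitrary choices for demonstration).

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 64, 20  # neurons and stored patterns (illustrative sizes)
patterns = rng.choice([-1, 1], size=(P, N))

def hebbian_weights(patterns):
    """Classical Hebbian (outer-product) storage rule with zero diagonal."""
    P, N = patterns.shape
    W = patterns.T @ patterns / N
    np.fill_diagonal(W, 0)
    return W

def perceptron_weights(patterns, epochs=100, lr=0.1):
    """Perceptron-style storage: train each neuron's incoming weights so
    that every stored pattern satisfies sign(W @ x) == x."""
    P, N = patterns.shape
    W = np.zeros((N, N))
    for _ in range(epochs):
        stable = True
        for x in patterns:
            h = W @ x
            wrong = np.sign(h) != x  # neurons whose local field disagrees
            if wrong.any():
                stable = False
                # perceptron update for the misaligned rows: W_i += lr * x_i * x
                W[wrong] += lr * np.outer(x[wrong], x)
        np.fill_diagonal(W, 0)
        if stable:
            break
    return W

def fraction_stable(W, patterns):
    """Fraction of stored patterns that are fixed points of sign(W @ x)."""
    return np.mean([np.array_equal(np.sign(W @ x), x) for x in patterns])
```

At this load (P/N ≈ 0.31, above the ≈ 0.138N Hebbian capacity but below the perceptron's theoretical limit of 2N), the perceptron rule stabilizes all stored patterns while the Hebbian rule loses most of them, which is the qualitative effect the paper's simulations quantify.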
