Mutual information of sparsely coded associative memory with self-control and ternary neurons

The influence of a macroscopic, time-dependent threshold on the retrieval dynamics of attractor associative memory models with ternary neurons {-1, 0, +1} is examined. If this threshold is chosen appropriately as a function of the cross-talk noise and of the activity of the memorized patterns, adapting itself during the time evolution, it guarantees autonomous functioning of the model. In the limit of sparse coding in particular, this self-control mechanism is found to considerably improve the quality of fixed-point retrieval, notably the storage capacity, the basins of attraction and the information content. The mutual information is shown to be the relevant quantity for studying the retrieval quality of such sparsely coded models. Numerical results confirm these findings.
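As an illustration of the self-control mechanism described above, a minimal sketch of the adaptive threshold in the sparse-coding limit is the following (this explicit expression is an assumption drawn from the self-control literature on sparsely coded networks, not stated in the abstract itself):

  \theta_t = c(a)\,\sqrt{D_t}, \qquad c(a) = \sqrt{-2\ln a},

where D_t denotes the variance of the cross-talk noise at time step t and a is the activity of the memorized patterns. Since both quantities are macroscopic, the threshold can be updated autonomously in the course of retrieval, without any external tuning of parameters.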
