We present a synaptic matrix that can efficiently store, in attractor neural networks (ANN) and perceptrons, patterns organized in uncorrelated classes. We find a storage capacity limit that increases with m, the overlap of a pattern with its class ancestor, and diverges as m→1. The probability distributions of the local stability parameters are studied, leading to a complete analysis of the performance of a perceptron with this synaptic matrix, and to a qualitative understanding of the behavior of the corresponding ANN. The analysis of the retrieval attractor of the ANN is completed via statistical mechanics. The motivation for the construction of this matrix was to make possible the study of a model of prosopagnosia, i.e. the shift from individual to class recall under lesion, i.e. a random deterioration of the synaptic efficacies. The retrieval properties of the model with the proposed synaptic matrix are studied in detail. Finally, we compare our synaptic matrix to a generic matrix whose stability parameters are all positive.
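The class structure described above can be illustrated numerically. The sketch below generates class ancestors and descendant patterns whose mean overlap with their ancestor is m, then computes the local stability parameters Δ_i^μ = ξ_i^μ (J ξ^μ)_i. All names are illustrative, and a plain Hebbian matrix is assumed for simplicity; the paper's optimized synaptic matrix is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)
N, n_classes, per_class, m = 500, 5, 10, 0.8

# Class ancestors: random +/-1 vectors.
ancestors = rng.choice([-1, 1], size=(n_classes, N))

# Descendant patterns: each bit agrees with its ancestor with
# probability (1 + m)/2, giving mean ancestor overlap m.
agree = rng.random((n_classes, per_class, N)) < (1 + m) / 2
patterns = np.where(agree, ancestors[:, None, :], -ancestors[:, None, :])

# Empirical pattern-ancestor overlaps (cluster around m).
ov = (patterns * ancestors[:, None, :]).mean(axis=2)

# Hebbian synaptic matrix (illustrative stand-in for the paper's matrix).
flat = patterns.reshape(-1, N)
J = flat.T @ flat / N
np.fill_diagonal(J, 0.0)

# Local stability parameters Delta_i^mu = xi_i^mu * (J xi^mu)_i;
# a stored bit is stable under the network dynamics iff Delta > 0.
h = flat @ J  # local fields, shape (n_patterns, N); J is symmetric
delta = flat * h

print(f"mean overlap with ancestor: {ov.mean():.3f} (target m = {m})")
print(f"fraction of stable bits:    {(delta > 0).mean():.3f}")
```

Sampling the agreement probability as (1 + m)/2 is the standard way to obtain a prescribed mean overlap; the distribution of Δ then splits into a within-class signal and a cross-class noise term, which is the quantity whose distribution the abstract refers to.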