Learning internal representations in an attractor neural network with analogue neurons

A learning attractor neural network (LANN) with a double dynamics of neural activities and synaptic efficacies, operating on two different timescales, is studied by simulations in preparation for an electronic implementation. The present network includes several quasi-realistic features: neurons are represented by their afferent currents and output spike rates; excitatory and inhibitory neurons are separated; attractor spike rates as well as coding levels in arriving stimuli are low; learning takes place only between excitatory units. Synaptic dynamics is an unsupervised, analogue Hebbian process, but long-term memory in the absence of neural activity is maintained by a refresh mechanism which on long timescales discretizes the synaptic values, converting learning into an asynchronous stochastic process induced by the stimuli on the synaptic efficacies. This network is intended to learn a set of attractors from the statistics of freely arriving stimuli, which are represented by external synaptic inputs injected into the excitatory neurons. In the simulations, different types of sequences of many thousands of stimuli are presented to the network, without distinguishing in the dynamics a learning phase from retrieval. Stimulus sequences differ in pre-assigned global statistics (including time-dependent statistics); in orders of presentation of individual stimuli within a given statistics; in lengths of time intervals for each presentation; and in the intervals separating one stimulus from another. We find that the network effectively learns a set of attractors representing the statistics of the stimuli, and is able to modify its attractors when the input statistics change. Moreover, as the global input statistics changes, the network can also forget attractors related to stimulus classes no longer presented. Forgetting takes place only due to the arrival of new stimuli. The performance of the network and the statistics of the attractors are studied as a function of the input statistics.
Most of the long-timescale characteristics of the learning dynamics can be captured theoretically. This model modifies a previous implementation of a LANN composed of discrete neurons into a network of more realistic neurons. The different elements have been designed to facilitate their implementation in silicon.
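The two-timescale synaptic mechanism described above can be illustrated with a minimal sketch: a fast analogue Hebbian update driven by pre/post activity, followed by a slow refresh that pulls each efficacy to one of two stable values. This is only an illustrative toy, not the paper's actual equations; the function names, learning rate, threshold, and the deterministic refresh are assumptions (in the actual model the refresh makes synaptic transitions stochastic, driven by the arriving stimuli).

```python
import numpy as np

def hebbian_step(J, pre, post, eta=0.05):
    """One fast analogue Hebbian update from binary pre/post activity.

    Coactive pairs are potentiated; pairs where only one neuron is
    active are depressed. Efficacies stay clipped to [0, 1].
    (eta and the depression factor are illustrative choices.)
    """
    dJ = eta * (np.outer(post, pre)
                - 0.5 * (np.outer(post, 1 - pre) + np.outer(1 - post, pre)))
    return np.clip(J + dJ, 0.0, 1.0)

def refresh(J, theta=0.5):
    """Slow refresh: snap each analogue efficacy to the nearer of two
    stable values, discretizing the synapse on long timescales.
    (Deterministic here; the paper's mechanism is stochastic.)"""
    return np.where(J > theta, 1.0, 0.0)

# Presenting a stimulus and then refreshing leaves only the synapses
# between coactive excitatory units in the potentiated state.
J = np.full((4, 4), 0.5)          # analogue efficacies at the midpoint
x = np.array([1, 0, 1, 0])        # a low-coding-level binary stimulus
J = refresh(hebbian_step(J, x, x))
```

After the refresh, `J[i, j]` is 1.0 exactly where neurons `i` and `j` were both active in the stimulus, and 0.0 elsewhere, which is the sense in which the slow dynamics converts analogue Hebbian traces into discrete long-term memory.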
