Pattern-recognition by an artificial network derived from biologic neuronal systems

A novel artificial neural network, derived from neurobiological observations, is described and examples of its performance are presented. This DYnamically STable Associative Learning (DYSTAL) network associatively learns both correlations and anticorrelations, and can be configured to classify or restore patterns with only a change in the number of output units. DYSTAL exhibits several particularly desirable properties: computational effort scales linearly with the number of connections, i.e., it is O(N) in complexity; network performance is stable over wide ranges of parameter values and over the size of the input field; a very large number of patterns can be stored; patterns need not be orthogonal; network connections are not restricted to multi-layer feed-forward or any other specific structure; and, for a known set of deterministic input patterns, the network weights can be computed a priori in closed form. The network has been associatively trained to perform the XOR function as well as other classification tasks, and to restore patterns obscured by binary or analog noise. Neither global nor local feedback connections are required during learning; hence the network is particularly suitable for hardware (VLSI) implementation.
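To make the closed-form-weights idea concrete, the following is a minimal illustrative sketch, not the DYSTAL algorithm itself: a classical correlation-matrix associative memory whose weights are computed a priori, in closed form, from a known set of deterministic bipolar patterns, and then used to restore a pattern obscured by binary noise. Unlike DYSTAL, this simple Hebbian scheme does require the stored patterns to be orthogonal; relaxing that restriction is one of the network's claimed advantages.

```python
import numpy as np

# Hypothetical toy example (NOT the DYSTAL architecture): closed-form
# correlation weights for a small associative memory.

patterns = np.array([
    [ 1, -1,  1, -1,  1, -1,  1, -1],
    [ 1,  1, -1, -1,  1,  1, -1, -1],
    [ 1,  1,  1,  1, -1, -1, -1, -1],
])  # three mutually orthogonal 8-unit bipolar patterns

# Closed-form weights: a sum of outer products over the known patterns,
# computed in a single pass -- cost scales with the number of connections.
W = patterns.T @ patterns

def restore(x):
    """One thresholded pass through the weights cleans up binary noise."""
    return np.sign(W @ x).astype(int)

noisy = patterns[0].copy()
noisy[0] = -noisy[0]        # flip one bit (binary noise)
print(restore(noisy))       # recovers the first stored pattern
```

Because the patterns are orthogonal, the cross-talk terms are strictly smaller than the signal term, so a single thresholded pass suffices here; no feedback connections or iterative training are involved, loosely mirroring the feed-forward, learning-without-feedback property described in the abstract.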
