Neural Networks for Pattern Recognition
