Generalization of back-propagation to recurrent neural networks.
An adaptive neural network with asymmetric connections is introduced. This network is related to the Hopfield network with graded neurons and uses a recurrent generalization of the δ rule of Rumelhart, Hinton, and Williams to adaptively modify the synaptic weights. The new network resembles the master/slave network of Lapedes and Farber but is architecturally simpler.
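The recurrent generalization of the δ rule described above can be sketched as follows. This is a minimal illustration in the style of fixed-point recurrent backpropagation, not the paper's exact formulation: it assumes tanh units, a small fully recurrent net with clamped external input, and designated output units (the network size, learning rate, and indices are all illustrative). The forward dynamics are relaxed to a fixed point, an adjoint (error) system is solved for the recurrent error signal, and the weights are updated by a δ-rule-like product of error and activity.

```python
import numpy as np

# Hedged sketch of learning in a recurrent net via a fixed point of
# dx/dt = -x + tanh(W x + I). All sizes and constants are assumptions.
rng = np.random.default_rng(0)
n = 5                          # number of units (illustrative)
out = [3, 4]                   # indices treated as output units (assumption)
W = 0.1 * rng.standard_normal((n, n))   # asymmetric recurrent weights
I = rng.standard_normal(n)              # clamped external input
target = np.array([0.3, -0.2])          # desired output activities

def relax(W, I, steps=200, dt=0.1):
    """Iterate the graded-neuron dynamics to (approximate) fixed point."""
    x = np.zeros(n)
    for _ in range(steps):
        x += dt * (-x + np.tanh(W @ x + I))
    return x

eta = 0.2
losses = []
for _ in range(300):
    x = relax(W, I)
    u = W @ x + I
    d = 1.0 - np.tanh(u) ** 2          # g'(u) for g = tanh
    e = np.zeros(n)
    e[out] = target - x[out]           # error defined on output units only
    losses.append(0.5 * float(e @ e))
    # Adjoint fixed point z = e + W^T (g'(u) * z); solved directly here,
    # though it can equally be obtained by relaxing a second dynamical
    # system with the transposed weights.
    z = np.linalg.solve(np.eye(n) - W.T * d[np.newaxis, :], e)
    # delta-rule-like update: dW_ij = eta * g'(u_i) * z_i * x_j
    W += eta * np.outer(d * z, x)
```

After training, the fixed point of the relaxed dynamics moves toward the target pattern on the output units; the update is a gradient step on the squared output error under the fixed-point assumption.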
[1] S. Amari et al., "Characteristics of Random Nets of Analog Neuron-Like Elements," IEEE Trans. Syst. Man Cybern., 1972.
[2] J. J. Hopfield et al., "Neurons with graded response have collective computational properties like those of two-state neurons," Proceedings of the National Academy of Sciences of the United States of America, 1984.
[3] J. L. McClelland et al., Parallel Distributed Processing: Explorations in the Microstructure of Cognition, Vol. 1: Foundations, 1986.