Optimal convergence of on-line backpropagation

Many researchers remain skeptical about the actual behavior of neural network learning algorithms such as backpropagation. One major problem is the lack of clear theoretical results on optimal convergence, particularly for pattern mode (on-line) algorithms. In this paper we prove, for feedforward networks, the companion of Rosenblatt's perceptron convergence (PC) theorem (1960): pattern mode backpropagation converges to an optimal solution for linearly separable patterns.
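
As a concrete illustration of the update regime the theorem addresses, the following is a minimal Python sketch of pattern mode (on-line) gradient descent on a linearly separable problem. The single sigmoid unit, learning rate, and synthetic data are illustrative assumptions, not the network or conditions analyzed in the paper; the point is only that weights are updated after every pattern rather than after a full sweep.

```python
import numpy as np

# Pattern mode (on-line) backpropagation sketch: one weight update per
# pattern, as opposed to batch mode, which accumulates gradients over
# the whole training set before updating.
rng = np.random.default_rng(0)

# Linearly separable 2-D patterns: class is the sign of x0 + x1.
# (Synthetic data, chosen only for illustration.)
X = rng.uniform(-1.0, 1.0, size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Single sigmoid unit with bias -- the simplest on-line backprop
# instance; enough for a linearly separable problem.
w = rng.normal(scale=0.1, size=2)
b = 0.0
eta = 0.5  # learning rate (illustrative choice)

for epoch in range(100):
    for x, t in zip(X, y):          # pattern mode: update after each pattern
        out = sigmoid(x @ w + b)
        # Gradient of the squared error 0.5*(out - t)^2 w.r.t. the
        # pre-activation, via the sigmoid derivative out*(1 - out).
        grad = (out - t) * out * (1.0 - out)
        w -= eta * grad * x
        b -= eta * grad

acc = np.mean((sigmoid(X @ w + b) > 0.5) == (y > 0.5))
print(f"training accuracy: {acc:.2f}")  # approaches 1.0 for separable data
```

Batch mode would instead sum these gradients over all patterns before a single update; the convergence result stated above concerns the per-pattern regime shown here.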
