Oscillating iteration paths in neural networks learning

Abstract

In this paper we show that finding optimal combinations of learning rate and momentum rate for the standard backpropagation algorithm used to train neural networks involves difficult trade-offs. Gradient descent can be accelerated with a larger step size and a larger momentum rate, but certain combinations of these parameters compromise the stability of the iteration. We show in which cases backpropagation produces oscillatory behavior and how its simple feedback nature can lead to chaotic iterations. Several graphics illustrate the kinds of problems that arise when applying backpropagation and that are often disregarded.
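To make the trade-off concrete, here is a minimal sketch (an illustration added for this edit, not the paper's own experiments) of gradient descent with momentum on a one-dimensional quadratic loss; the names eta, mu, and lam are hypothetical. For f(w) = lam * w**2 / 2 the iteration is linear, and the standard stability result is that it converges only when 0 < eta*lam < 2*(1 + mu) and 0 <= mu < 1; outside that region the iterates oscillate with growing amplitude.

```python
# Sketch: heavy-ball / momentum gradient descent on the 1-D quadratic
# loss f(w) = 0.5 * lam * w**2, whose gradient is lam * w.
# eta (learning rate), mu (momentum rate), and lam are illustrative names.
def momentum_descent(eta, mu, lam=1.0, w0=1.0, steps=6):
    w, v = w0, 0.0
    path = [w]
    for _ in range(steps):
        v = mu * v - eta * lam * w  # velocity: momentum term plus gradient step
        w = w + v                   # parameter update
        path.append(round(w, 3))
    return path

# Smooth convergence: small step size, moderate momentum.
print(momentum_descent(eta=0.1, mu=0.5))  # [1.0, 0.9, 0.76, 0.614, ...]

# Oscillatory convergence: eta * lam > 1 overshoots the minimum each step,
# so the iterates alternate in sign while shrinking in magnitude.
print(momentum_descent(eta=1.5, mu=0.3))  # [1.0, -0.5, -0.2, 0.19, ...]

# Divergence: eta * lam > 2 * (1 + mu) pushes an eigenvalue of the 2x2
# iteration matrix outside the unit circle; the oscillation grows.
print(momentum_descent(eta=4.0, mu=0.9))  # [1.0, -3.0, 5.4, -8.64, ...]
```

The scalar quadratic is the simplest model of a loss surface near a minimum: in a full backpropagation run, each eigendirection of the local Hessian behaves approximately like this one-dimensional case, which is why particular learning-rate/momentum combinations trigger the oscillations discussed in the abstract.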
