The fractal geometry of backpropagation

It is empirically known that standard backpropagation is very sensitive to the initial learning rate chosen for a given learning task. In this paper the author examines the shape of the iteration path when a linear associator is trained with two variants of backpropagation: backpropagation with momentum and online backpropagation. In the first case the path resembles Lissajous figures; in the second it is a fractal in weight space. The specific form depends on the learning rate chosen, but there is a threshold value for which the attractor of the iteration path is dense in a region of weight space around a local minimum of the error function. This result also yields a deeper insight into the mechanics of the iteration process in the nonlinear case.
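The qualitative behavior described for the momentum case can be illustrated with a minimal sketch (not the paper's exact experiment): for a linear associator the error surface is quadratic, so gradient descent with momentum is a linear iteration whose eigenvalues determine whether the weight-space path spirals into the minimum, traces closed Lissajous-like curves, or diverges. The Hessian, learning rate, and momentum coefficient below are illustrative assumptions.

```python
import numpy as np

# Assumed setup: quadratic error E(w) = 0.5 * w^T A w of a linear associator.
# Gradient descent with momentum:
#   v <- mu * v - eta * A w ;  w <- w + v
# This is a linear map of (w, v); per eigenvalue lambda of A, the iteration is
# stable iff 0 < eta * lambda < 2 * (1 + mu).

A = np.diag([1.0, 3.0])   # assumed positive-definite Hessian of the error
eta, mu = 0.1, 0.9        # assumed learning rate and momentum coefficient

w = np.array([1.0, 1.0])  # initial weights
v = np.zeros(2)           # initial velocity
path = [w.copy()]
for _ in range(500):
    v = mu * v - eta * (A @ w)
    w = w + v
    path.append(w.copy())
path = np.array(path)

# With these values both eigenmodes satisfy the stability condition, so the
# trajectory spirals toward the minimum at the origin; raising eta past the
# threshold 2 * (1 + mu) / lambda_max makes the fastest mode diverge instead.
print(np.linalg.norm(path[-1]))  # distance to the minimum after 500 steps
```

Plotting `path[:, 0]` against `path[:, 1]` shows the oscillatory, Lissajous-like spiral in weight space; sweeping `eta` toward the stability threshold illustrates the sensitivity to the learning rate that the abstract describes.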
