Learning in multilayered networks used as autoassociators
Gradient descent learning algorithms may get stuck in local minima, making the learned solution suboptimal. In this paper, we focus on multilayered networks used as autoassociators and show how they relate to classical linear autoassociators. In addition, using the theoretical framework of our previous research, we derive a condition that is met at the end of the learning process and show that this condition has an intriguing geometrical meaning in the pattern space.
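The link to linear autoassociators can be made concrete with a small numerical sketch. The code below (an illustration, not the paper's own construction) trains a two-layer *linear* autoassociator by gradient descent on the squared reconstruction error; for this linear case, Baldi and Hornik's classical result says the error surface has no suboptimal local minima, so gradient descent reaches the optimal rank-k reconstruction given by the top principal components. All variable names (`W1`, `W2`, learning rate, dimensions) are illustrative choices.

```python
import numpy as np

# Linear autoassociator x -> W2 @ W1 @ x, trained by plain gradient
# descent on mean squared reconstruction error. Illustrative sketch only.

rng = np.random.default_rng(0)
n, d, k = 200, 5, 2                       # samples, input dim, bottleneck dim
X = rng.normal(size=(n, d)) @ np.diag([3.0, 2.0, 0.5, 0.3, 0.1])
X -= X.mean(axis=0)                       # centre the data

W1 = 0.1 * rng.normal(size=(k, d))        # encoder weights
W2 = 0.1 * rng.normal(size=(d, k))        # decoder weights
lr = 1e-3
for _ in range(5000):
    H = X @ W1.T                          # hidden codes, shape (n, k)
    E = H @ W2.T - X                      # reconstruction residuals, (n, d)
    gW2 = E.T @ H / n                     # gradient w.r.t. decoder
    gW1 = W2.T @ E.T @ X / n              # gradient w.r.t. encoder
    W2 -= lr * gW2
    W1 -= lr * gW1

# Optimal rank-k reconstruction error, from the top-k principal directions.
U, S, Vt = np.linalg.svd(X, full_matrices=False)
pca_err = ((X - X @ Vt[:k].T @ Vt[:k]) ** 2).mean()
gd_err = ((X @ W1.T @ W2.T - X) ** 2).mean()
print(f"gradient descent MSE: {gd_err:.4f}, PCA rank-{k} MSE: {pca_err:.4f}")
```

With the hidden layer linear, the two errors essentially coincide; it is the nonlinear multilayered case studied in the paper where the local-minima question becomes subtle.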
References

[1] M. Gori and A. Tesi, "On the Problem of Local Minima in Backpropagation," IEEE Trans. Pattern Anal. Mach. Intell., 1992.
[2] X.-H. Yu and G.-A. Chen, "Can backpropagation error surface not have local minima," IEEE Trans. Neural Networks, 1992.
[3] P. Baldi and K. Hornik, "Neural networks and principal component analysis: Learning from examples without local minima," Neural Networks, 1989.