A new algorithm for training multilayer perceptrons
A fast version of the backpropagation algorithm based on the recursive least squares (RLS) technique is introduced. The added storage and computation associated with RLS can be easily incorporated into the network architecture, so the algorithm remains local. The enhanced algorithm performs consistently better than the backpropagation algorithm in a set of simulations involving two benchmark problems.
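To make the core idea concrete, here is a hedged sketch of the per-step RLS update for a single linear neuron, the building block such algorithms apply at each layer. The function and variable names (`rls_step`, `w`, `P`, `lam`) are illustrative assumptions, not the paper's notation, and the example fits a simple linear target rather than a full multilayer perceptron.

```python
# Sketch: one recursive least squares (RLS) update step for a single
# linear neuron. All names are illustrative, not the paper's notation.

def rls_step(w, P, x, d, lam=0.99):
    """One RLS update.
    w: weight vector, P: inverse input-correlation matrix,
    x: input vector, d: desired output, lam: forgetting factor."""
    n = len(x)
    # Px = P x
    Px = [sum(P[i][j] * x[j] for j in range(n)) for i in range(n)]
    denom = lam + sum(x[i] * Px[i] for i in range(n))
    k = [Px[i] / denom for i in range(n)]           # gain vector
    e = d - sum(w[i] * x[i] for i in range(n))      # a priori error
    w = [w[i] + k[i] * e for i in range(n)]         # weight update
    # P <- (P - k (x^T P)) / lam
    xP = [sum(x[i] * P[i][j] for i in range(n)) for j in range(n)]
    P = [[(P[i][j] - k[i] * xP[j]) / lam for j in range(n)]
         for i in range(n)]
    return w, P

# Usage: recover y = 2*x1 - x2 from a few repeated samples.
w = [0.0, 0.0]
P = [[100.0, 0.0], [0.0, 100.0]]    # large initial P ~ weak prior
data = [([1.0, 0.0], 2.0), ([0.0, 1.0], -1.0),
        ([1.0, 1.0], 1.0), ([2.0, 1.0], 3.0)]
for _ in range(5):
    for x, d in data:
        w, P = rls_step(w, P, x, d)
```

The extra state per neuron is the matrix `P`, which is exactly the "added storage requirement" the abstract refers to; because `P` depends only on that neuron's own inputs, the update stays local.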