A new algorithm for training multilayer perceptrons

A fast version of the backpropagation algorithm based on the recursive least squares (RLS) technique is introduced. The additional storage and computation associated with RLS can be easily incorporated into the network architecture, so the algorithm remains local. The enhanced algorithm performs consistently better than the standard backpropagation algorithm in a set of simulations involving two benchmark problems.
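
The abstract does not specify the update equations, but the core RLS recursion it builds on is standard. The following is a minimal sketch of that recursion for a single linear neuron (not the paper's per-layer training scheme); the function name `rls_fit` and the parameter choices are illustrative assumptions.

```python
import numpy as np

def rls_fit(X, d, lam=1.0, delta=100.0):
    """Hypothetical sketch of the standard RLS weight update for one
    linear neuron; the paper embeds such an update inside
    backpropagation for multilayer perceptrons."""
    n = X.shape[1]
    w = np.zeros(n)
    P = np.eye(n) * delta          # inverse-correlation matrix estimate
    for x, target in zip(X, d):
        e = target - w @ x         # a-priori prediction error
        Px = P @ x
        k = Px / (lam + x @ Px)    # RLS gain vector
        w = w + k * e              # weight update
        P = (P - np.outer(k, Px)) / lam
    return w

# Illustration: recover a known linear map from noiseless samples
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
w_true = np.array([1.0, -2.0, 0.5])
w_est = rls_fit(X, X @ w_true)
```

Because RLS uses second-order (correlation) information via `P`, it typically converges in far fewer presentations than plain gradient descent, at the cost of storing one matrix per neuron's input dimension.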