MEKA: a fast, local algorithm for training feedforward neural networks

Training feedforward networks with the conventional backpropagation algorithm is plagued by poor convergence and misadjustment. The authors introduce the multiple extended Kalman algorithm (MEKA) for training feedforward networks. It is based on the idea of partitioning the global problem of finding the weights into a set of manageable nonlinear subproblems, so the algorithm is local at the neuron level. On two benchmark problems, MEKA is shown to outperform the global extended Kalman algorithm in both convergence and quality of the solution obtained. This superior performance is attributed to the nonlinear localized approach: the nonconvex nature of the local performance surface reduces the chance of becoming trapped in a local minimum.
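The core idea of running one extended Kalman filter per neuron, rather than one global filter over all network weights, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the class name `NeuronEKF`, the initial covariance `p0`, and the noise variance `r` are all assumptions for the sketch, and the local target `d` for each neuron (which the full algorithm would derive from back-propagated error) is taken as given.

```python
import numpy as np

def sigmoid(v):
    return 1.0 / (1.0 + np.exp(-v))

class NeuronEKF:
    """Extended Kalman filter over the weights of a single sigmoidal neuron.

    A localized scheme in the spirit of MEKA keeps one such filter per
    neuron, replacing one large global filter with many small ones.
    """

    def __init__(self, n_inputs, p0=10.0, r=1.0):
        self.w = np.zeros(n_inputs)      # weight estimate (filter state)
        self.P = p0 * np.eye(n_inputs)   # state error covariance
        self.r = r                       # assumed measurement noise variance

    def update(self, x, d):
        """One EKF step pulling the neuron output toward local target d."""
        y = sigmoid(self.w @ x)
        h = y * (1.0 - y) * x            # Jacobian of output w.r.t. weights
        s = h @ self.P @ h + self.r      # scalar innovation variance
        k = (self.P @ h) / s             # Kalman gain
        self.w = self.w + k * (d - y)    # state (weight) update
        self.P = self.P - np.outer(k, h @ self.P)  # covariance update
        return y
```

Because each filter's state vector is only as long as one neuron's fan-in, the covariance matrices stay small, which is the source of the computational savings over a global extended Kalman filter whose state holds every weight in the network.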
