Another Hybrid Algorithm for Finding a Global Minimum of MLP Error Functions

This report presents Pscg, a new global optimization method for training multilayered perceptrons. Rather than settling for local minima, it seeks global minima of the error function. The method is hybrid in the sense that it combines three very different optimization techniques: Random Line Search, Scaled Conjugate Gradient, and a 1-dimensional minimization algorithm named P. The hybrid retains the best features of each component: the simplicity of Random Line Search, the efficiency of Scaled Conjugate Gradient, and the efficiency and convergence toward a global minimum of P. Pscg is empirically shown to perform better, often much better, than three other global random optimization methods and a global deterministic optimization method. The aim of this research is to provide easy-to-use learning methods for several research projects; in particular, these methods will be employed by knowledge-based systems.
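To make the composition of such a hybrid concrete, the following is a minimal sketch of one way the three roles could be organized. It is not the report's actual Pscg procedure: the function name hybrid_minimize, the Polak-Ribiere conjugate-gradient update, and the coarse grid search standing in for the 1-dimensional minimizer P are all illustrative assumptions.

```python
import numpy as np

def hybrid_minimize(f, grad, x0, n_outer=50, rng=None):
    """Hypothetical hybrid: alternate a random exploration direction with a
    conjugate-gradient direction, minimize f along each line, keep the best
    point seen so far."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    best_x, best_f = x.copy(), f(x)
    d_prev = g_prev = None
    for _ in range(n_outer):
        # Global exploration: a random unit direction (Random Line Search role).
        d_rand = rng.normal(size=x.shape)
        d_rand /= np.linalg.norm(d_rand)
        # Local exploitation: Polak-Ribiere conjugate-gradient direction
        # (standing in for the report's Scaled Conjugate Gradient component).
        g = grad(x)
        if d_prev is None:
            d_cg = -g
        else:
            beta = max(0.0, g @ (g - g_prev) / (g_prev @ g_prev + 1e-12))
            d_cg = -g + beta * d_prev
        g_prev, d_prev = g, d_cg
        for d in (d_rand, d_cg):
            # 1-D minimization along x + t*d (the role played by "P");
            # a coarse grid search stands in for it here.
            ts = np.linspace(-2.0, 2.0, 81)
            t_star = min(ts, key=lambda t: f(x + t * d))
            cand = x + t_star * d
            f_cand = f(cand)
            if f_cand < best_f:  # accept only improving moves
                x, best_x, best_f = cand, cand.copy(), f_cand
    return best_x, best_f

if __name__ == "__main__":
    # Toy multimodal surface standing in for an MLP error function.
    f = lambda w: float(np.sum(np.sin(3.0 * w) ** 2 + 0.1 * w ** 2))
    grad = lambda w: 6.0 * np.sin(3.0 * w) * np.cos(3.0 * w) + 0.2 * w
    x_min, f_min = hybrid_minimize(f, grad, np.array([2.5, -1.7]))
    print(f"minimum found at {x_min} with value {f_min:.6f}")
```

The random direction keeps the search from stalling in a local basin, while the conjugate-gradient direction gives fast descent once a good basin is found; the 1-D minimizer mediates both. In the report itself, the component P is what provides the convergence toward a global minimum along each line.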