A hybrid algorithm for finding the global minimum of error function of neural networks and its applications

Abstract Back-propagation has often been used to train artificial neural networks for various pattern classification problems. An important limitation of this method, however, is that it sometimes fails to find the global minimum of the total error function of the network. In this article, a hybrid algorithm that combines a modified back-propagation method with the random optimization method is proposed to find the global minimum of the total error function of a neural network in a small number of steps. It is shown that this hybrid algorithm ensures convergence to a global minimum with probability 1 in a compact region of the weight vector space. Results of several computer simulations are also given, dealing with the problems of forecasting air pollution density, forecasting stock prices, and determining the octane rating in gasoline blending.
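The hybrid idea summarized above can be sketched as follows: gradient (back-propagation) steps are interleaved with random-search steps in the style of Matyas's random optimization, which can carry the weight vector out of a local minimum. The network size, learning rate, perturbation scale, and acceptance rule below are illustrative assumptions, not the authors' exact algorithm; a numerical gradient also stands in for true back-propagation to keep the sketch short.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny 2-2-1 sigmoid network on XOR; all weights flattened into one vector w.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
T = np.array([0.0, 1.0, 1.0, 0.0])

def unpack(w):
    W1 = w[:4].reshape(2, 2)
    b1 = w[4:6]
    W2 = w[6:8]
    b2 = w[8]
    return W1, b1, W2, b2

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def error(w):
    """Total squared error E(w) over the training set."""
    W1, b1, W2, b2 = unpack(w)
    H = sigmoid(X @ W1 + b1)
    Y = sigmoid(H @ W2 + b2)
    return 0.5 * np.sum((Y - T) ** 2)

def grad(w, eps=1e-6):
    # Central-difference gradient; a stand-in for back-propagation here.
    g = np.zeros_like(w)
    for i in range(w.size):
        d = np.zeros_like(w)
        d[i] = eps
        g[i] = (error(w + d) - error(w - d)) / (2 * eps)
    return g

def hybrid_train(w, lr=0.5, sigma=0.5, iters=3000):
    e = error(w)
    for _ in range(iters):
        # Back-propagation (gradient descent) step.
        w_new = w - lr * grad(w)
        e_new = error(w_new)
        if e_new < e:
            w, e = w_new, e_new
        else:
            # Random-optimization step: try a Gaussian perturbation
            # and keep it only if it lowers the error.
            w_try = w + sigma * rng.standard_normal(w.size)
            e_try = error(w_try)
            if e_try < e:
                w, e = w_try, e_try
    return w, e

w0 = rng.standard_normal(9)
w_final, e_final = hybrid_train(w0)
```

Because both kinds of step are accepted only when they decrease the error, the error sequence is monotonically non-increasing, while the Gaussian perturbations give the search a nonzero probability of reaching any region of a compact weight space — the intuition behind the probability-1 convergence claim in the abstract.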