Acceleration of backpropagation learning using optimised learning rate and momentum

The learning rate and the momentum factor are two free parameters that must be chosen carefully in the conventional backpropagation (BP) learning algorithm. Based on a linear expansion of the actual outputs of the BP network with respect to these two parameters, the Letter presents an efficient approach for determining their dynamically optimal values at each step. Simulation results indicate that the approach yields a marked improvement in convergence performance.
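The idea can be sketched as follows: linearise the network outputs in the weight update, y(w + Δw) ≈ y(w) + JΔw, and since Δw = −η·g + α·Δw_prev is linear in the learning rate η and momentum factor α, picking the error-minimising (η, α) reduces to a 2×2 least-squares problem per step. The toy XOR network, finite-difference derivatives, and backtracking safeguard below are illustrative assumptions, not the Letter's exact formulation.

```python
import numpy as np

# Illustrative sketch (assumed setup, not the Letter's exact method):
# a one-hidden-layer sigmoid net on XOR, with eta and alpha re-optimised
# every step from the linear expansion y(w + dw) ~= y(w) + J dw.
rng = np.random.default_rng(0)
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
T = np.array([[0.], [1.], [1.], [0.]])
shapes = [(2, 4), (4, 1)]
n_par = sum(a * b + b for a, b in shapes)
w = rng.normal(0.0, 1.0, n_par)

def forward(w):
    h, i = X, 0
    for a, b in shapes:
        W = w[i:i + a * b].reshape(a, b); i += a * b
        c = w[i:i + b]; i += b
        h = 1.0 / (1.0 + np.exp(-(h @ W + c)))  # sigmoid layers
    return h

def err(w):
    return float(np.sum((T - forward(w)) ** 2))

def grad(w, eps=1e-6):
    # central-difference gradient of the squared error (keeps the sketch short)
    g = np.zeros_like(w)
    for i in range(n_par):
        wp, wm = w.copy(), w.copy()
        wp[i] += eps; wm[i] -= eps
        g[i] = (err(wp) - err(wm)) / (2 * eps)
    return g

def jvp(w, d, eps=1e-6):
    # directional derivative of the outputs: J @ d
    return ((forward(w + eps * d) - forward(w - eps * d)) / (2 * eps)).ravel()

prev = np.zeros_like(w)   # previous weight update (momentum term)
e0 = err(w)
for _ in range(200):
    e = (T - forward(w)).ravel()
    g = grad(w)
    a = jvp(w, -g)        # output change per unit step along -gradient
    b = jvp(w, prev)      # output change per unit repeat of last update
    # minimise ||e - (eta*a + alpha*b)||^2 over (eta, alpha): 2x2 normal equations
    A = np.array([[a @ a, a @ b], [a @ b, b @ b]])
    r = np.array([a @ e, b @ e])
    eta, alpha = np.linalg.lstsq(A, r, rcond=None)[0]
    dw = eta * (-g) + alpha * prev
    for _ in range(20):   # backtrack if the linear expansion overshot
        if err(w + dw) < err(w):
            w = w + dw
            prev = dw
            break
        dw = 0.5 * dw
    else:
        prev = np.zeros_like(w)  # skip the step, reset momentum

print(round(e0, 4), "->", round(err(w), 4))
```

Because (η, α) solve a least-squares fit of the linearised error, the resulting Δw is always a descent direction for the squared error, which is why a simple backtracking safeguard suffices when the linearisation is inaccurate.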
