Conjugate gradient algorithm for efficient training of artificial neural networks

A novel approach is presented for the training of multilayer feedforward neural networks, using a conjugate gradient algorithm. The algorithm updates the input weights to each neuron in an efficient parallel way, similar to the one used by the well-known backpropagation algorithm. The performance of the algorithm is superior to that of the conventional backpropagation algorithm; this superiority rests on strong theoretical grounds and is supported by the numerical results of three examples.
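The abstract does not specify the exact update rule, so the following is only an illustrative sketch of the general idea: training a small one-hidden-layer feedforward network with a nonlinear conjugate gradient method (here the Polak-Ribiere variant with a backtracking line search), where the gradient itself is obtained by standard backpropagation. The network size, the XOR data set, and the line-search constants are hypothetical choices for demonstration, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical toy problem: XOR with a 2-4-1 network
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

shapes = [(2, 4), (4,), (4, 1), (1,)]        # W1, b1, W2, b2
sizes = [int(np.prod(s)) for s in shapes]
w = rng.standard_normal(sum(sizes)) * 0.5    # all weights in one flat vector

def unpack(w):
    parts, i = [], 0
    for s, n in zip(shapes, sizes):
        parts.append(w[i:i + n].reshape(s))
        i += n
    return parts

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def loss_and_grad(w):
    """Mean-squared error and its gradient via backpropagation."""
    W1, b1, W2, b2 = unpack(w)
    h = np.tanh(X @ W1 + b1)                 # hidden layer
    out = sigmoid(h @ W2 + b2)               # output layer
    e = out - y
    L = 0.5 * np.mean(e ** 2)
    d_out = e * out * (1 - out) / len(X)     # delta at the output
    dW2, db2 = h.T @ d_out, d_out.sum(0)
    d_h = (d_out @ W2.T) * (1 - h ** 2)      # delta at the hidden layer
    dW1, db1 = X.T @ d_h, d_h.sum(0)
    g = np.concatenate([p.ravel() for p in (dW1, db1, dW2, db2)])
    return L, g

def line_search(w, d, L, g, alpha=1.0, c=1e-4):
    """Backtracking (Armijo) line search along direction d."""
    slope = g @ d
    while loss_and_grad(w + alpha * d)[0] > L + c * alpha * slope:
        alpha *= 0.5
        if alpha < 1e-10:
            break
    return alpha

L, g = loss_and_grad(w)
L0 = L
d = -g                                       # start with steepest descent
for _ in range(200):
    if g @ d >= 0:                           # safeguard: restart if not a descent direction
        d = -g
    alpha = line_search(w, d, L, g)
    w = w + alpha * d
    L_new, g_new = loss_and_grad(w)
    # Polak-Ribiere+ coefficient mixes the new gradient with the old direction
    beta = max(0.0, g_new @ (g_new - g) / (g @ g))
    d = -g_new + beta * d
    L, g = L_new, g_new

print(f"loss: {L0:.4f} -> {L:.4f}")
```

Unlike plain backpropagation with a fixed learning rate, each iteration here chooses its step length by line search and reuses the previous search direction, which is the source of the faster convergence the abstract claims.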