Function approximation and time series prediction with neural networks

Neural networks are examined in the context of function approximation and the related field of time series prediction. A natural extension of radial basis nets is introduced. It is found that the use of an adaptable gradient and normalized basis functions can significantly reduce the amount of data needed to train the net while maintaining the speed advantage of a net that is linear in the weights. The local nature of the network permits the use of simple learning algorithms with short memories of earlier training data. In particular, it is shown that a one-dimensional Newton method is quite fast and reasonably accurate.
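The normalized-basis, linear-in-the-weights structure described above can be sketched as follows. This is a minimal illustration only, assuming Gaussian basis functions with fixed, evenly spaced centers and a shared width; the adaptable-gradient mechanism and the one-dimensional Newton training step discussed in the paper are not reproduced here. Because the output is linear in the weights, a single least-squares solve suffices for this toy fit:

```python
import numpy as np

def normalized_rbf_design(x, centers, width):
    """Normalized Gaussian basis activations: each row sums to one."""
    act = np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2.0 * width ** 2))
    return act / act.sum(axis=1, keepdims=True)

# Toy 1-D function approximation: fit sin(x) on [0, 2*pi].
rng = np.random.default_rng(0)
x_train = rng.uniform(0.0, 2 * np.pi, 200)
y_train = np.sin(x_train)

# Illustrative assumptions: 15 evenly spaced centers, width 0.5.
centers = np.linspace(0.0, 2 * np.pi, 15)
width = 0.5

# The net is linear in the weights, so one least-squares solve
# finds them; no iterative gradient descent is required.
Phi = normalized_rbf_design(x_train, centers, width)
weights, *_ = np.linalg.lstsq(Phi, y_train, rcond=None)

# Evaluate the trained net on a test grid.
x_test = np.linspace(0.0, 2 * np.pi, 100)
y_pred = normalized_rbf_design(x_test, centers, width) @ weights
max_err = np.max(np.abs(y_pred - np.sin(x_test)))
print(max_err)
```

The normalization step gives each basis function a local region of dominance, which is what makes simple, short-memory learning rules viable in this setting.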