Exact Gradient Calculation in Gamma Neural Networks

In this paper we introduce a novel algorithm for computing the exact error gradient of a gamma network with an arbitrary feedforward topology. An approximate algorithm with reduced computational complexity, referred to as Truncated Temporal Backpropagation, is also developed. Time-series prediction experiments on the Mackey-Glass chaotic time series show improved training convergence for the exact algorithm. The importance of including the additional terms in the gradient calculation is demonstrated by illustrating averaged synapse impulse responses for different degrees of truncation, and by calculating the gradient error that the approximate method introduces into the estimated model.
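For context, the memory structure whose gradient is at issue is the standard gamma memory (in the de Vries-Principe form), where each tap obeys the recursion x_k(n) = (1 - mu) x_k(n-1) + mu x_{k-1}(n-1). The sketch below, with illustrative names not taken from the paper, shows one time step of this recursion; it is this recursive dependence of every tap on past states that makes the exact gradient nontrivial and motivates truncated approximations.

```python
def gamma_memory_step(taps, u_prev, mu):
    """Advance a K-tap gamma memory one time step (illustrative sketch).

    taps   : [x_1(n-1), ..., x_K(n-1)], the tap states at the previous step
    u_prev : x_0(n-1), the input sample at the previous step
    mu     : memory parameter; mu = 1 reduces the memory to a tapped delay line
    Returns [x_1(n), ..., x_K(n)] via x_k(n) = (1-mu)*x_k(n-1) + mu*x_{k-1}(n-1).
    """
    new_taps = []
    below = u_prev  # x_{k-1}(n-1), starting from the input tap
    for x in taps:
        new_taps.append((1.0 - mu) * x + mu * below)
        below = x
    return new_taps
```

With mu = 1 each step simply shifts the input down the tap chain, recovering a TDNN-style delay line; for 0 < mu < 1 each tap leaks, giving the gamma memory its adjustable depth-versus-resolution trade-off.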