Estimating the Information Potential with the Fast Gauss Transform

In this paper, we propose a fast and accurate approximation to the information potential of Information Theoretic Learning (ITL) using the Fast Gauss Transform (FGT). We exemplify the case of the Minimum Error Entropy criterion for training adaptive systems. The FGT reduces the complexity of the estimation from O(N²) to O(pkN), where p is the order of the Hermite approximation and k is the number of clusters used in the FGT. Further, we show that the FGT converges rapidly to the actual entropy value with increasing order p, unlike the Stochastic Information Gradient, the existing O(pN) approximation for reducing the computational complexity in ITL. We test the performance of these FGT methods on system identification, with encouraging results.
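The complexity reduction described above can be illustrated with a small sketch. The quadratic information potential is the Parzen estimate V = (1/N²) Σᵢ Σⱼ G(xᵢ − xⱼ; 2σ²), which costs O(N²) directly. The FGT instead accumulates p Hermite moments of the samples about a cluster center and evaluates the expansion e^{−(s−t)²} = Σₙ (tⁿ/n!) hₙ(s), where hₙ are the Hermite functions hₙ(t) = (−1)ⁿ dⁿ/dtⁿ e^{−t²}. The following is a minimal single-cluster sketch (the paper's full method uses k clusters via farthest-point clustering); function names and the choice of the sample mean as the expansion center are illustrative assumptions, not the paper's implementation:

```python
import math
import numpy as np

def info_potential_direct(x, sigma):
    """Direct O(N^2) Parzen estimate of the quadratic information potential:
    V = (1/N^2) sum_i sum_j G(x_i - x_j; 2 sigma^2)."""
    n = len(x)
    d = x[:, None] - x[None, :]
    var = 2.0 * sigma ** 2                     # pairwise kernel variance
    g = np.exp(-d ** 2 / (2.0 * var)) / np.sqrt(2.0 * np.pi * var)
    return g.sum() / n ** 2

def hermite_functions(t, p):
    """Hermite functions h_n(t) = (-1)^n d^n/dt^n exp(-t^2), n = 0..p-1,
    via the recurrence h_{n+1} = 2 t h_n - 2 n h_{n-1}."""
    h = np.empty((p, len(t)))
    h[0] = np.exp(-t ** 2)
    if p > 1:
        h[1] = 2.0 * t * h[0]
    for n in range(1, p - 1):
        h[n + 1] = 2.0 * t * h[n] - 2.0 * n * h[n - 1]
    return h

def info_potential_fgt(x, sigma, p=12):
    """O(pN) Hermite approximation with a single expansion center
    (assumed here to be the sample mean; the FGT proper uses k clusters)."""
    n = len(x)
    delta = 4.0 * sigma ** 2                   # exp(-d^2/delta) matches G(.; 2 sigma^2)
    c = x.mean()
    t = (x - c) / np.sqrt(delta)               # scaled samples, one pass: O(N)
    # Source moments C_k = sum_i t_i^k / k!  -- O(pN)
    moments = np.array([(t ** k).sum() / math.factorial(k) for k in range(p)])
    # Evaluate the expansion at every target: sum_k C_k h_k(t_j) -- O(pN)
    g = moments @ hermite_functions(t, p)
    norm = 1.0 / np.sqrt(2.0 * np.pi * (2.0 * sigma ** 2))
    return norm * g.sum() / n ** 2
```

As the abstract notes, the approximation converges rapidly in p: for samples within a fraction of the kernel width of the expansion center, a moderate order already matches the direct estimate to high precision.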
