Non-parametric entropy estimators based on simple linear regression
Kensuke Koshijima | Hideitsu Hino | Noboru Murata