The minimax distortion redundancy in empirical quantizer design
[1] A. E. Bostwick, et al. The theory of probabilities, 1896, Science.
[2] A. Kolmogorov, et al. ε-entropy and ε-capacity of sets in functional spaces, 1961.
[3] Michel Loève, et al. Probability Theory I, 1977.
[4] C. L. Mallows. An inequality involving multinomial probabilities , 1968 .
[5] Shun-ichi Amari,et al. A Theory of Pattern Recognition , 1968 .
[6] Lee D. Davisson,et al. Universal noiseless coding , 1973, IEEE Trans. Inf. Theory.
[7] David L. Neuhoff,et al. Fixed rate universal block source coding with a fidelity criterion , 1975, IEEE Trans. Inf. Theory.
[8] H. Teicher,et al. Probability theory: Independence, interchangeability, martingales , 1978 .
[9] Robert M. Gray,et al. An Algorithm for Vector Quantizer Design , 1980, IEEE Trans. Commun..
[10] Robert M. Gray,et al. Locally Optimal Block Quantizer Design , 1980, Inf. Control..
[11] D. Pollard. Strong Consistency of $K$-Means Clustering , 1981 .
[12] D. Pollard. A Central Limit Theorem for $k$-Means Clustering , 1982 .
[13] David Pollard, et al. Quantization and the method of k-means, 1982, IEEE Trans. Inf. Theory.
[14] K. Alexander,et al. Probability Inequalities for Empirical Processes and a Law of the Iterated Logarithm , 1984 .
[15] Geoffrey C. Fox,et al. Vector quantization by deterministic annealing , 1992, IEEE Trans. Inf. Theory.
[16] Allen Gersho,et al. Competitive learning and soft competition for vector quantizer design , 1992, IEEE Trans. Signal Process..
[17] P. Chou. The distortion of vector quantizers trained on n vectors decreases to the optimum as O_p(1/n), 1994, Proceedings of 1994 IEEE International Symposium on Information Theory.
[18] Tamás Linder,et al. Rates of convergence in the source coding theorem, in empirical quantizer design, and in universal lossy source coding , 1994, IEEE Trans. Inf. Theory.
[19] M. Talagrand. Sharper Bounds for Gaussian and Empirical Processes , 1994 .
[20] P. Shields. When is the weak rate equal to the strong rate? , 1994, Proceedings of 1994 Workshop on Information Theory and Statistics.
[21] László Györfi,et al. A Probabilistic Theory of Pattern Recognition , 1996, Stochastic Modelling and Applied Probability.
[22] Neri Merhav,et al. On the amount of statistical side information required for lossy data compression , 1997, IEEE Trans. Inf. Theory.
[23] Tamás Linder,et al. Empirical quantizer design in the presence of source noise or channel noise , 1997, IEEE Trans. Inf. Theory.