A generalisation of independence in statistical models for categorical distribution
[1] A. Agresti, et al. Categorical Data Analysis, 1991, International Encyclopedia of Statistical Science.
[2] S. Amari. Integration of Stochastic Models by Minimizing α-Divergence, 2007, Neural Computation.
[3] S. Eguchi, et al. Robust estimation in the normal mixture model, 2006.
[4] A. Asuncion, et al. UCI Machine Learning Repository, University of California, Irvine, School of Information and Computer Sciences, 2007.
[5] N. Murata, et al. A Generalization of Independence in Naive Bayes Model, 2010, IDEAL.
[6] Y. Fujimoto, et al. A modified EM algorithm for mixture models based on Bregman divergence, 2007.
[7] D. Kahneman, et al. Probabilistic reasoning, 1993.
[8] J. Pearl, et al. Probabilistic reasoning in intelligent systems, 1988.
[9] T. Kanamori, et al. Robust Boosting Algorithm Against Mislabeling in Multiclass Problems, 2008, Neural Computation.
[10] D. Rubin. The Bayesian Bootstrap, 1981.
[11] R. Nelsen. An Introduction to Copulas, 1998.
[12] F. V. Jensen, et al. Bayesian Networks and Decision Graphs, 2001, Statistics for Engineering and Information Science.
[13] P. M. Domingos, et al. On the Optimality of the Simple Bayesian Classifier under Zero-One Loss, 1997, Machine Learning.
[14] T. Kanamori, et al. Information Geometry of U-Boost and Bregman Divergence, 2004, Neural Computation.
[15] N. Murata, et al. A generalized product rule and weak independence based on Bregman divergence, 2008.
[16] T. Hofmann, et al. Unsupervised Learning by Probabilistic Latent Semantic Analysis, 2004, Machine Learning.
[17] Bregman divergence and density integration, 2009.