Predictive Ensemble Pruning by Expectation Propagation
Huanhuan Chen | Peter Tiňo | Xin Yao
[1] Juan José Rodríguez Diez,et al. Rotation Forest: A New Classifier Ensemble Method , 2006, IEEE Transactions on Pattern Analysis and Machine Intelligence.
[2] Michael E. Tipping. Sparse Bayesian Learning and the Relevance Vector Machine , 2001, J. Mach. Learn. Res..
[3] R. Tibshirani,et al. Combining Estimates in Regression and Classification , 1996 .
[4] Johannes R. Sveinsson,et al. Parallel consensual neural networks , 1997, IEEE Trans. Neural Networks.
[5] Nitesh V. Chawla,et al. Learning Ensembles from Bites: A Scalable and Accurate Approach , 2004, J. Mach. Learn. Res..
[6] Wei Tang,et al. Ensembling neural networks: Many could be better than all , 2002, Artif. Intell..
[7] Naonori Ueda,et al. Optimal Linear Combination of Neural Networks for Improving Classification Performance , 2000, IEEE Trans. Pattern Anal. Mach. Intell..
[8] Huanhuan Chen,et al. A Probabilistic Ensemble Pruning Algorithm , 2006, Sixth IEEE International Conference on Data Mining - Workshops (ICDMW'06).
[9] Yuan Qi,et al. Predictive automatic relevance determination by expectation propagation , 2004, ICML.
[10] Nando de Freitas,et al. An Introduction to MCMC for Machine Learning , 2004, Machine Learning.
[11] Leo Breiman,et al. Stacked regressions , 2004, Machine Learning.
[12] Leo Breiman,et al. Bagging Predictors , 1996, Machine Learning.
[13] Xin Yao,et al. Ensemble learning via negative correlation , 1999, Neural Networks.
[14] Tom Minka,et al. Expectation Propagation for approximate Bayesian inference , 2001, UAI.
[15] Filippo Menczer,et al. Meta-evolutionary ensembles , 2002, Proceedings of the 2002 International Joint Conference on Neural Networks. IJCNN'02 (Cat. No.02CH37290).
[16] R. Iman,et al. Approximations of the critical region of the Friedman statistic , 1980 .
[17] Oleksandr Makeyev,et al. Neural network with ensembles , 2010, The 2010 International Joint Conference on Neural Networks (IJCNN).
[18] L. Breiman. Arcing classifier (with discussion and a rejoinder by the author) , 1998 .
[19] Zhi-Hua Zhou,et al. Ensembling neural networks , 2002 .
[20] Thomas Richardson,et al. Boosting methodology for regression problems , 1999, AISTATS.
[21] O. J. Dunn. Multiple Comparisons among Means , 1961 .
[22] J. M. Bates,et al. The Combination of Forecasts , 1969 .
[23] Thomas G. Dietterich. An Experimental Comparison of Three Methods for Constructing Ensembles of Decision Trees , 2000 .
[24] Janez Demsar,et al. Statistical Comparisons of Classifiers over Multiple Data Sets , 2006, J. Mach. Learn. Res..
[25] Sherif Hashem,et al. Optimal Linear Combinations of Neural Networks , 1997, Neural Networks.
[26] Robert E. Schapire,et al. A Brief Introduction to Boosting , 1999, IJCAI.
[27] Michael E. Tipping,et al. Analysis of Sparse Bayesian Learning , 2001, NIPS.
[28] Xin Yao,et al. Making use of population information in evolutionary artificial neural networks , 1998, IEEE Trans. Syst. Man Cybern. Part B.
[29] Daoqiang Zhang,et al. Constraint Projections for Ensemble Learning , 2008, AAAI.
[30] Thomas G. Dietterich,et al. Pruning Adaptive Boosting , 1997, ICML.
[31] Philip K. Chan,et al. Meta-learning in distributed data mining systems: Issues and approaches , 2007 .
[32] Lawrence O. Hall,et al. Ensemble diversity measures and their application to thinning , 2004, Inf. Fusion.
[33] Leo Breiman,et al. Random Forests , 2001, Machine Learning.
[34] William Nick Street,et al. Ensemble Pruning Via Semi-definite Programming , 2006, J. Mach. Learn. Res..
[35] Xin Yao,et al. A constructive algorithm for training cooperative neural network ensembles , 2003, IEEE Trans. Neural Networks.
[36] D. Opitz,et al. Popular Ensemble Methods: An Empirical Study , 1999, J. Artif. Intell. Res..
[37] Ayhan Demiriz,et al. Linear Programming Boosting via Column Generation , 2002, Machine Learning.
[38] M. Friedman. The Use of Ranks to Avoid the Assumption of Normality Implicit in the Analysis of Variance , 1937 .
[39] Thomas G. Dietterich. An Experimental Comparison of Three Methods for Constructing Ensembles of Decision Trees: Bagging, Boosting, and Randomization , 2000, Machine Learning.