Dropout and DropConnect based Ensemble of Random Vector Functional Link Neural Network
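The paper's text is not reproduced on this page, but the technique named in the title can be sketched from its standard ingredients: a random vector functional link (RVFL) network has fixed random hidden weights, a direct input-to-output link, and output weights solved in closed form by ridge regression; an ensemble is formed by training several such learners, each with an independent dropout-style mask on its hidden features. The sketch below is an illustrative assumption, not the authors' implementation: the function names, `tanh` activation, mask placement, and ridge parameter `lam` are all my choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def rvfl_fit(X, Y, n_hidden=50, drop_rate=0.2, lam=1e-2, rng=rng):
    """Fit one RVFL learner: fixed random hidden layer plus direct link,
    with a dropout-style mask zeroing a fraction of hidden units.
    Output weights come from closed-form ridge regression."""
    d = X.shape[1]
    W = rng.standard_normal((d, n_hidden))      # random input weights (never trained)
    b = rng.standard_normal(n_hidden)           # random biases (never trained)
    mask = (rng.random(n_hidden) >= drop_rate)  # dropout-style mask, fixed per learner
    H = np.tanh(X @ W + b) * mask               # masked hidden features
    D = np.hstack([X, H])                       # direct link: raw inputs concatenated
    beta = np.linalg.solve(D.T @ D + lam * np.eye(D.shape[1]), D.T @ Y)
    return W, b, mask, beta

def rvfl_predict(X, model):
    W, b, mask, beta = model
    H = np.tanh(X @ W + b) * mask
    return np.hstack([X, H]) @ beta

def ensemble_predict(X, models):
    """Average the outputs of independently masked RVFL learners."""
    return np.mean([rvfl_predict(X, m) for m in models], axis=0)
```

Because each learner's mask and random weights differ, the ensemble's diversity comes for free; only the cheap ridge solve is repeated per member (e.g. `models = [rvfl_fit(X, Y) for _ in range(5)]`).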
[1] Senén Barro, et al. Do we need hundreds of classifiers to solve real world classification problems?, 2014, J. Mach. Learn. Res.
[2] Le Zhang, et al. A survey of randomized algorithms for training neural networks, 2016, Inf. Sci.
[3] Lorenzo Porzi, et al. Dropout distillation, 2016, ICML.
[4] Thomas G. Dietterich. Multiple Classifier Systems, 2000, Lecture Notes in Computer Science.
[5] Alexandros Iosifidis, et al. DropELM: Fast neural network regularization with Dropout and DropConnect, 2015, Neurocomputing.
[6] Ponnuthurai N. Suganthan, et al. Random vector functional link network for short-term electricity load demand forecasting, 2016, Inf. Sci.
[7] Yoav Freund, et al. Boosting a weak learning algorithm by majority, 1990, COLT '90.
[8] Kilian Q. Weinberger, et al. Snapshot Ensembles: Train 1, get M for free, 2017, ICLR.
[9] R. E. Lee, et al. Distribution-free multiple comparisons between successive treatments, 1995.
[10] Ponnuthurai N. Suganthan, et al. Ensemble incremental learning Random Vector Functional Link network for short-term electric load forecasting, 2018, Knowl. Based Syst.
[11] P. N. Suganthan, et al. An Ensemble of Kernel Ridge Regression for Multi-class Classification, 2017, ICCS.
[12] Gonzalo A. Ruz, et al. A non-iterative method for pruning hidden neurons in neural networks with random weights, 2018, Appl. Soft Comput.
[13] P. N. Suganthan, et al. A comprehensive evaluation of random vector functional link networks, 2016, Inf. Sci.
[14] Dianhui Wang, et al. Fast decorrelated neural network ensembles with random weights, 2014, Inf. Sci.
[15] Leo Breiman, et al. Bagging Predictors, 1996, Machine Learning.
[16] Leo Breiman, et al. Bias, Variance, and Arcing Classifiers, 1996.
[17] Ron Kohavi, et al. Bias Plus Variance Decomposition for Zero-One Loss Functions, 1996, ICML.
[18] David A. Forsyth, et al. Swapout: Learning an ensemble of deep architectures, 2016, NIPS.
[19] Le Zhang, et al. Visual Tracking With Convolutional Random Vector Functional Link Network, 2017, IEEE Transactions on Cybernetics.
[20] Serge J. Belongie, et al. Residual Networks Behave Like Ensembles of Relatively Shallow Networks, 2016, NIPS.
[21] Yoav Freund, et al. Boosting the margin: A new explanation for the effectiveness of voting methods, 1997, ICML.
[22] P. N. Suganthan, et al. Benchmarking Ensemble Classifiers with Novel Co-Trained Kernal Ridge Regression and Random Vector Functional Link Ensembles [Research Frontier], 2017, IEEE Computational Intelligence Magazine.
[23] Ponnuthurai N. Suganthan, et al. Enhancing Multi-Class Classification of Random Forest using Random Vector Functional Neural Network and Oblique Decision Surfaces, 2018, International Joint Conference on Neural Networks (IJCNN).
[24] Yann LeCun, et al. Regularization of Neural Networks using DropConnect, 2013, ICML.
[25] Le Zhang, et al. An ensemble of decision trees with random vector functional link networks for multi-class classification, 2017, Appl. Soft Comput.
[26] Nitish Srivastava, et al. Dropout: a simple way to prevent neural networks from overfitting, 2014, J. Mach. Learn. Res.
[27] Tin Kam Ho, et al. The Random Subspace Method for Constructing Decision Forests, 1998, IEEE Trans. Pattern Anal. Mach. Intell.
[28] Janez Demsar, et al. Statistical Comparisons of Classifiers over Multiple Data Sets, 2006, J. Mach. Learn. Res.
[29] Pierre Geurts, et al. Extremely randomized trees, 2006, Machine Learning.