Dropout and DropConnect based Ensemble of Random Vector Functional Link Neural Network

Ensembles of neural networks are widely studied and applied in machine learning. An ensemble reduces the generalization error by reducing the bias component of the error, the variance component, or both. For an ensemble to perform well, its constituent base classifiers should be both accurate and diverse. Diversity can be introduced by injecting randomness into the data or into the architecture. The Random Vector Functional Link (RVFL) neural network, a randomized neural network trained with a closed-form solution, is well suited as a base classifier because of its extremely fast training, good generalization, and the randomization inherent in its architecture. We therefore propose an ensemble of RVFL networks in which additional regularization and randomization are introduced into the architecture via two well-established regularization techniques, Dropout and DropConnect. Our results indicate that the stronger randomization helps the ensemble generalize better. Experiments on several datasets show that the proposed ensemble outperforms other RVFL-based ensembles on most of them.
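As a concrete illustration of the construction described above (a sketch, not the paper's implementation), the NumPy code below builds one RVFL base learner with random, fixed hidden weights, a DropConnect-style binary mask on those weights, a direct link concatenating the raw features with the hidden activations, and a closed-form ridge solve for the output weights; an ensemble is then formed by majority voting over independently randomized learners. The class name DropConnectRVFL and parameters such as drop_prob and ridge are illustrative assumptions, not names from the paper.

```python
import numpy as np

class DropConnectRVFL:
    """Illustrative RVFL base learner with DropConnect-style masking.

    Hidden weights are random and fixed (never trained); a binary mask
    zeroes a random subset of them, which both regularizes the learner
    and adds diversity across ensemble members. Output weights are
    obtained in closed form by ridge regression on the concatenation of
    the raw features and the hidden activations (the RVFL direct link).
    A Dropout-style variant would instead mask the hidden activations.
    """

    def __init__(self, n_hidden=100, drop_prob=0.2, ridge=1e-2, seed=None):
        self.n_hidden = n_hidden
        self.drop_prob = drop_prob  # assumed DropConnect rate
        self.ridge = ridge          # penalty for the closed-form ridge solve
        self.rng = np.random.default_rng(seed)

    def _features(self, X):
        H = np.tanh(X @ self.W + self.b)  # random nonlinear expansion
        return np.hstack([X, H])          # direct link: [raw | hidden]

    def fit(self, X, Y):
        """X: (n, d) inputs; Y: (n, c) one-hot targets."""
        d = X.shape[1]
        self.W = self.rng.standard_normal((d, self.n_hidden))
        self.b = self.rng.standard_normal(self.n_hidden)
        # DropConnect: permanently zero a random subset of hidden weights.
        self.W *= self.rng.random(self.W.shape) >= self.drop_prob
        D = self._features(X)
        # Closed-form ridge regression: beta = (D'D + lambda*I)^-1 D'Y.
        A = D.T @ D + self.ridge * np.eye(D.shape[1])
        self.beta = np.linalg.solve(A, D.T @ Y)
        return self

    def predict(self, X):
        return self._features(X) @ self.beta  # class scores


def ensemble_predict(models, X):
    """Majority vote over the base learners' predicted class labels."""
    votes = np.stack([m.predict(X).argmax(axis=1) for m in models])
    return np.apply_along_axis(lambda v: np.bincount(v).argmax(), 0, votes)


# Usage sketch: ten independently randomized base learners on toy data.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 10))
y = (X[:, 0] + X[:, 1] > 0).astype(int)
Y = np.eye(2)[y]  # one-hot targets for the ridge solve
models = [DropConnectRVFL(seed=s).fit(X, Y) for s in range(10)]
print("training accuracy:", (ensemble_predict(models, X) == y).mean())
```

Because each base learner draws its own hidden weights and its own DropConnect mask, diversity comes for free from the architecture itself, which is the property the ensemble exploits.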
