Weighting and pruning based ensemble deep random vector functional link network for tabular data classification

In this paper, we first introduce batch normalization to the Ensemble Deep Random Vector Functional Link (edRVFL) network. This re-normalization step helps the network avoid divergence of the hidden features. We then propose novel variants of edRVFL. Weighted edRVFL (WedRVFL) assigns training samples different weights in different layers according to how confidently they were classified in the previous layer, thereby increasing the ensemble's diversity and accuracy. Furthermore, we propose a pruning-based edRVFL (PedRVFL), which prunes inferior neurons based on their importance for classification before generating the next hidden layer. This ensures that randomly generated inferior features do not propagate to deeper layers. Finally, we combine weighting and pruning into the Weighting and Pruning based Ensemble Deep Random Vector Functional Link Network (WPedRVFL). We compare the performance of these variants with other state-of-the-art deep feedforward neural networks (FNNs) on 24 tabular UCI classification datasets. The experimental results demonstrate the superior performance of our proposed methods.
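The abstract compresses several mechanisms into prose; the sketch below makes the per-layer flow concrete. It is a minimal, illustrative reconstruction, not the paper's reference implementation: the function names (batch_norm, ridge_fit, edrvfl_layer), the ReLU activation, the weight-magnitude importance score, and the specific weighting rule (upweighting misclassified samples) are all assumptions chosen for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

def batch_norm(H, eps=1e-5):
    # Re-normalize each hidden neuron to zero mean / unit variance,
    # preventing the hidden features from diverging across deep layers.
    return (H - H.mean(axis=0)) / (H.std(axis=0) + eps)

def ridge_fit(F, Y, lam=1e-2):
    # Closed-form ridge regression for the output weights (no backprop).
    return np.linalg.solve(F.T @ F + lam * np.eye(F.shape[1]), F.T @ Y)

def edrvfl_layer(X_prev, X_raw, Y, n_hidden=64, prune_frac=0.25, weights=None):
    # Random, untrained hidden weights map the previous features to new ones.
    W = rng.standard_normal((X_prev.shape[1], n_hidden))
    H = batch_norm(np.maximum(X_prev @ W, 0.0))  # ReLU, then batch norm

    # RVFL direct link: the raw input is concatenated with hidden features.
    F = np.hstack([X_raw, H])

    # WedRVFL-style weighting: a weighted least-squares fit via sqrt-scaling.
    if weights is not None:
        sw = np.sqrt(weights)[:, None]
        beta = ridge_fit(F * sw, Y * sw)
    else:
        beta = ridge_fit(F, Y)
    scores = F @ beta  # this layer's class scores (one ensemble member)

    # PedRVFL-style pruning: rank hidden neurons by output-weight magnitude
    # and drop the least important fraction before building the next layer.
    importance = np.abs(beta[X_raw.shape[1]:]).sum(axis=1)
    keep = np.argsort(importance)[int(prune_frac * n_hidden):]
    return H[:, keep], scores

# Toy usage: two layers on random data, ensembling by averaging the scores.
X = rng.standard_normal((100, 10))
Y = np.eye(3)[rng.integers(0, 3, size=100)]  # one-hot labels, 3 classes

H1, s1 = edrvfl_layer(X, X, Y)
w = 1.0 + (s1.argmax(axis=1) != Y.argmax(axis=1))  # upweight misclassified samples
H2, s2 = edrvfl_layer(np.hstack([X, H1]), X, Y, weights=w)
pred = (s1 + s2).argmax(axis=1)  # WPedRVFL-style ensemble decision
```

Under these assumptions, the direct link (concatenating the raw input with each layer's hidden features) and the closed-form ridge solution are the defining RVFL traits; weighting and pruning then slot in naturally as a reweighted least-squares fit and a column drop, respectively.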
