Nature Inspired Neural Network Ensemble Learning

[1] Xin Yao, et al. Evolving Neural Network Ensembles by Minimization of Mutual Information, 2004, Int. J. Hybrid Intell. Syst.

[2] R. Schapire. The Strength of Weak Learnability, 1990, Machine Learning.

[3] Xin Yao, et al. Making use of population information in evolutionary artificial neural networks, 1998, IEEE Trans. Syst. Man Cybern. Part B.

[4] Robert A. Jacobs, et al. Bias/Variance Analyses of Mixtures-of-Experts Architectures, 1997, Neural Computation.

[5] Amanda J. C. Sharkey, et al. On Combining Artificial Neural Nets, 1996, Connect. Sci.

[6] David W. Opitz, et al. Actively Searching for an Effective Neural Network Ensemble, 1996, Connect. Sci.

[7] Bruce E. Rosen, et al. Ensemble Learning Using Decorrelated Neural Networks, 1996, Connect. Sci.

[8] Nathan Intrator, et al. Bootstrapping with Noise: An Effective Regularization Technique, 1996, Connect. Sci.

[9] Harris Drucker, et al. Boosting and Other Ensemble Methods, 1994, Neural Computation.

[10] Galina L. Rogova, et al. Combining the results of several neural network classifiers, 1994, Neural Networks.

[11] Roberto Battiti, et al. Democracy in neural nets: Voting schemes for classification, 1994, Neural Networks.

[12] Robert A. Jacobs, et al. Hierarchical Mixtures of Experts and the EM Algorithm, 1993, Neural Computation.

[13] Geoffrey E. Hinton, et al. Adaptive Mixtures of Local Experts, 1991, Neural Computation.

[14] Robert L. Winkler, et al. Limits for the Precision and Value of Information from Dependent Sources, 1985, Oper. Res.

[15] J. Rissanen, et al. Modeling By Shortest Data Description, 1978, Autom.