Ensemble structure of evolutionary artificial neural networks

Evolutionary artificial neural networks (EANNs) are a special class of artificial neural networks (ANNs) in which evolution serves as another fundamental form of adaptation in addition to learning. Evolution can be introduced at various levels of an ANN: it can be used to evolve connection weights, architectures, and learning parameters and rules. This paper is concerned with the evolution of ANN architectures, where an evolutionary algorithm evolves a population of ANNs. The current practice in evolving ANNs is to choose the best ANN in the last population as the final result. This paper proposes a novel approach that forms the final result by combining all the individuals in the last generation, in order to make the best use of all the information contained in the whole population. The approach regards a population of ANNs as an ensemble and uses a combination method to produce the ensemble output. Four simple methods were used in our computational studies: the first is majority voting; the second and third are linear combination methods over the whole population; the fourth is a linear combination method over a subset of the population, where a near-optimal subset is found by a genetic algorithm search. Our experiments show that all four methods produce better results than the single best individual.
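To make the combination schemes concrete, the minimal Python/NumPy sketch below treats each evolved ANN purely as its array of outputs on a validation set. All function names, the uniform-weight default, and the deliberately simplified mutation-only genetic algorithm for subset selection are illustrative assumptions, not the paper's actual implementation.

import numpy as np

# Illustrative sketch only: each evolved ANN is represented here by its
# per-example outputs; names and the simplified GA are assumptions, not
# the paper's implementation.

def majority_vote(class_preds):
    # class_preds: (n_models, n_examples) integer class labels.
    # Returns the most frequent label for each example.
    n_models, n_examples = class_preds.shape
    voted = np.empty(n_examples, dtype=class_preds.dtype)
    for j in range(n_examples):
        labels, counts = np.unique(class_preds[:, j], return_counts=True)
        voted[j] = labels[np.argmax(counts)]
    return voted

def linear_combination(outputs, weights=None):
    # outputs: (n_models, n_examples) real-valued network outputs.
    # weights: per-model combination weights; uniform averaging if None.
    if weights is None:
        weights = np.full(outputs.shape[0], 1.0 / outputs.shape[0])
    return weights @ outputs

def ga_subset_search(outputs, targets, pop_size=20, generations=50, seed=0):
    # Evolve a binary mask selecting the subset of models whose averaged
    # output minimises mean squared error on a validation set.
    rng = np.random.default_rng(seed)
    n_models = outputs.shape[0]
    pop = rng.integers(0, 2, size=(pop_size, n_models))

    def fitness(mask):
        if mask.sum() == 0:
            return np.inf  # an empty subset is invalid
        combined = outputs[mask.astype(bool)].mean(axis=0)
        return np.mean((combined - targets) ** 2)

    for _ in range(generations):
        scores = np.array([fitness(m) for m in pop])
        pop = pop[np.argsort(scores)]          # rank by fitness
        half = pop_size // 2
        children = pop[:half].copy()           # keep the better half
        flips = rng.random(children.shape) < 1.0 / n_models
        children[flips] ^= 1                   # bit-flip mutation
        pop[half:] = children
    scores = np.array([fitness(m) for m in pop])
    return pop[np.argmin(scores)].astype(bool)

Under these assumptions, linear_combination(outputs) gives the uniform-average ensemble over the whole population, while ga_subset_search(outputs, targets) returns a mask of a near-optimal subset whose outputs can then be averaged, mirroring the fourth method described above.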
