Reducing Fitness Evaluations Using Clustering Techniques and Neural Network Ensembles

In many real-world applications of evolutionary computation, it is essential to reduce the number of fitness evaluations. To this end, computationally efficient models can be constructed to assist the evolutionary algorithm with fitness evaluations. When approximate models are involved in evolution, it is very important to determine which individuals should be re-evaluated using the original fitness function to guarantee fast and correct convergence of the evolutionary algorithm. In this paper, the k-means method is applied to group the individuals of a population into a number of clusters. For each cluster, only the individual closest to the cluster center is evaluated using the expensive original fitness function. The fitness values of the other individuals are estimated using a neural network ensemble, which is also used to detect possible serious prediction errors. Simulation results on three test functions show that the proposed method outperforms the strategy in which only the best individuals according to the approximate model are re-evaluated.
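The clustering-based evaluation scheme described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes a generic `surrogate` callable standing in for the neural network ensemble, uses scikit-learn's `KMeans`, and the function and parameter names (`evaluate_population`, `true_fitness`, `n_clusters`) are hypothetical.

```python
import numpy as np
from sklearn.cluster import KMeans

def evaluate_population(population, true_fitness, surrogate, n_clusters):
    """Cluster the population with k-means; evaluate only the individual
    closest to each cluster center with the expensive true fitness function,
    and estimate all other individuals with the cheap surrogate model."""
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(population)
    # Cheap surrogate estimates for the whole population.
    fitness = np.asarray(surrogate(population), dtype=float)
    for c in range(n_clusters):
        members = np.flatnonzero(km.labels_ == c)
        # Representative: the member nearest to this cluster's center.
        dists = np.linalg.norm(population[members] - km.cluster_centers_[c], axis=1)
        rep = members[np.argmin(dists)]
        # Only the representative pays for an expensive evaluation.
        fitness[rep] = true_fitness(population[rep])
    return fitness

# Example usage with the sphere function as the expensive fitness and a
# stand-in surrogate (in the paper this would be the trained ensemble).
rng = np.random.default_rng(0)
pop = rng.normal(size=(20, 5))
sphere = lambda x: float(np.sum(x ** 2))
surrogate = lambda P: np.sum(P ** 2, axis=1) + 0.1  # biased cheap estimate
fit = evaluate_population(pop, sphere, surrogate, n_clusters=3)
```

With a population of 20 and 3 clusters, only 3 expensive evaluations are performed per generation instead of 20; the remaining 17 fitness values come from the surrogate.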
