Hybridisation of evolutionary programming and machine learning with k-nearest neighbor estimation

Evolutionary programming (EP) focuses on the search step size, which determines the ability to escape local minima, but it does not address the question of searching in promising regions. Estimation of distribution algorithms (EDAs) focus on locating promising regions, but give less consideration to the behavior of individual solutions during the search. Since the basic ideas of EP and EDAs are quite different, it is possible to make them reinforce each other. In this paper, we present a hybrid evolutionary framework that exploits both ideas by introducing a mini estimation operator into EP's search cycle. Unlike previous EDAs, which build an explicit probability density function (PDF), the estimation mechanism used in the proposed framework is k-nearest neighbor estimation, which performs better with a relatively small number of training samples. Our experimental results show that incorporating machine learning techniques such as k-nearest neighbor estimation can improve the performance of evolutionary optimisation algorithms on a large number of benchmark functions.
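The abstract gives no pseudocode, so the following is a minimal 1-D sketch of how a k-nearest neighbor density estimate could act as a "mini estimation operator" inside an EP cycle. The function names (`knn_density`, `ep_step`), the Gaussian mutation, and the elite-density tie-breaking rule are illustrative assumptions, not the authors' actual algorithm.

```python
import random


def knn_density(x, samples, k=3):
    """k-NN density estimate at point x (1-D): p(x) ~ k / (2 * n * r_k),
    where r_k is the distance from x to its k-th nearest sample."""
    dists = sorted(abs(s - x) for s in samples)
    r_k = dists[k - 1]
    n = len(samples)
    return k / (2.0 * n * r_k) if r_k > 0 else float("inf")


def ep_step(population, fitness, k=3, sigma=0.1, rng=None):
    """One hypothetical hybrid EP cycle (minimisation):
    mutate each parent with Gaussian noise, then select survivors by
    fitness, breaking ties in favour of points lying in regions that
    are dense with elite (above-median) individuals."""
    rng = rng or random.Random()
    fits = sorted(fitness(x) for x in population)
    median = fits[len(population) // 2]
    # "training samples" for the estimation operator: the elite half
    elites = [x for x in population if fitness(x) <= median]
    # standard EP mutation: one Gaussian-perturbed offspring per parent
    offspring = [x + rng.gauss(0.0, sigma) for x in population]
    # mu + lambda selection, biased toward elite-dense regions on ties
    merged = population + offspring
    merged.sort(key=lambda x: (fitness(x), -knn_density(x, elites, k)))
    return merged[:len(population)]
```

A usage sketch: starting a population on the sphere function far from the optimum and iterating `ep_step` drives the best individual toward zero, with the k-NN term steering selection toward the region the elites occupy.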
