Exploiting gradient information in continuous iterated density estimation evolutionary algorithms

For continuous optimization problems, evolutionary algorithms (EAs) that build and use probabilistic models have obtained promising results. However, these EAs do not use the local gradient information of the fitness function. When optimizing continuous differentiable functions, disregarding this information may be less efficient. In this paper, we therefore hybridize pure continuous iterated density estimation evolutionary algorithms (IDEAs) by applying the conjugate gradient algorithm to a selection of the solutions. We test the resulting algorithm on a few well-known, difficult continuous differentiable function optimization problems. The results indicate that exploiting gradient information in probabilistic model-building EAs leads to more efficient continuous optimization.
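The hybridization described above can be illustrated with a minimal sketch. The code below is a hypothetical, simplified illustration only: it uses a univariate Gaussian density model per variable (the paper's IDEA framework supports richer factorized and mixture models) and SciPy's conjugate-gradient minimizer applied for a few steps to the best selected solutions. All function and parameter names are assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.optimize import minimize

def sphere(x):
    # Simple differentiable test function: f(x) = sum(x_i^2)
    return float(np.sum(x ** 2))

def sphere_grad(x):
    # Analytical gradient of the sphere function
    return 2.0 * x

def eda_cg(f, grad, dim=5, pop=60, gens=30, tau=0.3, cg_frac=0.1, seed=0):
    """Sketch of a Gaussian EDA hybridized with conjugate-gradient
    local search on a fraction of the selected solutions."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(-5.0, 5.0, size=(pop, dim))
    for _ in range(gens):
        fit = np.array([f(x) for x in X])
        order = np.argsort(fit)
        sel = X[order[: int(tau * pop)]]            # truncation selection
        # Exploit gradient information: a few CG steps on the best solutions
        for i in range(max(1, int(cg_frac * pop))):
            res = minimize(f, sel[i], jac=grad, method="CG",
                           options={"maxiter": 5})
            sel[i] = res.x
        # Estimate a univariate Gaussian density from the selected solutions
        mu, sigma = sel.mean(axis=0), sel.std(axis=0) + 1e-12
        X = rng.normal(mu, sigma, size=(pop, dim))   # sample new population
        X[0] = sel[0]                                # elitism
    fit = np.array([f(x) for x in X])
    return X[np.argmin(fit)], float(fit.min())

best_x, best_f = eda_cg(sphere, sphere_grad)
```

On a smooth quadratic like the sphere function the CG steps pull the selected solutions rapidly toward the optimum, so the density model contracts around it much faster than model sampling alone would achieve, which is the intuition behind the hybrid.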
