Mirrored Sampling and Sequential Selection for Evolution Strategies

This paper reveals the surprising result that a single-parent non-elitist evolution strategy (ES) can be locally faster than the (1+1)-ES. The result is brought about by mirrored sampling and sequential selection. With mirrored sampling, two offspring are generated symmetrically, or mirrored, with respect to their parent. With sequential selection, the offspring are evaluated one after another and the iteration is concluded as soon as one offspring is better than the current parent. Both concepts complement each other well. We derive exact convergence rates of the (1,λ)-ES with mirrored sampling and/or sequential selection on the sphere model. The log-linear convergence of the ES is preserved. Both methods lead to an improvement, and in combination the (1,4)-ES becomes about 10% faster than the (1+1)-ES. Naively implemented into the CMA-ES with recombination, mirrored sampling leads to a bias on the step-size. However, the (1,4)-CMA-ES with mirrored sampling and sequential selection is unbiased and appears to be faster, more robust, and as local as the (1+1)-CMA-ES.
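
The following is a minimal sketch of the two mechanisms described above, applied to a (1,λ)-ES on the sphere function. It is not the authors' implementation: the success-based step-size rule and the default λ=4 are illustrative assumptions (the paper analyzes these variants theoretically and, in the CMA-ES context, uses cumulative step-size adaptation).

```python
import numpy as np


def sphere(x):
    """Sphere model f(x) = ||x||^2."""
    return float(np.dot(x, x))


def mirrored_sequential_es(f, x0, sigma0, lam=4, max_evals=2000):
    """(1,lambda)-ES sketch with mirrored sampling and sequential selection.

    Mirrored sampling: offspring come in pairs x + sigma*z and x - sigma*z.
    Sequential selection: offspring are evaluated one by one; the iteration
    stops as soon as one offspring improves on the parent.
    Assumes an even lam; the step-size rule below is an illustrative
    success-based adaptation, not the scheme analyzed in the paper.
    """
    x = np.asarray(x0, dtype=float)
    sigma = float(sigma0)
    fx = f(x)
    evals = 1
    while evals < max_evals:
        # Mirrored candidate directions: z1, -z1, z2, -z2, ...
        directions = []
        for _ in range(lam // 2):
            z = np.random.randn(len(x))
            directions.extend([z, -z])

        best_y, best_fy = None, np.inf
        for z in directions:
            y = x + sigma * z
            fy = f(y)
            evals += 1
            if fy < best_fy:
                best_y, best_fy = y, fy
            if fy < fx:  # sequential selection: stop at the first success
                break

        success = best_fy < fx
        # Comma (non-elitist) selection: the best offspring of the iteration
        # replaces the parent even if it is worse.
        x, fx = best_y, best_fy
        # Illustrative success-based step-size control (assumption).
        sigma *= np.exp(0.2) if success else np.exp(-0.05)
    return x, fx


if __name__ == "__main__":
    xbest, fbest = mirrored_sequential_es(sphere, np.ones(10), sigma0=1.0)
    print(fbest)
```

Because unsuccessful iterations terminate only after all λ offspring are evaluated while successful ones may stop early, sequential selection reduces the expected number of evaluations per iteration, which is where the speed-up over the plain (1,λ)-ES comes from.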
