Simplify Your Covariance Matrix Adaptation Evolution Strategy

The standard covariance matrix adaptation evolution strategy (CMA-ES) maintains two evolution paths, one for adapting the mutation strength and one for the rank-1 update of the covariance matrix. In this paper, it is shown that the algorithm can be approximately transformed in such a manner that one of the evolution paths and the covariance matrix itself disappear. That is, neither the covariance update nor the covariance matrix square root operation is needed in the resulting so-called matrix adaptation (MA) ES. The MA-ES performs nearly as well as the original CMA-ES, as demonstrated by empirical investigations of the evolution dynamics and of the empirical expected runtime on a set of standard test functions. Furthermore, it is shown that the MA-ES can serve as the search engine in a bi-population (BiPop) ES. The resulting BiPop-MA-ES is benchmarked on the BBOB testbed of the COmparing Continuous Optimisers (COCO) framework and compared with the CMA-ES-v3.61 production code. While algorithmically simpler, the new BiPop-MA-ES performs nearly as well as the CMA-ES-v3.61 code.
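
To make the simplification concrete, the following is a minimal sketch of a matrix adaptation ES in Python/NumPy. Instead of a covariance matrix C and its square root, it maintains a transformation matrix M that is updated multiplicatively, together with a single evolution path used for cumulative step-size adaptation. The population size, recombination weights, learning rates, and damping constant below follow common CMA-ES-style defaults and are illustrative assumptions, not necessarily the exact settings used in the paper.

```python
# Minimal MA-ES sketch (illustrative; default constants are assumptions,
# chosen in the style of common CMA-ES heuristics).
import numpy as np

def ma_es(f, x0, sigma0=1.0, max_evals=10000, seed=0):
    rng = np.random.default_rng(seed)
    n = len(x0)
    lam = 4 + int(3 * np.log(n))             # offspring population size
    mu = lam // 2                            # number of selected parents
    w = np.log(mu + 0.5) - np.log(np.arange(1, mu + 1))
    w /= w.sum()                             # positive recombination weights
    mu_eff = 1.0 / np.sum(w ** 2)

    c_sigma = (mu_eff + 2) / (n + mu_eff + 5)
    d_sigma = 1 + 2 * max(0.0, np.sqrt((mu_eff - 1) / (n + 1)) - 1) + c_sigma
    c_1 = 2.0 / ((n + 1.3) ** 2 + mu_eff)
    c_mu = min(1 - c_1, 2 * (mu_eff - 2 + 1 / mu_eff) / ((n + 2) ** 2 + mu_eff))
    chi_n = np.sqrt(n) * (1 - 1 / (4 * n) + 1 / (21 * n ** 2))  # E||N(0,I)||

    m = np.asarray(x0, dtype=float)          # distribution mean
    sigma = sigma0                           # global step size (mutation strength)
    M = np.eye(n)                            # transformation matrix (replaces C and its sqrt)
    p_sigma = np.zeros(n)                    # the single remaining evolution path
    evals = 0

    while evals < max_evals:
        Z = rng.standard_normal((lam, n))    # isotropic samples z_l
        D = Z @ M.T                          # d_l = M z_l
        X = m + sigma * D                    # candidate solutions
        fit = np.array([f(x) for x in X])
        evals += lam
        idx = np.argsort(fit)[:mu]           # select the mu best (minimization)

        z_w = w @ Z[idx]                     # weighted recombination in z-space
        d_w = w @ D[idx]
        m = m + sigma * d_w                  # move the mean

        # cumulative step-size adaptation path (kept in z-space)
        p_sigma = (1 - c_sigma) * p_sigma \
                  + np.sqrt(mu_eff * c_sigma * (2 - c_sigma)) * z_w

        # multiplicative update of M: no covariance update, no matrix square root
        I = np.eye(n)
        rank1 = np.outer(p_sigma, p_sigma) - I
        rankmu = sum(wi * (np.outer(z, z) - I) for wi, z in zip(w, Z[idx]))
        M = M @ (I + 0.5 * c_1 * rank1 + 0.5 * c_mu * rankmu)

        sigma *= np.exp((c_sigma / d_sigma) * (np.linalg.norm(p_sigma) / chi_n - 1))

    return m, f(m)

if __name__ == "__main__":
    # toy usage: 10-D sphere function
    best_x, best_f = ma_es(lambda x: float(np.dot(x, x)), x0=np.ones(10) * 3.0)
    print(best_f)
```

Note that no eigendecomposition or matrix square root appears anywhere: sampling uses M directly, and M is changed only by multiplication with a matrix close to the identity, which is the algorithmic simplification the abstract refers to.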
