A Comparative Study of Large-Scale Variants of CMA-ES

The CMA-ES is one of the most powerful stochastic numerical optimizers for difficult black-box problems. Its intrinsic time and space complexity is quadratic in the problem dimension, which limits its applicability as the dimensionality grows. To circumvent this limitation, different large-scale variants of CMA-ES with subquadratic complexity have been proposed over the past ten years. To date, however, these variants have been tested and compared only in rather restrictive settings, owing to the lack of a comprehensive large-scale testbed to assess their performance. In this context, we introduce a new large-scale testbed with dimensions up to 640, implemented within the COCO benchmarking platform. We use this testbed to assess the performance of several promising variants of CMA-ES and of the standard limited-memory L-BFGS. In all tested dimensions, the best CMA-ES variant solves more problems than L-BFGS for larger budgets, while L-BFGS outperforms the best CMA-ES variant for smaller budgets. Over all functions, however, the cumulative runtime distributions of L-BFGS and the best CMA-ES variants remain close (within a factor of 4 in high dimensions).

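For concreteness, below is a minimal sketch of how such a benchmarking experiment might be set up in Python, assuming the cocoex experimentation module with the bbob-largescale suite, SciPy's L-BFGS-B run in derivative-free mode, and pycma's diagonal ("separable") covariance option as a stand-in for one subquadratic CMA-ES variant; the budget multiplier, result folder name, and solver settings are illustrative and not the exact protocol of the study.

```python
# A minimal sketch of a COCO benchmarking loop on the large-scale suite.
# Assumptions: the "bbob-largescale" suite name in cocoex, SciPy's L-BFGS-B
# with finite-difference gradients as the derivative-free L-BFGS baseline,
# and pycma's diagonal ("separable") covariance option as a stand-in for a
# subquadratic CMA-ES variant. Budgets and settings are illustrative only.
import cocoex                       # COCO experimentation module
import cma                          # pycma: CMA-ES and variants
from scipy.optimize import fmin_l_bfgs_b

SOLVER = "lbfgs"                    # or "sep-cma-es"
BUDGET_MULTIPLIER = 100             # function evaluations per dimension

suite = cocoex.Suite("bbob-largescale", "", "")   # dimensions 20, ..., 640
observer = cocoex.Observer("bbob-largescale",
                           "result_folder: large_scale_sketch")

for problem in suite:
    problem.observe_with(observer)                # log runs for postprocessing
    budget = BUDGET_MULTIPLIER * problem.dimension
    x0 = problem.initial_solution

    if SOLVER == "lbfgs":
        # Derivative-free use of L-BFGS-B: gradients via finite differences.
        fmin_l_bfgs_b(problem, x0, approx_grad=True,
                      bounds=list(zip(problem.lower_bounds,
                                      problem.upper_bounds)),
                      maxfun=budget)
    else:
        # Diagonal (linear time and space) covariance model in pycma.
        cma.fmin2(problem, x0, 2.0,
                  options={"CMA_diagonal": True,
                           "maxfevals": budget,
                           "verbose": -9})

    problem.free()                                # release the C-level problem
```

The logged data can afterwards be postprocessed with the cocopp module to obtain cumulative runtime distributions that are comparable across solvers and dimensions.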