Benchmarking Parameter-Free AMaLGaM on Functions With and Without Noise

We describe a parameter-free estimation-of-distribution algorithm (EDA) for numerical optimization: the adapted maximum-likelihood Gaussian model iterated density-estimation evolutionary algorithm (AMaLGaM-IDEA, or AMaLGaM for short). AMaLGaM is benchmarked within the 2009 black-box optimization benchmarking (BBOB) framework and compared to a variant with incremental model building (iAMaLGaM). We study the implications of factorizing the covariance matrix of the Gaussian distribution so that only a few, or no, covariances are estimated. AMaLGaM and iAMaLGaM are also evaluated on the noisy BBOB problems, where we assess how well evaluating each solution multiple times can average out noise. Experimental evidence suggests that parameter-free AMaLGaM can solve a wide range of problems efficiently, with scaling behavior that appears polynomial in the problem dimension, including multimodal problems. It obtains the best or near-best results among all algorithms tested in 2009 on functions such as the step ellipsoid and Katsuura, but fails to locate the optimum within the time limit on the skew Rastrigin-Bueche (separable) and Lunacek bi-Rastrigin functions in higher dimensions. AMaLGaM is found to be more robust to noise than iAMaLGaM, owing to its larger required population size. Using only a few or no covariances hinders the EDA in dealing with rotations of the search space. Finally, noise averaging is found to be less efficient than applying the EDA directly, unless the noise is uniformly distributed. AMaLGaM was among the best-performing algorithms submitted to the BBOB workshop in 2009.
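To make the core mechanism concrete, the sketch below shows a bare-bones maximum-likelihood Gaussian EDA loop of the kind AMaLGaM builds on: truncation selection followed by re-estimating the Gaussian's mean and covariance from the selected solutions, with optional averaging of repeated evaluations as a noise countermeasure. This is an illustrative sketch only, not the authors' implementation; the objective, function names, and parameter values (sphere, pop_size, tau, evals_per_solution) are assumptions of ours, and AMaLGaM's distinguishing components (adaptive variance scaling, the anticipated mean shift, and the parameter-free population-sizing scheme) are deliberately omitted.

```python
import numpy as np

def sphere(x):
    """Toy objective; the BBOB suite uses far more elaborate functions."""
    return float(np.sum(x ** 2))

def gaussian_eda(f, dim, pop_size=100, tau=0.35, generations=200,
                 evals_per_solution=1, seed=0):
    """Minimal maximum-likelihood Gaussian EDA loop (illustrative sketch).

    Setting evals_per_solution > 1 averages repeated evaluations per
    solution, mirroring the noise-averaging strategy assessed in the paper.
    """
    rng = np.random.default_rng(seed)
    mean = rng.uniform(-5.0, 5.0, dim)   # random initial mean
    cov = np.eye(dim)                    # full covariance matrix
    for _ in range(generations):
        # Sample a new population from the current Gaussian model.
        pop = rng.multivariate_normal(mean, cov, size=pop_size)
        # Average repeated evaluations; only useful when f is noisy.
        fitness = np.array([
            np.mean([f(x) for _ in range(evals_per_solution)])
            for x in pop
        ])
        # Truncation selection: keep the best tau-fraction.
        elite = pop[np.argsort(fitness)[: int(tau * pop_size)]]
        # Maximum-likelihood re-estimation of the Gaussian model.
        # Keeping only np.diag(np.diag(cov)) here would correspond to the
        # "no covariances" factorization studied in the paper.
        mean = elite.mean(axis=0)
        cov = np.cov(elite, rowvar=False) + 1e-10 * np.eye(dim)
    return mean

best = gaussian_eda(sphere, dim=5)
print(best)
```

On a noisy objective, raising evals_per_solution trades extra function evaluations for a smoother selection signal; this is the trade-off the paper finds to pay off only when the noise is uniformly distributed.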
