Combining CMA-ES and MOEA/DD for many-objective optimization

Multi-objective Estimation of Distribution Algorithms (MOEDAs) have been successfully applied to Multi-objective Optimization Problems (MOPs), since they model dependencies between the problem variables and then sample new solutions to guide the search toward promising areas. A state-of-the-art optimizer for single-objective continuous functions that also relies on probabilistic modeling is the Covariance Matrix Adaptation Evolution Strategy (CMA-ES). Several CMA-ES variants have been proposed for MOPs; however, most of them use Pareto dominance as the main selection criterion. Recently, a new multi-objective CMA-ES called MOEA/D-CMA was proposed, combining the strengths of CMA-ES with those of the Multi-objective Evolutionary Algorithm based on Decomposition (MOEA/D). Researchers on MOEAs now broadly agree, however, that combining Pareto dominance and decomposition can be beneficial when searching MOPs. As a result, a new MOEA called MOEA/DD was proposed; it modifies MOEA/D by including a new Pareto-dominance-based update mechanism that brings more diversity into the search. In this study, we extend MOEA/D-CMA by replacing its update mechanism with that of MOEA/DD. The hypothesis is that this update mechanism will improve the performance of MOEA/D-CMA as it improved that of MOEA/D. MOEA/D-CMA and MOEA/DD-CMA are implemented and evaluated in an experimental study involving two well-known families of benchmark problems whose number of objectives scales from two to fifteen. An extensive statistical analysis of the results is then carried out to draw sound, statistically supported conclusions about the performance of the algorithms as the number of objectives grows.
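To make the proposed modification more concrete, the sketch below illustrates, under our own simplifying assumptions, the kind of dominance-plus-decomposition update that MOEA/DD contributes: an offspring (e.g., sampled by a per-subproblem CMA-ES) is inserted into the population, and a survivor is removed from the most crowded subregion of the worst non-domination front rather than by scalarized fitness alone. All function names, the association rule, and the toy data are illustrative assumptions, not the authors' implementation; the actual MOEA/DD update additionally protects isolated subregions and handles several special cases omitted here.

```python
import numpy as np

def tchebycheff(f, w, z):
    """Weighted Tchebycheff scalarization (minimization)."""
    return np.max(w * np.abs(f - z))

def dominates(a, b):
    """True if objective vector a Pareto-dominates b."""
    return np.all(a <= b) and np.any(a < b)

def nondominated_fronts(F):
    """Naive O(n^2) non-dominated sorting; returns a list of index lists."""
    remaining = set(range(len(F)))
    fronts = []
    while remaining:
        front = [i for i in remaining
                 if not any(dominates(F[j], F[i]) for j in remaining if j != i)]
        fronts.append(front)
        remaining -= set(front)
    return fronts

def associate(f, W):
    """Subregion of f: index of the weight vector closest by perpendicular distance."""
    Wn = W / np.linalg.norm(W, axis=1, keepdims=True)
    proj = Wn @ f                                   # scalar projection on each direction
    return int(np.argmin(np.linalg.norm(f - proj[:, None] * Wn, axis=1)))

def moeadd_style_update(F, W, z, new_f):
    """Insert new_f, then discard one solution: pick the most crowded subregion
    within the worst non-domination front and drop its worst Tchebycheff member."""
    F = np.vstack([F, new_f])
    regions = np.array([associate(f, W) for f in F])
    last = nondominated_fronts(F)[-1]               # worst non-domination level
    counts = np.bincount(regions[last], minlength=len(W))
    crowded = int(np.argmax(counts))                # most crowded subregion in that level
    members = [i for i in last if regions[i] == crowded]
    worst = max(members, key=lambda i: tchebycheff(F[i], W[crowded], z))
    return np.delete(F, worst, axis=0)

# Toy usage: four bi-objective solutions, three weight vectors, one CMA-ES offspring.
W = np.array([[1.0, 0.0], [0.5, 0.5], [0.0, 1.0]])
F = np.array([[0.1, 0.9], [0.5, 0.5], [0.9, 0.1], [0.6, 0.6]])
z = F.min(axis=0)                                   # estimated ideal point
print(moeadd_style_update(F, W, z, np.array([0.4, 0.45])))
```

In contrast, the original MOEA/D-CMA update would replace neighboring solutions whenever the offspring improves their scalarized (e.g., Tchebycheff) value; the sketch above shows how dominance and subregion crowding enter the survival decision instead.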
