Standard Error Dynamic Resampling for Preference-based Evolutionary Multi-objective Optimization

In Preference-based Evolutionary Multi-objective Optimization, the decision maker is looking for alternative solutions in a limited area of the objective space. In this area, which is of special interest to the decision maker, the goal is to find a diverse but locally focused non-dominated front. The preference information makes it possible to focus the optimization effort on this most relevant part of the Pareto front and thereby achieve a better optimization result. Simulation-based optimization is often used to model stochastic systems, and a popular method of handling their objective noise is Dynamic Resampling. The given preference information also allows better Dynamic Resampling strategies to be defined, which further improve the optimization result. In our previous work, we proposed hybrid Dynamic Resampling strategies that base their sampling allocation on multiple resampling criteria. These criteria can be, for example, the elapsed optimization time, the Pareto rank of a solution, or its distance to the decision maker's area of interest. In this article, the standard error of the mean objective values of a solution is introduced as a resampling criterion for multi-objective optimization, and several variance-based hybrid Dynamic Resampling algorithms are proposed and compared with the non-variance-based Dynamic Resampling algorithms presented by the authors in [20, 21, 22]. For this purpose, a multi-objective extension of the Standard Error Dynamic Resampling algorithm [11] is proposed. The new variance-based hybrid Dynamic Resampling strategies are evaluated together with the Reference point-guided NSGA-II optimization algorithm (R-NSGA-II) on variants of a multi-objective benchmark function with stochastic objectives. The noise strength of these new multi-objective benchmark problems varies throughout the objective space, a function property called Noise Landscape. The algorithm performance is evaluated on noise landscapes with different types of challenges.
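
To make the standard-error criterion concrete, the sketch below shows one possible reading of a multi-objective Standard Error Dynamic Resampling step: a solution is re-evaluated until the largest per-objective standard error of the mean falls below a threshold or a per-solution budget is exhausted, and a hybrid variant tightens that threshold for solutions close to the decision maker's reference point. This is an illustrative simplification, not the exact allocation scheme of [11] or of our hybrid strategies; the threshold values, the max-over-objectives aggregation, and the names `sedr_resample` and `hybrid_threshold` are assumptions made for the example.

```python
import math
import random
from statistics import mean, stdev

def standard_error(samples):
    """Standard error of the mean for one objective's samples."""
    if len(samples) < 2:
        return float("inf")          # undefined until at least two samples exist
    return stdev(samples) / math.sqrt(len(samples))

def sedr_resample(evaluate, solution, se_threshold=0.05,
                  min_samples=2, max_samples=25):
    """
    Multi-objective SEDR sketch: resample `solution` until the largest
    per-objective standard error of the mean drops below `se_threshold`,
    or until `max_samples` evaluations have been spent.
    `evaluate(solution)` must return one noisy objective vector (a list).
    Returns the mean objective vector and the number of samples used.
    """
    samples = [evaluate(solution) for _ in range(min_samples)]
    while len(samples) < max_samples:
        per_objective = list(zip(*samples))               # group sampled values by objective
        worst_se = max(standard_error(list(obj)) for obj in per_objective)
        if worst_se <= se_threshold:
            break
        samples.append(evaluate(solution))                # spend one more evaluation
    means = [mean(obj) for obj in zip(*samples)]
    return means, len(samples)

def hybrid_threshold(base_threshold, distance_to_ref, max_distance):
    """
    Hybrid criterion sketch: solutions close to the decision maker's
    reference point get a tighter (smaller) standard-error threshold,
    i.e. more samples, while remote solutions are sampled only coarsely.
    """
    closeness = max(0.0, 1.0 - distance_to_ref / max_distance)   # 1.0 = at the reference point
    return base_threshold * (1.0 - 0.8 * closeness)              # up to 5x tighter near the goal

# Toy usage with a noisy two-objective evaluator (illustrative only).
noisy_eval = lambda x: [x + random.gauss(0, 0.1), 1 - x + random.gauss(0, 0.1)]
objective_means, samples_used = sedr_resample(noisy_eval, 0.3)
```

In a full hybrid strategy, a scaling such as `hybrid_threshold` would be combined with the other resampling criteria mentioned above, for example the elapsed optimization time or the Pareto rank, so that sampling effort is concentrated on solutions that are both noisy and close to the decision maker's area of interest.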

[1] R. Lyndon While et al. Applying evolutionary algorithms to problems with noisy, time-consuming fitness functions. Proceedings of the 2004 Congress on Evolutionary Computation, 2004.

[2] Thomas Bartz-Beielstein et al. Particle Swarm Optimization and Sequential Sampling in Noisy Environments. Metaheuristics, 2007.

[3] Jonathan E. Fieldsend et al. The Rolling Tide Evolutionary Algorithm: A Multiobjective Optimizer for Noisy Optimization Problems. IEEE Transactions on Evolutionary Computation, 2015.

[4] Xiaodong Li et al. A new performance metric for user-preference based multi-objective evolutionary algorithms. 2013 IEEE Congress on Evolutionary Computation, 2013.

[5] Kalyanmoy Deb et al. Dynamic Resampling for Preference-based Evolutionary Multi-Objective Optimization of Stochastic Systems. MCDM, 2015.

[6] Kaisa Miettinen et al. Nonlinear Multiobjective Optimization. International Series in Operations Research and Management Science, 1998.

[7] Kay Chen Tan et al. Handling Uncertainties in Evolutionary Multi-Objective Optimization. WCCI, 2008.

[8] Anthony Di Pietro. Optimising evolutionary strategies for problems with varying noise strength. 2007.

[9] Kalyanmoy Deb et al. An Interactive Evolutionary Multiobjective Optimization Method Based on Progressively Approximated Value Functions. IEEE Transactions on Evolutionary Computation, 2010.

[10] Kalyanmoy Deb et al. Reference point based multi-objective optimization using evolutionary algorithms. GECCO, 2006.

[11] Kalyanmoy Deb et al. A comparative study of dynamic resampling strategies for guided Evolutionary Multi-objective Optimization. 2013 IEEE Congress on Evolutionary Computation, 2013.

[12] Jürgen Branke et al. Sequential Sampling in Noisy Environments. PPSN, 2004.

[13] Robert Ivor John et al. Evolutionary optimisation of noisy multi-objective problems using confidence-based dynamic resampling. European Journal of Operational Research, 2010.

[14] Kalyanmoy Deb et al. Hybrid Dynamic Resampling for Guided Evolutionary Multi-Objective Optimization. EMO, 2015.

[15] Loo Hay Lee et al. Stochastic Simulation Optimization: An Optimal Computing Budget Allocation. System Engineering and Operations Research, 2010.

[16] Kalyanmoy Deb et al. A fast and elitist multiobjective genetic algorithm: NSGA-II. IEEE Transactions on Evolutionary Computation, 2002.

[17] Loo Hay Lee et al. Efficient Simulation Budget Allocation for Selecting an Optimal Subset. INFORMS Journal on Computing, 2008.

[18] L. Lee et al. Design sampling and replication assignment under fixed computing budget. 2005.

[19] L. Lee et al. Finding the non-dominated Pareto set for multi-objective simulation models. 2010.

[20] Kwang Ryel Ryu et al. Accumulative sampling for noisy evolutionary multi-objective optimization. GECCO, 2011.

[21] Kalyanmoy Deb et al. R-HV: A Metric for Computing Hyper-volume for Reference Point Based EMOs. SEMCCO, 2014.

[22] Loo Hay Lee et al. Multi-objective simulation-based evolutionary algorithm for an aircraft spare parts allocation problem. European Journal of Operational Research, 2008.

[23] Carlos A. Coello Coello et al. A Study of the Parallelization of a Coevolutionary Multi-objective Evolutionary Algorithm. MICAI, 2004.

[24] Lothar Thiele et al. Comparison of Multiobjective Evolutionary Algorithms: Empirical Results. Evolutionary Computation, 2000.

[25] Timothy W. Simpson et al. Visual Steering Commands for Trade Space Exploration: User-Guided Sampling With Example. Journal of Computing and Information Science in Engineering, 2009.

[26] Kay Chen Tan et al. Evolutionary Multi-objective Optimization in Uncertain Environments: Issues and Algorithms. Studies in Computational Intelligence, 2009.

[27] A. Tsoularis et al. Analysis of logistic growth models. Mathematical Biosciences, 2002.