Surrogate-based Multi-Objective Particle Swarm Optimization

This paper presents a new algorithm that approximates the real (expensive) objective function evaluations through supervised learning, using support vector machines (SVMs) as the surrogate model. We perform a comparative study among different leader selection schemes in a multi-objective particle swarm optimizer (MOPSO), in order to determine the most appropriate approach for the class of problems of interest to us. The resulting hybrid exhibits a poor spread of solutions, which motivates the introduction of a second phase in our algorithm, in which rough sets are adopted to improve the spread of solutions along the Pareto front. Rough sets act as a local search engine that generates solutions in the neighborhood of the nondominated solutions previously found by the surrogate-based algorithm. The resulting approach is able to produce reasonably good approximations of the Pareto front of problems with up to 30 decision variables using only 2,000 fitness function evaluations. Our results are compared with respect to NSGA-II, a multi-objective evolutionary algorithm representative of the state of the art in the area.
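To make the surrogate idea concrete, the sketch below (not the paper's implementation) trains one support vector regression model per objective on an archive of truly evaluated solutions and uses the predictions to pre-screen candidate particles, so that only the most promising ones consume expensive true evaluations. The ZDT1 test function, the scikit-learn SVR parameters, and the simple sum-of-normalized-predictions ranking are all illustrative assumptions; in the paper, the MOPSO would supply the candidate positions and a Pareto-based archive would replace the placeholder ranking.

```python
# Minimal sketch (assumptions noted above): SVR surrogates pre-screen candidates
# so only a few of them are evaluated with the true, expensive objectives.
import numpy as np
from sklearn.svm import SVR


def zdt1(x):
    """Illustrative bi-objective test problem (ZDT1), to be minimized."""
    f1 = x[0]
    g = 1.0 + 9.0 * np.mean(x[1:])
    f2 = g * (1.0 - np.sqrt(f1 / g))
    return np.array([f1, f2])


def fit_surrogates(X, F):
    """Fit one SVR model per objective on the archive of true evaluations."""
    return [SVR(kernel="rbf", C=10.0, epsilon=0.01).fit(X, F[:, m])
            for m in range(F.shape[1])]


def predict_objectives(models, X):
    """Predict all objectives for a batch of candidate solutions."""
    return np.column_stack([mod.predict(X) for mod in models])


def prescreen(models, candidates, n_true_evals):
    """Keep the candidates with the best predicted objectives.

    The ranking below (sum of normalized predicted objectives) is only a
    placeholder for a proper Pareto-based selection.
    """
    pred = predict_objectives(models, candidates)
    span = pred.max(axis=0) - pred.min(axis=0) + 1e-12
    pred = (pred - pred.min(axis=0)) / span
    order = np.argsort(pred.sum(axis=1))
    return candidates[order[:n_true_evals]]


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n_var, n_init, budget = 30, 50, 200

    # Initial design: random sample evaluated with the true objectives.
    X = rng.random((n_init, n_var))
    F = np.array([zdt1(x) for x in X])
    evals = n_init

    while evals < budget:
        models = fit_surrogates(X, F)
        # A swarm update would normally generate these candidates; random here.
        candidates = rng.random((100, n_var))
        chosen = prescreen(models, candidates, n_true_evals=10)
        F_new = np.array([zdt1(x) for x in chosen])
        X, F = np.vstack([X, chosen]), np.vstack([F, F_new])
        evals += len(chosen)

    print(f"True evaluations used: {evals}")
```

The point of the sketch is the evaluation budget: per iteration, 100 candidate positions cost only 10 true function calls, with the surrogates absorbing the rest, which is how the paper keeps the total at 2,000 evaluations for 30-variable problems.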
