A Proposal to Hybridize Multi-Objective Evolutionary Algorithms with Non-gradient Mathematical Programming Techniques

The hybridization of multi-objective evolutionary algorithms (MOEAs) with mathematical programming techniques has gained increasing popularity in the specialized literature in recent years. However, such hybrids typically rely on gradients and therefore consume a high number of extra objective function evaluations to estimate the required gradient information. The use of direct search (derivative-free) optimization techniques has been less common in this context, although several hybrids of this sort have been proposed for single-objective evolutionary algorithms. This paper proposes a hybrid between a well-known MOEA (NSGA-II) and two direct search methods (the Nelder-Mead method and golden section search). The aim of the proposed approach is to combine the global search mechanisms of the evolutionary algorithm with the local search mechanisms provided by these mathematical programming techniques, so as to produce a more efficient approach, i.e., one that requires fewer objective function evaluations.

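To make the general idea concrete, the following Python sketch shows how a golden-section line search could serve as a derivative-free local refinement operator acting on a weighted-sum scalarization of the objectives. The function names, weight vector, search interval, and the way such an operator would be coupled to NSGA-II's generation loop are illustrative assumptions for this sketch, not the paper's actual implementation.

```python
import numpy as np

# Illustrative sketch only: golden-section search as a gradient-free local
# refinement step on a weighted-sum scalarization of several objectives.
# How (and how often) this operator is applied inside NSGA-II is an
# assumption here, not taken from the paper.

GOLDEN = (np.sqrt(5.0) - 1.0) / 2.0  # golden ratio conjugate, ~0.618


def golden_section(f, a, b, tol=1e-5):
    """Minimize a (assumed unimodal) 1-D function f on [a, b] without gradients."""
    c = b - GOLDEN * (b - a)
    d = a + GOLDEN * (b - a)
    while abs(b - a) > tol:
        if f(c) < f(d):
            b = d      # minimum lies in [a, d]
        else:
            a = c      # minimum lies in [c, b]
        c = b - GOLDEN * (b - a)
        d = a + GOLDEN * (b - a)
    return 0.5 * (a + b)


def local_refine(x, objectives, weights, direction, step=0.5):
    """Refine solution x along a given direction by minimizing a
    weighted sum of the objectives with golden-section search."""
    def scalarized(t):
        y = x + t * direction
        return sum(w * f(y) for w, f in zip(weights, objectives))
    t_best = golden_section(scalarized, -step, step)
    return x + t_best * direction


if __name__ == "__main__":
    # Toy bi-objective problem used only to exercise the operator.
    f1 = lambda x: float(np.sum(x ** 2))
    f2 = lambda x: float(np.sum((x - 1.0) ** 2))
    x0 = np.array([0.8, -0.3])
    x1 = local_refine(x0, [f1, f2], weights=[0.5, 0.5],
                      direction=np.array([1.0, 0.0]))
    print(x0, "->", x1)
```

In a hybrid of the kind described above, such a refinement step would typically be applied to a few selected nondominated individuals every several generations, so that the extra function evaluations spent on local search remain a small fraction of the overall budget.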