Constrained many-objective optimization: A way forward

Many-objective optimization is a natural extension of multi-objective optimization in which the number of objectives is significantly more than five. The performance of current state-of-the-art algorithms (e.g., NSGA-II, SPEA2) is known to deteriorate significantly as the number of objectives grows, owing to the lack of adequate convergence pressure. It is therefore not surprising that the performance of NSGA-II on some constrained many-objective optimization problems [7] (e.g., DTLZ5-(5,M), M = 10, 20) in an earlier study [18] was far from satisfactory. To date, research in many-objective optimization has focused on two major areas: (a) dimensionality reduction in the objective space and (b) preference-ordering-based approaches. This paper introduces a novel evolutionary algorithm that couples epsilon dominance (implemented within the framework of NSGA-II) with controlled infeasibility for improved convergence, while the critical set of objectives is identified through a nonlinear dimensionality reduction scheme. Since approaching the Pareto-optimal front from within the feasible search space requires overcoming the problems associated with low selection pressure, the mechanism of approaching the front from within the infeasible search space is promising, as illustrated in this paper. The performance of the proposed algorithm is compared with NSGA-II (original, with the crowding distance measure) and NSGA-II (epsilon dominance) on the above set of constrained many-objective problems to highlight its benefits.
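As a loose illustration of the epsilon-dominance relation mentioned above (the additive form described by Laumanns et al. [13]), the following Python sketch compares two objective vectors under a minimization convention. The function name, the additive formulation, and the numerical example are assumptions made for illustration only; they are not the authors' implementation.

```python
import numpy as np

def eps_dominates(f_a, f_b, eps):
    """Illustrative additive epsilon-dominance test (minimization assumed):
    f_a epsilon-dominates f_b if, after relaxing f_a by eps, it is no worse
    than f_b in every objective and strictly better in at least one."""
    f_a = np.asarray(f_a, dtype=float)
    f_b = np.asarray(f_b, dtype=float)
    return bool(np.all(f_a - eps <= f_b) and np.any(f_a - eps < f_b))

# Hypothetical example: with eps = 0.1 the first vector eps-dominates the
# second even though it is marginally worse in the last objective.
print(eps_dominates([1.0, 2.0, 3.05], [1.5, 2.0, 3.0], eps=0.1))  # True
```

Relaxing the dominance comparison in this way coarsens the non-dominated set, which is one way to restore selection pressure when almost all solutions become mutually non-dominated in high-dimensional objective spaces.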

[1] Tapabrata Ray, et al. Infeasibility Driven Evolutionary Algorithm (IDEA) for Engineering Design Optimization, 2008, Australasian Conference on Artificial Intelligence.

[2] Eckart Zitzler, et al. Improving hypervolume-based multiobjective evolutionary algorithms by using objective reduction methods, 2007, 2007 IEEE Congress on Evolutionary Computation.

[3] Eckart Zitzler, et al. On Objective Conflicts and Objective Reduction in Multiple Criteria Optimization, 2006.

[4] Kilian Q. Weinberger, et al. Learning a kernel matrix for nonlinear dimensionality reduction, 2004, ICML.

[5] Marco Laumanns, et al. Scalable Test Problems for Evolutionary Multiobjective Optimization, 2005, Evolutionary Multiobjective Optimization.

[6] H. Kita, et al. Failure of Pareto-based MOEAs: does non-dominated really mean near to optimal?, 2001, Proceedings of the 2001 Congress on Evolutionary Computation (IEEE Cat. No.01TH8546).

[7] Kilian Q. Weinberger, et al. Unsupervised Learning of Image Manifolds by Semidefinite Programming, 2004, CVPR.

[8] Kilian Q. Weinberger, et al. An Introduction to Nonlinear Dimensionality Reduction by Maximum Variance Unfolding, 2006, AAAI.

[9] Peter J. Bentley, et al. Finding Acceptable Solutions in the Pareto-Optimal Range using Multiobjective Genetic Algorithms, 1998.

[10] Kilian Q. Weinberger, et al. Spectral Methods for Dimensionality Reduction, 2006, Semi-Supervised Learning.

[11] Qingfu Zhang, et al. MOEA/D: A Multiobjective Evolutionary Algorithm Based on Decomposition, 2007, IEEE Transactions on Evolutionary Computation.

[12] Kalyanmoy Deb, et al. Dimensionality reduction of objectives and constraints in multi-objective optimization problems: A system design perspective, 2008, 2008 IEEE Congress on Evolutionary Computation (IEEE World Congress on Computational Intelligence).

[13] Marco Laumanns, et al. Combining Convergence and Diversity in Evolutionary Multiobjective Optimization, 2002, Evolutionary Computation.

[14] Eckart Zitzler, et al. Are All Objectives Necessary? On Dimensionality Reduction in Evolutionary Multiobjective Optimization, 2006, PPSN.

[15] Eckart Zitzler, et al. Indicator-Based Selection in Multiobjective Search, 2004, PPSN.

[16] Tapabrata Ray, et al. Infeasibility Driven Evolutionary Algorithm for Constrained Optimization, 2009.

[17] R. K. Ursem. Multi-objective Optimization using Evolutionary Algorithms, 2009.

[18] Kalyanmoy Deb, et al. Non-linear Dimensionality Reduction Procedures for Certain Large-Dimensional Multi-objective Optimization Problems: Employing Correntropy and a Novel Maximum Variance Unfolding, 2007, EMO.

[19] David W. Corne, et al. Techniques for highly multiobjective optimisation: some nondominated points are better than others, 2007, GECCO '07.

[20] Kiyoshi Tanaka, et al. Controlling Dominance Area of Solutions and Its Impact on the Performance of MOEAs, 2007, EMO.

[21] Rolf Drechsler, et al. Multi-objective Optimisation Based on Relation Favour, 2001, EMO.

[22] Kalyanmoy Deb, et al. Trading on infeasibility by exploiting constraint's criticality through multi-objectivization: A system design perspective, 2007, 2007 IEEE Congress on Evolutionary Computation.

[23] Tapabrata Ray, et al. A Study on the Performance of Substitute Distance Based Approaches for Evolutionary Many Objective Optimization, 2008, SEAL.

[24] Hisao Ishibuchi, et al. Comparison between Single-Objective and Multi-Objective Genetic Algorithms: Performance Comparison and Performance Measures, 2006, 2006 IEEE International Conference on Evolutionary Computation.

[25] Tapabrata Ray, et al. Blessings of maintaining infeasible solutions for constrained multi-objective optimization problems, 2008, 2008 IEEE Congress on Evolutionary Computation (IEEE World Congress on Computational Intelligence).

[26] Mario Köppen, et al. Substitute Distance Assignments in NSGA-II for Handling Many-objective Optimization Problems, 2007, EMO.

[27] Hisao Ishibuchi, et al. Incorporation of Scalarizing Fitness Functions into Evolutionary Multiobjective Optimization Algorithms, 2006, PPSN.

[28] Kiyoshi Tanaka, et al. Working principles, behavior, and performance of MOEAs on MNK-landscapes, 2007, Eur. J. Oper. Res.

[29] José Carlos Príncipe, et al. Nonlinear Component Analysis Based on Correntropy, 2006, The 2006 IEEE International Joint Conference on Neural Network Proceedings.