Reducing Local Optima in Single-Objective Problems by Multi-objectivization

One common characterization of how simple hill-climbing optimization methods can fail is that they become trapped in local optima: states in which no small modification of the current best solution produces a better solution. This measure of 'better' depends on the performance of the solution with respect to the single objective being optimized. In contrast, multi-objective optimization (MOO) involves the simultaneous optimization of a number of objectives. Accordingly, the multi-objective notion of 'better' permits consideration of solutions that may be superior in one objective but not in another. Intuitively, we may say that this gives a hill-climber in multi-objective space more freedom to explore and makes it less likely to become trapped. In this paper, we investigate this intuition by comparing the performance of simple hill-climber-style algorithms on single-objective problems and on multi-objective versions of those same problems. Using an abstract building-block problem, we illustrate how 'multi-objectivizing' a single-objective optimization (SOO) problem can remove local optima. We then investigate small instances of the travelling salesman problem where additional objectives are defined using arbitrary sub-tours. Results indicate that multi-objectivization can reduce local optima and facilitate improved optimization in some cases. These results enlighten our intuitions about the nature of search in multi-objective optimization and about sources of difficulty in single-objective optimization.
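The following is a minimal sketch, not the paper's exact algorithm, of the idea described above: a small TSP instance whose single objective (total tour length) is decomposed into two sub-tour lengths split at two arbitrarily chosen cities, with a Pareto-acceptance hill-climber compared against a strictly improving single-objective hill-climber. The 2-opt neighbourhood, the split at cities 0 and n/2, and names such as `pareto_hillclimb` are illustrative assumptions, not details taken from the paper.

```python
# Sketch of multi-objectivization on a random TSP instance (illustrative only).
import math
import random


def make_cities(n, seed=0):
    rng = random.Random(seed)
    return [(rng.random(), rng.random()) for _ in range(n)]


def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])


def tour_length(tour, cities):
    return sum(dist(cities[tour[i]], cities[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))


def sub_objectives(tour, cities, a=0, b=None):
    """Split the closed tour at cities a and b and return the lengths of the
    two resulting sub-tours as separate objectives (both to be minimized).
    The choice of split cities here is an arbitrary assumption."""
    if b is None:
        b = len(tour) // 2
    lo, hi = sorted((tour.index(a), tour.index(b)))
    seg1 = tour[lo:hi + 1]             # path between the two split cities
    seg2 = tour[hi:] + tour[:lo + 1]   # remaining path closing the cycle
    path_len = lambda seg: sum(dist(cities[seg[i]], cities[seg[i + 1]])
                               for i in range(len(seg) - 1))
    return path_len(seg1), path_len(seg2)


def two_opt_neighbour(tour, rng):
    i, j = sorted(rng.sample(range(len(tour)), 2))
    return tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]


def dominates(f, g):
    """True if objective vector f Pareto-dominates g (minimization)."""
    return all(x <= y for x, y in zip(f, g)) and any(x < y for x, y in zip(f, g))


def single_objective_hillclimb(cities, steps, seed=1):
    rng = random.Random(seed)
    tour = list(range(len(cities)))
    rng.shuffle(tour)
    best = tour_length(tour, cities)
    for _ in range(steps):
        cand = two_opt_neighbour(tour, rng)
        c = tour_length(cand, cities)
        if c < best:                 # accept strictly improving moves only
            tour, best = cand, c
    return best


def pareto_hillclimb(cities, steps, seed=1):
    rng = random.Random(seed)
    tour = list(range(len(cities)))
    rng.shuffle(tour)
    f = sub_objectives(tour, cities)
    for _ in range(steps):
        cand = two_opt_neighbour(tour, rng)
        g = sub_objectives(cand, cities)
        if not dominates(f, g):      # accept any move the current point does not dominate
            tour, f = cand, g
    return tour_length(tour, cities)


if __name__ == "__main__":
    cities = make_cities(20)
    print("single-objective hill-climber:", round(single_objective_hillclimb(cities, 20000), 3))
    print("Pareto-acceptance hill-climber:", round(pareto_hillclimb(cities, 20000), 3))
```

The key difference is the acceptance rule: the Pareto climber may move to a neighbour that lengthens one sub-tour while shortening the other, which can let it escape configurations where the single-objective climber is stuck. The paper's actual experiments use more careful algorithms and problem decompositions; this sketch only illustrates the acceptance-rule contrast.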
