Multi-objective optimization with diversity preserving mixture-based iterated density estimation evolutionary algorithms

Stochastic optimization by learning and using probabilistic models has received increasing attention over the last few years. Algorithms in this field estimate the probability distribution of a set of selected solutions and then draw new samples from the estimated distribution. The resulting algorithms have shown good performance on a wide variety of single-objective optimization problems, for both binary and real-valued variables. Mixture distributions offer a powerful tool for modeling complicated dependencies between the problem variables. Moreover, they allow for an elegant, parallel exploration of a multi-objective front. This parallel exploration aids the preservation of diversity, which is essential in multi-objective optimization. In this paper, we propose a new algorithm for evolutionary multi-objective optimization by learning and using probabilistic mixture distributions, which we name the Multi-objective Mixture-based Iterated Density Estimation Evolutionary Algorithm (MIDEA). To further improve and maintain the diversity obtained by the mixture distribution, we use a specialized diversity preserving selection operator. We verify the effectiveness of our approach in two different problem domains and compare it with two other well-known efficient multi-objective evolutionary algorithms.
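The estimate-and-sample cycle described above can be illustrated with a minimal sketch. This is not MIDEA itself: it uses a single factorized Gaussian on a single objective rather than the paper's mixture distributions and diversity preserving selection, and the function names and parameters are illustrative assumptions.

```python
import random
import statistics

def idea_minimize(f, dim, pop_size=100, select_frac=0.3, generations=60, seed=0):
    """Minimal iterated density estimation loop (sketch).

    Each generation: select the best solutions, estimate an independent
    Gaussian per variable from them, and resample the population from
    that estimated distribution."""
    rng = random.Random(seed)
    # Start from a uniform population over [-5, 5]^dim.
    pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=f)
        selected = pop[: max(2, int(select_frac * pop_size))]
        # Estimate a univariate normal per variable from the selection.
        mus = [statistics.mean(ind[i] for ind in selected) for i in range(dim)]
        sds = [statistics.stdev([ind[i] for ind in selected]) + 1e-12
               for i in range(dim)]
        # Resample from the estimated distribution, keeping the best (elitism).
        pop = [pop[0]] + [[rng.gauss(mus[i], sds[i]) for i in range(dim)]
                          for _ in range(pop_size - 1)]
    return min(pop, key=f)

# Usage: minimize the sphere function; the optimum lies at the origin.
best = idea_minimize(lambda x: sum(v * v for v in x), dim=3)
```

The per-variable Gaussian is the simplest continuous model one can estimate; the mixture models in the paper replace it with several such components, each of which can track a different part of the Pareto front.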
