Using Previous Models to Bias Structural Learning in the Hierarchical BOA

Estimation of distribution algorithms (EDAs) are stochastic optimization techniques that explore the space of potential solutions by building and sampling probabilistic models of promising candidate solutions. While the primary goal of applying EDAs is to discover the global optimum (or an accurate approximation of it), every EDA run also produces a sequence of probabilistic models that holds a great deal of information about the problem. Although problem-specific knowledge has been shown to significantly improve the performance of EDAs and other evolutionary algorithms, this readily available source of information has been largely ignored by the EDA community. This paper takes a first step toward using the probabilistic models obtained by EDAs to speed up the solution of similar problems in the future. More specifically, we propose two approaches to biasing model building in the hierarchical Bayesian optimization algorithm (hBOA) based on knowledge automatically learned from previous runs on similar problems. We show that both methods lead to substantial speedups and argue that they should work well in other applications that require solving many problems of similar structure.
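As a purely illustrative sketch (not the paper's exact method), the snippet below shows one plausible way such a bias could be realized: collect directed-edge frequencies from the Bayesian network structures learned in previous hBOA runs, then add a structural log-prior to the scoring metric used during greedy network construction so that edges seen often before are favored. The function names, the `kappa` strength constant, and the specific form of the prior are assumptions made for this example.

```python
# Hypothetical sketch: bias Bayesian network structure learning in hBOA-style
# model building using edge frequencies from models built in previous runs.
from collections import Counter
from math import log
from typing import Iterable, Tuple

Edge = Tuple[int, int]  # directed edge (parent, child) over string positions


def edge_frequencies(previous_models: Iterable[Iterable[Edge]]) -> Counter:
    """Count how often each directed edge appeared across previous-run models,
    returning relative frequencies in [0, 1]."""
    counts: Counter = Counter()
    n_models = 0
    for model in previous_models:
        n_models += 1
        for edge in set(model):
            counts[edge] += 1
    return Counter({e: c / n_models for e, c in counts.items()}) if n_models else counts


def structure_log_prior(candidate_edges: Iterable[Edge],
                        freq: Counter,
                        kappa: float = 4.0) -> float:
    """Hypothetical log-prior that rewards edges that recurred in previous runs
    and penalizes edges that were rare or never seen."""
    log_prior = 0.0
    for edge in candidate_edges:
        # Frequency near 1 adds a bonus of about log(kappa); near 0 a penalty.
        log_prior += log(kappa) * (2.0 * freq.get(edge, 0.0) - 1.0)
    return log_prior


if __name__ == "__main__":
    # Two toy "previous" models over a 4-bit problem; edge (0, 1) recurs in both.
    previous = [[(0, 1), (2, 3)], [(0, 1), (1, 2)]]
    freq = edge_frequencies(previous)
    # During greedy network construction, this term would be added to the
    # Bayesian-Dirichlet (or similar) score of each candidate structure.
    print(structure_log_prior([(0, 1)], freq))  # bonus: edge recurred
    print(structure_log_prior([(3, 0)], freq))  # penalty: edge never seen
```

In practice, a bias of this kind only pays off when the new problem instance shares dependency structure with the previously solved ones, which is exactly the setting the paper targets.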
