Analyzing probabilistic models in hierarchical BOA on traps and spin glasses

The hierarchical Bayesian optimization algorithm (hBOA) can solve nearly decomposable and hierarchical problems of bounded difficulty in a robust and scalable manner by building and sampling probabilistic models of promising solutions. This paper analyzes the probabilistic models built by hBOA on two common test problems: concatenated traps and 2D Ising spin glasses with periodic boundary conditions. We argue that although Bayesian networks with local structures can encode complex probability distributions, analyzing these models in hBOA is relatively straightforward, and the results of such analyses may provide practitioners with useful information about their problems. The results show that the probabilistic models in hBOA closely correspond to the structure of the underlying optimization problem, that the models do not change significantly in subsequent iterations of hBOA, and that creating adequate probabilistic models by hand is not straightforward even with complete knowledge of the optimization problem.
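
For concreteness, the sketch below gives minimal Python implementations of the two test problems, under common assumptions not spelled out in the abstract: the order-k trap is defined on the number of ones u in each disjoint block as f(u) = k if u = k and k - 1 - u otherwise, and the spin glass uses random +/-1 couplings on an L x L torus. All function and variable names here are illustrative, not taken from the paper.

```python
import numpy as np

def trap_k(u, k):
    """Deceptive trap of order k on the number of ones u in a block.

    The global optimum is the all-ones block (value k); every other
    unitation count points local search toward all zeros (value k-1).
    """
    return k if u == k else k - 1 - u

def concatenated_traps(bits, k=5):
    """Fitness of a binary string under concatenated traps of order k.

    The string is split into disjoint, non-overlapping blocks of k bits
    (its length must be a multiple of k) and the trap values are summed.
    """
    bits = np.asarray(bits)
    assert bits.size % k == 0
    blocks = bits.reshape(-1, k)
    return sum(trap_k(int(b.sum()), k) for b in blocks)

def ising_energy(spins, Jh, Jv):
    """Energy of a 2D +/-J Ising spin glass with periodic boundaries.

    spins is an L x L array of +/-1; Jh[i, j] couples spins[i, j] to its
    right neighbor and Jv[i, j] to its lower neighbor, both wrapping
    around the lattice (torus). Lower energy is better; the optimization
    task is to find ground states minimizing this sum.
    """
    s = np.asarray(spins)
    right = np.roll(s, -1, axis=1)
    down = np.roll(s, -1, axis=0)
    return -np.sum(Jh * s * right) - np.sum(Jv * s * down)

# Illustrative usage on random instances.
rng = np.random.default_rng(0)
x = rng.integers(0, 2, size=20)            # 20-bit string, four trap-5 blocks
print(concatenated_traps(x, k=5))

L = 4
Jh = rng.choice([-1, 1], size=(L, L))      # random +/-1 horizontal couplings
Jv = rng.choice([-1, 1], size=(L, L))      # random +/-1 vertical couplings
s0 = rng.choice([-1, 1], size=(L, L))      # random spin configuration
print(ising_energy(s0, Jh, Jv))
```

Both problems make the intended comparison easy to state: the trap blocks and the nearest-neighbor lattice couplings define the problem structure against which the dependencies captured by hBOA's Bayesian networks can be checked.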
