A Bayesian framework for evolutionary computation

A Bayesian framework for evolutionary computation is presented. Given a data set for fitness evaluation, the best (fittest) individual is defined as the most probable model of the data with respect to prior knowledge of the problem domain. In each generation, Bayes' theorem is used to estimate the posterior fitness of individuals from their prior fitness values. Offspring are then generated by sampling from the posterior distribution combined with the transition probabilities induced by the variation operators. These inference steps, from the prior distribution, through the posterior distribution of parent fitness, to the expected fitness distribution of the offspring, are the essential elements of Bayesian evolutionary computation. One of the most interesting aspects of Bayesian evolution is that it provides principled techniques for controlling evolutionary dynamics. We describe two applications of the framework. The first is a Bayesian evolutionary algorithm (BEA) designed to evolve parsimonious individuals in evolutionary computation with variable-size representations; we show that the adaptive Occam method for program-growth control is a special form of Bayesian evolution. The second is an evolutionary algorithm with incremental data inheritance (IDI), in which the fitness of individuals is estimated on incrementally chosen data subsets rather than on the whole data set, so that convergence is accelerated by reducing the effective number of fitness evaluations. Experimental results demonstrate the effectiveness of both BEAs.
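To make the generation cycle concrete, the following is a minimal sketch, not the authors' implementation, of one BEA generation under illustrative assumptions: bit-string individuals, a parsimony prior, a data-agreement likelihood, bit-flip mutation as the variation operator, and a growing data subset standing in for incremental data inheritance. All names and the toy problem are hypothetical.

```python
# Minimal sketch of a Bayesian evolutionary algorithm (BEA) generation cycle.
# Assumptions (illustrative, not from the paper): bit-string individuals,
# parsimony prior, bit-flip mutation, and an incrementally growing data subset.
import numpy as np

rng = np.random.default_rng(0)

TARGET = rng.integers(0, 2, size=20)          # hidden "true model" generating the data
DATA = [(i, TARGET[i]) for i in range(20)]    # toy data set for fitness evaluation


def log_prior(ind):
    """Prior fitness: prefer parsimonious individuals (fewer set bits)."""
    return -0.1 * ind.sum()


def log_likelihood(ind, data):
    """Likelihood of the data under the individual: agreement with the targets."""
    errors = sum(ind[i] != y for i, y in data)
    return -1.0 * errors


def posterior_weights(pop, data):
    """Bayes' theorem: posterior fitness is proportional to likelihood * prior."""
    logp = np.array([log_likelihood(ind, data) + log_prior(ind) for ind in pop])
    logp -= logp.max()                         # shift for numerical stability
    w = np.exp(logp)
    return w / w.sum()


def mutate(ind, rate=0.05):
    """Variation operator: defines the transition probability from parent to offspring."""
    flips = rng.random(ind.size) < rate
    return np.where(flips, 1 - ind, ind)


def next_generation(pop, data):
    """Sample parents in proportion to posterior fitness, then apply variation."""
    w = posterior_weights(pop, data)
    parents = rng.choice(len(pop), size=len(pop), p=w)
    return [mutate(pop[i]) for i in parents]


# Incremental data inheritance (IDI), sketched: evaluate each generation on a
# growing data subset instead of the whole data set to cut evaluation cost.
pop = [rng.integers(0, 2, size=20) for _ in range(30)]
for gen in range(1, 31):
    subset = DATA[: min(len(DATA), 2 * gen)]   # incrementally enlarge the data subset
    pop = next_generation(pop, subset)

best = max(pop, key=lambda ind: log_likelihood(ind, DATA) + log_prior(ind))
print("best individual errors on full data:", -log_likelihood(best, DATA))
```

The posterior-weighted parent sampling followed by mutation corresponds to the abstract's "sampling from the posterior distribution combined with the transition probabilities formed by variation operators"; the parsimony term in the prior is where an Occam-style growth penalty would enter.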
