We propose to apply the Boltzmann machine (BM) to population-based incremental learning (PBIL). We replace the statistical model used in PBIL, which assumes that the binary variables of the optimisation problem are independent, with that of a BM. From the logarithm of the expectation of the function to maximise, we derive specific learning rules for the BM. These learning rules involve expectations with respect to the distribution of the BM and with respect to its selected version, in the spirit of PBIL or genetic algorithms. New populations are sampled from the BM using traditional Gibbs sampling. The proposed BM-PBIL algorithm alternates Gibbs sampling, selection, and update of the parameters. We evaluate BM-PBIL on different classes of functions, compare it to the original PBIL, and identify classes on which it is superior, in particular functions with jumps and quadratic functions.
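The alternation described above (Gibbs sampling from the BM, selection of the fittest samples, then a parameter update) can be sketched as follows. This is only an illustrative reading of the abstract, not the authors' implementation: the update rule shown is a generic moment-matching step (selected-population statistics minus whole-population statistics), and all names, step sizes, and population sizes are assumptions.

```python
import math
import random

def gibbs_sample(W, b, n_sweeps=5):
    """Draw one binary vector from a Boltzmann machine with couplings W
    and biases b via systematic-scan Gibbs sampling."""
    n = len(b)
    x = [random.randint(0, 1) for _ in range(n)]
    for _ in range(n_sweeps):
        for i in range(n):
            # Conditional activation of bit i given the other bits.
            a = b[i] + sum(W[i][j] * x[j] for j in range(n) if j != i)
            p = 1.0 / (1.0 + math.exp(-a))
            x[i] = 1 if random.random() < p else 0
    return x

def bm_pbil(f, n, pop_size=50, n_select=10, lr=0.05, generations=40):
    """Hypothetical BM-PBIL loop: sample a population from the BM,
    select the best individuals under f, and nudge the BM parameters
    toward the statistics of the selected sample."""
    W = [[0.0] * n for _ in range(n)]   # symmetric couplings, zero diagonal
    b = [0.0] * n                       # biases
    best = None
    for _ in range(generations):
        pop = [gibbs_sample(W, b) for _ in range(pop_size)]
        pop.sort(key=f, reverse=True)
        if best is None or f(pop[0]) > f(best):
            best = pop[0][:]
        sel = pop[:n_select]
        # Moment matching: move model statistics toward those of the
        # selected individuals (first moments for b, second for W).
        for i in range(n):
            m_sel = sum(x[i] for x in sel) / n_select
            m_pop = sum(x[i] for x in pop) / pop_size
            b[i] += lr * (m_sel - m_pop)
            for j in range(i + 1, n):
                c_sel = sum(x[i] * x[j] for x in sel) / n_select
                c_pop = sum(x[i] * x[j] for x in pop) / pop_size
                W[i][j] += lr * (c_sel - c_pop)
                W[j][i] = W[i][j]
    return best
```

For example, on an 8-bit onemax function (`f = sum`), the loop drives the biases positive and the best sampled individual quickly reaches the all-ones string. The pairwise couplings `W` are what distinguish this sketch from plain PBIL, which keeps only per-bit marginals.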