Abstract: When solving a combinatorial optimization problem, multiple algorithms and/or multiple runs of the same algorithm are often used to find multiple local minima. The information gained from previous search runs is commonly discarded when selecting initialization points for future runs. We present a method that uses information from previous runs to determine promising starting points for future searches. Our algorithm, termed COMIT, models the inter-parameter dependencies present in previously found high-evaluation solutions. COMIT incrementally learns optimal dependency trees that model the pairwise dependencies in a set of good solutions found in previous searches, and then samples the probability distributions modeled by these trees to generate new starting points for future searches. This algorithm has been successfully applied to job-shop scheduling, traveling salesman, knapsack, rectangle packing, and bin-packing problems.
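The core mechanism described above — learning an optimal (Chow-Liu) dependency tree over a set of good solutions and sampling it for new starting points — can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes binary solution vectors, uses Prim's algorithm on empirical pairwise mutual information to build the tree, and all function names are ours.

```python
import numpy as np

def mutual_information(X, i, j):
    """Empirical mutual information between binary columns i and j of X."""
    mi = 0.0
    for a in (0, 1):
        for b in (0, 1):
            p_ab = np.mean((X[:, i] == a) & (X[:, j] == b))
            p_a = np.mean(X[:, i] == a)
            p_b = np.mean(X[:, j] == b)
            if p_ab > 0:
                mi += p_ab * np.log(p_ab / (p_a * p_b))
    return mi

def chow_liu_tree(X):
    """Maximum-MI spanning tree over the columns of X (Prim's algorithm).

    Returns parent[v] for each variable v; the root (variable 0) gets -1.
    """
    d = X.shape[1]
    parent = [-1] * d
    in_tree = {0}
    while len(in_tree) < d:
        best = (-1.0, None, None)
        for u in in_tree:
            for v in range(d):
                if v not in in_tree:
                    w = mutual_information(X, u, v)
                    if w > best[0]:
                        best = (w, u, v)
        _, u, v = best
        parent[v] = u
        in_tree.add(v)
    return parent

def sample_from_tree(X, parent, rng):
    """Ancestral sampling: root from its marginal, children from empirical
    conditionals P(child | parent) estimated from the good-solution set X."""
    d = X.shape[1]
    s = np.zeros(d, dtype=int)
    # Sample in an order where each node's parent is sampled first.
    order, remaining = [0], set(range(1, d))
    while remaining:
        for v in list(remaining):
            if parent[v] not in remaining:
                order.append(v)
                remaining.discard(v)
    for v in order:
        if parent[v] == -1:
            p1 = np.mean(X[:, v])          # marginal P(v = 1)
        else:
            mask = X[:, parent[v]] == s[parent[v]]
            p1 = X[mask, v].mean() if mask.any() else np.mean(X[:, v])
        s[v] = int(rng.random() < p1)
    return s
```

In a COMIT-style loop, `X` would hold the best solutions found so far; each sample from the tree becomes the initialization point for the next local-search run, and the new run's result is fed back into `X`.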