Are Multiple Runs of Genetic Algorithms Better than One?

There are conflicting reports on whether multiple independent runs of genetic algorithms (GAs) with small populations can reach solutions of higher quality, or find acceptable solutions faster, than a single run with a large population. This paper investigates this question analytically using two approaches. The first analysis assumes a fixed total budget of computational resources and identifies the conditions under which it is advantageous to use multiple small runs. The second approach does not constrain the total cost and examines whether multiple properly-sized independent runs can reach the optimal solution faster than a single run. Although the analysis is limited to additively-separable functions, it may be applicable to the larger class of nearly decomposable functions of interest to many GA users. The results suggest that, in most cases under the constant-cost constraint, a single run with the largest population possible reaches a better solution than multiple independent runs. Similarly, a single large run reaches the global optimum faster than multiple small runs. The findings are validated with experiments on functions of varying difficulty.
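The fixed-budget comparison described above can be sketched empirically. The following is a minimal, hypothetical setup (not the paper's actual experimental code): a simple generational GA on OneMax, an additively-separable test function, comparing one run with a large population against ten independent runs with one-tenth the population, so both consume the same number of fitness evaluations.

```python
import random

def onemax(bits):
    """Additively separable test function: number of ones in the string."""
    return sum(bits)

def run_ga(pop_size, generations, n_bits=40, seed=None):
    """Minimal generational GA: binary tournament selection,
    uniform crossover, and bit-flip mutation at rate 1/n_bits.
    Returns the best fitness observed over the whole run."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    best = max(pop, key=onemax)
    for _ in range(generations):
        new_pop = []
        for _ in range(pop_size):
            # Binary tournament selection for each parent.
            p1 = max(rng.sample(pop, 2), key=onemax)
            p2 = max(rng.sample(pop, 2), key=onemax)
            # Uniform crossover.
            child = [a if rng.random() < 0.5 else b for a, b in zip(p1, p2)]
            # Bit-flip mutation.
            child = [b ^ 1 if rng.random() < 1.0 / n_bits else b for b in child]
            new_pop.append(child)
        pop = new_pop
        best = max(best, max(pop, key=onemax), key=onemax)
    return onemax(best)

# Equal budget: 100 * 50 = 5000 evaluations in both configurations.
single = run_ga(pop_size=100, generations=50, seed=0)
multi = max(run_ga(pop_size=10, generations=50, seed=s) for s in range(10))
```

The population sizes, run counts, and GA parameters here are illustrative assumptions; on an easy function like OneMax both configurations typically do well, and the paper's analysis concerns how this comparison shifts with problem difficulty.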
