Why Do Advanced Population Initialization Techniques Perform Poorly in High Dimension?

Many advanced population initialization techniques for Evolutionary Algorithms (EAs) have been proposed, and several studies claim that these techniques significantly improve EA performance. However, recent research shows that they do not scale well to high-dimensional spaces. This study investigates the reasons behind the failure of advanced population initialization techniques on large-scale problems by adopting a wide range of population sizes. To avoid bias toward any particular EA model or problem set, the experiments employ general-purpose tools. Our investigations show that, regardless of population size, the uniformity of populations drops dramatically as dimensionality grows. This observation confirms that the loss of uniformity in high-dimensional spaces occurs regardless of the type of EA, initializer, or problem. We therefore conclude that the weak uniformity of the resulting population is the main cause of the poor performance of advanced initializers in high dimensions.
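The uniformity loss described above can be observed directly with a standard uniformity measure. The sketch below (an illustrative experiment, not the paper's exact protocol; the function name and parameter choices are our own) computes the centered L2 discrepancy of Hickernell (1998), for which a closed-form expression exists and lower values indicate a more uniform point set, for a fixed-size random population at increasing dimensionalities:

```python
import numpy as np


def centered_l2_discrepancy(points):
    """Centered L2 discrepancy (Hickernell, 1998); lower = more uniform.

    points: array of shape (n, d) with coordinates in [0, 1].
    """
    n, d = points.shape
    z = np.abs(points - 0.5)  # distance of each coordinate from the center
    term1 = (13.0 / 12.0) ** d
    term2 = (2.0 / n) * np.prod(1.0 + 0.5 * z - 0.5 * z ** 2, axis=1).sum()
    zi, zj = z[:, None, :], z[None, :, :]
    diff = np.abs(points[:, None, :] - points[None, :, :])  # pairwise |x_ik - x_jk|
    term3 = np.prod(1.0 + 0.5 * zi + 0.5 * zj - 0.5 * diff, axis=2).sum() / n ** 2
    return float(np.sqrt(term1 - term2 + term3))


# Same population size, growing dimensionality: the discrepancy of a
# uniform random population grows sharply with the dimension.
rng = np.random.default_rng(seed=42)
pop_size = 100
cd2 = {d: centered_l2_discrepancy(rng.random((pop_size, d))) for d in (2, 10, 50)}
print(cd2)
```

For random points the expected squared discrepancy grows roughly exponentially with the dimension for a fixed population size, so the printed values increase sharply from d = 2 to d = 50, which is consistent with the abstract's claim that no practical population size can maintain uniformity in high dimensions.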
