Investigating the use of local search for improving meta-hyper-heuristic performance

This paper investigates the use of local search strategies to improve the performance of a meta-hyper-heuristic algorithm, a hyper-heuristic that employs one or more meta-heuristics as its low-level heuristics. Alternative mechanisms for selecting which solutions to refine further by means of local search are compared, along with the intensity of that refinement, measured as the number of allowable function evaluations. In addition, defining the local search as one of the low-level heuristics is compared against applying it directly to the solution space. Performance is evaluated on a diverse set of floating-point benchmark problems. The addition of local search significantly improved algorithm results. Random selection of solutions for further refinement proved to be the best selection strategy, and a higher intensity of refinement was most effective. Better results were obtained by applying the local search algorithm directly to the search space than by defining it as a low-level heuristic.
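The following is a minimal sketch of the scheme the abstract describes: a hyper-heuristic loop over low-level meta-heuristics, with a subset of solutions selected at random for local-search refinement under a fixed function-evaluation budget. All interfaces, names, and default values (problem, heuristics, local_search, refine_fraction, ls_budget) are illustrative assumptions, not the authors' exact configuration.

```python
import random

def meta_hyper_heuristic(problem, heuristics, local_search,
                         pop_size=50, iterations=100,
                         refine_fraction=0.2, ls_budget=500):
    """Sketch of a meta-hyper-heuristic with local-search refinement
    (hypothetical interfaces; parameters are assumed, not from the paper)."""
    # Initialise a population of candidate solutions.
    population = [problem.random_solution() for _ in range(pop_size)]

    for _ in range(iterations):
        # Hyper-heuristic step: apply one of the low-level
        # meta-heuristics to the current population.
        heuristic = random.choice(heuristics)
        population = heuristic.step(population, problem)

        # Selection strategy: choose solutions at random for refinement
        # (random selection performed best in the paper's experiments).
        k = max(1, int(refine_fraction * len(population)))
        for i in random.sample(range(len(population)), k):
            # Refinement intensity: the local search is allotted a fixed
            # budget of objective-function evaluations.
            population[i] = local_search(population[i], problem,
                                         max_evaluations=ls_budget)

    # Return the best solution found.
    return min(population, key=problem.evaluate)
```

Here the local search is applied directly to selected solutions in the search space, the configuration the paper found to outperform treating the local search as just another low-level heuristic.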
