A Survey of Automatic Parameter Tuning Methods for Metaheuristics

Parameter tuning, that is, finding appropriate parameter settings (or configurations) of an algorithm so that its performance is optimized, is an important task in the development and application of metaheuristics. Automating this task, i.e., developing algorithmic procedures that address the parameter tuning problem, is highly desirable and has attracted significant attention from researchers and practitioners. Over the last two decades, many automatic parameter tuning approaches have been proposed. This paper presents a comprehensive survey of automatic parameter tuning methods for metaheuristics. A new classification (or taxonomy) of automatic parameter tuning methods is introduced according to the structure of the tuning process, and existing approaches are classified into three categories: 1) simple generate-evaluate methods; 2) iterative generate-evaluate methods; and 3) high-level generate-evaluate methods. These three categories of tuning methods are then reviewed in turn. In addition to describing each tuning method, its main strengths and weaknesses are discussed, which helps new researchers and practitioners select appropriate tuning methods for their needs. Furthermore, challenges and directions for further research in this field are pointed out.
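To make the first category of the taxonomy concrete, the following is a minimal sketch of a simple generate-evaluate tuner: candidate configurations are generated once (here by uniform random sampling) and each is evaluated on a set of training instances, after which the best candidate is returned. The names `run_metaheuristic`, `param_space`, and `instances` are hypothetical placeholders standing in for the user's target algorithm and tuning scenario; they are not part of any specific tool covered by the survey. Iterative and high-level generate-evaluate methods differ mainly in that generation and evaluation are interleaved and guided by the results observed so far.

```python
import random

def random_generate(param_space, n_candidates, rng):
    """Sample n_candidates configurations uniformly from a dict of (low, high) ranges."""
    return [
        {name: rng.uniform(low, high) for name, (low, high) in param_space.items()}
        for _ in range(n_candidates)
    ]

def evaluate(config, instances, run_metaheuristic, rng, repetitions=3):
    """Average solution cost of the target metaheuristic over instances and repetitions."""
    costs = [
        run_metaheuristic(config, instance, seed=rng.randrange(2**31))
        for instance in instances
        for _ in range(repetitions)
    ]
    return sum(costs) / len(costs)

def simple_generate_evaluate(param_space, instances, run_metaheuristic,
                             n_candidates=50, seed=0):
    """Generate candidates once, evaluate each, and return the lowest-cost configuration."""
    rng = random.Random(seed)
    candidates = random_generate(param_space, n_candidates, rng)
    scored = [(evaluate(c, instances, run_metaheuristic, rng), c) for c in candidates]
    return min(scored, key=lambda pair: pair[0])[1]
```

In practice the generation step of such methods is often replaced by a space-filling or factorial experimental design rather than plain random sampling, while the evaluate-then-pick-best structure stays the same.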
