Enhancing Selection Hyper-Heuristics via Feature Transformations

Hyper-heuristics are a relatively recent tool for tackling complex optimization problems in which standalone solvers exhibit varied performance. Among them are selection hyper-heuristics which, by combining the strengths of individual solvers, offer a more robust approach. However, their effectiveness depends heavily on the features used to link them with the problem being solved. Aiming at enhancing selection hyper-heuristics, in this paper we propose two types of transformation: explicit and implicit. The former directly changes the distribution of critical points within the feature domain while using a Euclidean distance to measure proximity. The latter operates indirectly: it preserves the distribution of critical points but changes the distance metric through a kernel function. We analyze the effect of each kind of transformation, and of their combinations, in the domain of constraint satisfaction problems, chosen for its popularity and many practical applications. We compare the performance of our proposals against previously published data, and we expand on prior research by increasing the number of analyzed features. We found that incorporating transformations into the selection hyper-heuristic model improves overall performance and yields more stable results, although combining implicit and explicit transformations was not as fruitful. Additionally, confirmatory tests on the domain of knapsack problems again showed improved stability, producing hyper-heuristics whose profit exhibited a standard deviation between 20% and 30% smaller.
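
As a rough illustration of the two transformation types (a minimal sketch, not the authors' exact formulation), the Python snippet below contrasts them: an explicit transformation remaps feature values before an ordinary Euclidean comparison, while an implicit one leaves the feature values untouched and instead measures proximity through a kernel-induced distance. The power-function remapping, the RBF kernel, and the parameters `alpha` and `gamma` are all assumptions made for the example.

```python
import numpy as np

def explicit_transform(x, alpha=2.0):
    # Explicit: remap feature values (assumed normalized to [0, 1]) with a
    # power function, reshaping how critical points spread over the domain.
    # The exponent alpha is a hypothetical parameter for illustration.
    return x ** alpha

def euclidean_distance(a, b):
    # After an explicit transformation, proximity is still measured
    # with the ordinary Euclidean distance.
    return np.linalg.norm(a - b)

def kernel_distance(a, b, gamma=1.0):
    # Implicit: keep the feature values as-is, but measure proximity via
    # an RBF kernel. The induced distance satisfies
    # d(a, b)^2 = k(a, a) + k(b, b) - 2*k(a, b), and k(x, x) = 1 here.
    k_ab = np.exp(-gamma * np.sum((a - b) ** 2))
    return np.sqrt(2.0 - 2.0 * k_ab)

# Example: pick the closest "critical point" (a stored feature vector tied
# to a recommended heuristic) for an incoming problem state.
critical_points = np.array([[0.2, 0.8], [0.6, 0.4], [0.9, 0.1]])
state = np.array([0.5, 0.5])

# Explicit route: transform both sides, then compare with Euclidean distance.
d_explicit = [euclidean_distance(explicit_transform(cp),
                                 explicit_transform(state))
              for cp in critical_points]

# Implicit route: compare untouched vectors with the kernel-induced metric.
d_implicit = [kernel_distance(cp, state) for cp in critical_points]

print("explicit choice:", int(np.argmin(d_explicit)))
print("implicit choice:", int(np.argmin(d_implicit)))
```

Note how the two routes can disagree on the selected critical point even for the same state, which is precisely why the choice of transformation matters for the resulting hyper-heuristic.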
