Applying automatic heuristic-filtering to improve hyper-heuristic performance

Hyper-heuristics have emerged as an important strategy for combining the strengths of different heuristics into a single method. Although hyper-heuristics have been found to be successful in many scenarios, little attention has been paid to the subsets of heuristics that these methods manage and apply. In several cases, heuristics can interfere with one another and harm the search. Understanding how heuristics differ and how each one contributes to the search process is therefore important. The main contribution of this paper is an automatic heuristic-filtering process that allows hyper-heuristics to exclude heuristics that do not contribute to improving the solution. Building on previous work in feature selection, we propose two methods that rank heuristics and sequentially select only suitable ones within a hyper-heuristic framework. Our experiments on a set of Constraint Satisfaction Problem instances show that a hyper-heuristic restricted to the selected heuristics achieves significantly better running times than one containing all heuristics. In addition, the hyper-heuristic with the selected heuristics solves these instances with a higher success rate than the hyper-heuristic without the proposed filtering process.
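To make the ranking-and-selection idea concrete, the sketch below shows a sequential forward-selection filter over a pool of heuristics, in the spirit of wrapper methods from feature selection. This is a minimal illustration, not the paper's actual procedure: the names `forward_select_heuristics` and `evaluate_subset`, and the toy scoring function, are assumptions; a real evaluation would run the hyper-heuristic with the candidate subset on training CSP instances and measure success rate or running time.

```python
# Minimal sketch of a sequential forward-selection filter for heuristics,
# modeled on wrapper-style feature selection. All names and the toy scoring
# below are illustrative assumptions, not the paper's implementation.
from typing import Callable, List, Set


def forward_select_heuristics(
    heuristics: List[str],
    evaluate_subset: Callable[[Set[str]], float],
) -> Set[str]:
    """Greedily add the heuristic that most improves the subset score,
    stopping when no single addition yields an improvement."""
    selected: Set[str] = set()
    best_score = evaluate_subset(selected)
    improved = True
    while improved:
        improved = False
        for h in set(heuristics) - selected:
            score = evaluate_subset(selected | {h})
            if score > best_score:
                best_score, best_h, improved = score, h, True
        if improved:
            selected.add(best_h)
    return selected


if __name__ == "__main__":
    # Toy scoring: each heuristic has an assumed contribution and some pairs
    # interfere with each other. In practice, the score would come from
    # running the hyper-heuristic on a training set of CSP instances.
    contribution = {"dom": 0.5, "deg": 0.3, "dom/deg": 0.6, "random": -0.2}
    interference = {frozenset({"dom", "dom/deg"}): -0.4}

    def toy_score(subset: Set[str]) -> float:
        base = sum(contribution[h] for h in subset)
        penalty = sum(v for pair, v in interference.items() if pair <= subset)
        return base + penalty

    print(forward_select_heuristics(list(contribution), toy_score))
```

With the toy scores above, the filter keeps the useful heuristics and drops the purely harmful one; the same greedy loop can be rerun with a ranking step first, which is the other variant the abstract alludes to.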
