Towards a Surrogate-Assisted Multi-Objective Full Model Selection

This research addresses the full model selection problem: given a pool of pre-processing methods, feature selection techniques, and learning algorithms to choose from, select a combination of them, together with their hyper-parameters, that provides the best generalization performance on a given dataset. We propose to cast this as a multi-objective optimization problem in which classification error and model complexity are the objectives to be minimized, and to explore the model space with a surrogate-assisted multi-objective evolutionary algorithm. Our proposal is motivated by the fact that evaluating the objective functions can be computationally expensive. By using surrogate-assisted optimization, we expect to reduce the number of full models that must be trained and tested, and hence the total number of fitness-function evaluations, without significantly degrading the quality of the resulting models. Our preliminary results give evidence of the validity of the proposed approach.
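The idea described above can be sketched in code. The following is a minimal illustrative sketch, not the authors' implementation: `true_objectives` is a toy stand-in for the expensive step of training and testing a full model (returning classification error and model complexity, both to be minimized), and the surrogate is a simple nearest-neighbour predictor over already-evaluated configurations, used to pre-screen offspring so that only the most promising few are evaluated for real.

```python
import random

def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimizing both)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(archive):
    """Non-dominated subset of (config, objectives) records."""
    return [r for r in archive
            if not any(dominates(s[1], r[1]) for s in archive if s is not r)]

def true_objectives(x, counter):
    """Toy stand-in for the expensive evaluation (train + test a full model).
    Returns (classification error, model complexity), both to minimize."""
    counter[0] += 1                # count expensive evaluations
    error = (x - 0.7) ** 2 + 0.1   # hypothetical error landscape
    return (error, x)              # x itself plays the role of complexity

def surrogate_predict(x, archive):
    """Cheap surrogate: objectives of the nearest already-evaluated config."""
    nearest = min(archive, key=lambda r: abs(r[0] - x))
    return nearest[1]

def surrogate_assisted_search(generations=20, offspring=10,
                              evals_per_gen=2, seed=0):
    rng = random.Random(seed)
    counter = [0]
    # initial archive of truly evaluated configurations
    archive = [(x, true_objectives(x, counter))
               for x in (rng.random() for _ in range(5))]
    for _ in range(generations):
        # mutate parents drawn from the current Pareto front
        parents = pareto_front(archive)
        children = [min(1.0, max(0.0, rng.choice(parents)[0] + rng.gauss(0, 0.1)))
                    for _ in range(offspring)]
        # surrogate pre-screening: evaluate only the most promising few for real
        children.sort(key=lambda x: sum(surrogate_predict(x, archive)))
        for x in children[:evals_per_gen]:
            archive.append((x, true_objectives(x, counter)))
    return pareto_front(archive), counter[0]

front, n_evals = surrogate_assisted_search()
```

With these settings, only 45 expensive evaluations are spent (5 initial plus 2 per generation), while 200 offspring are screened by the surrogate; the returned front contains the discovered error/complexity trade-offs.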
