Support Vector Machines for Regression Problems with Sequential Minimal Optimization

Training a support vector machine (SVM) is usually done by mapping the underlying optimization problem into a quadratic programming (QP) problem. Unfortunately, high-quality QP solvers are not readily available, which makes research into the area of SVMs difficult for those without a QP solver. Recently, the Sequential Minimal Optimization (SMO) algorithm was introduced [1, 2]. SMO reduces SVM training to a series of smaller QP subproblems that have an analytical solution and, therefore, does not require a general QP solver. SMO has been shown to be very efficient for classification problems using linear SVMs and/or sparse data sets. This work shows how SMO can be generalized to handle regression problems.
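To illustrate the core idea, the sketch below implements a heavily simplified Platt-style SMO loop for the *classification* case with a linear kernel (the regression generalization this work describes differs in its loss and multiplier bookkeeping, but the key mechanism is the same): each step jointly optimizes two Lagrange multipliers with a closed-form update, so no general QP solver is needed. All names and parameter choices here are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def simplified_smo(X, y, C=1.0, tol=1e-3, max_passes=20, seed=0):
    """Toy simplified SMO for a linear-kernel SVM classifier.

    Each iteration picks a pair of Lagrange multipliers that violates the
    KKT conditions and solves the resulting two-variable QP analytically,
    illustrating why SMO needs no general-purpose QP solver.
    """
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    K = X @ X.T                       # linear kernel matrix
    alpha = np.zeros(n)
    b = 0.0
    passes = 0
    while passes < max_passes:
        changed = 0
        for i in range(n):
            Ei = (alpha * y) @ K[:, i] + b - y[i]
            if (y[i] * Ei < -tol and alpha[i] < C) or \
               (y[i] * Ei > tol and alpha[i] > 0):
                j = int(rng.integers(n - 1))
                j += j >= i           # random j != i (simplified heuristic)
                Ej = (alpha * y) @ K[:, j] + b - y[j]
                ai_old, aj_old = alpha[i], alpha[j]
                # Box constraints for the two-variable subproblem.
                if y[i] != y[j]:
                    L, H = max(0.0, aj_old - ai_old), min(C, C + aj_old - ai_old)
                else:
                    L, H = max(0.0, ai_old + aj_old - C), min(C, ai_old + aj_old)
                if L == H:
                    continue
                eta = 2 * K[i, j] - K[i, i] - K[j, j]
                if eta >= 0:
                    continue
                # Analytical (closed-form) update of the chosen pair.
                alpha[j] = np.clip(aj_old - y[j] * (Ei - Ej) / eta, L, H)
                if abs(alpha[j] - aj_old) < 1e-5:
                    continue
                alpha[i] = ai_old + y[i] * y[j] * (aj_old - alpha[j])
                # Update the bias term from whichever multiplier is interior.
                b1 = b - Ei - y[i] * (alpha[i] - ai_old) * K[i, i] \
                     - y[j] * (alpha[j] - aj_old) * K[i, j]
                b2 = b - Ej - y[i] * (alpha[i] - ai_old) * K[i, j] \
                     - y[j] * (alpha[j] - aj_old) * K[j, j]
                if 0 < alpha[i] < C:
                    b = b1
                elif 0 < alpha[j] < C:
                    b = b2
                else:
                    b = (b1 + b2) / 2
                changed += 1
        passes = passes + 1 if changed == 0 else 0
    return alpha, b

# Toy linearly separable data.
X = np.array([[2.0, 2.0], [4.0, 4.0], [-2.0, -2.0], [-4.0, -4.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])
alpha, b = simplified_smo(X, y)
pred = np.sign((alpha * y) @ (X @ X.T) + b)
```

In the regression setting, each training point carries a pair of multipliers for the two sides of the ε-insensitive tube, but the two-variable analytical update above is the mechanism being generalized.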

[1] John C. Platt, et al. Fast training of support vector machines using sequential minimal optimization. Advances in Kernel Methods, 1999.

[2] Federico Girosi, et al. An improved training algorithm for support vector machines. Neural Networks for Signal Processing VII: Proceedings of the 1997 IEEE Signal Processing Society Workshop, 1997.

[3] Vladimir N. Vapnik, et al. The Nature of Statistical Learning Theory. Statistics for Engineering and Information Science, 2000.

[4] Gary William Flake. Heuristics for Improving the Performance of Online SVM Training Algorithms, 2007.