Accurate On-line Support Vector Regression

Batch implementations of support vector regression (SVR) are inefficient when used in an on-line setting because they must be retrained from scratch every time the training set is modified. Following an incremental support vector classification algorithm introduced by Cauwenberghs and Poggio (2001), we have developed an accurate on-line support vector regression (AOSVR) algorithm that efficiently updates a trained SVR function whenever a sample is added to or removed from the training set. The updated SVR function is identical to that produced by a batch algorithm. Applications of AOSVR in both on-line and cross-validation scenarios are presented. In both scenarios, numerical experiments indicate that AOSVR is faster than batch SVR algorithms with both cold and warm start.
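
To make the on-line setting concrete, the following sketch (not from the paper) shows the naive baseline that AOSVR is designed to avoid: a sliding-window forecaster that retrains a batch SVR from scratch after every insertion and deletion. The scikit-learn SVR, the synthetic data, and the window size are illustrative assumptions, not the authors' setup.

# Illustrative sketch only: the batch-retraining baseline for on-line SVR.
# AOSVR itself is not implemented here; this loop shows the cost it removes.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X_stream = rng.uniform(-3.0, 3.0, size=(200, 1))                 # assumed synthetic data
y_stream = np.sin(X_stream).ravel() + 0.1 * rng.standard_normal(200)

window = 50                                                      # assumed sliding-window size
X_win, y_win = X_stream[:window], y_stream[:window]

for t in range(window, len(X_stream)):
    # Batch approach: every update pays for a full retraining pass.
    model = SVR(kernel="rbf", C=10.0, epsilon=0.1).fit(X_win, y_win)
    pred = model.predict(X_stream[t:t + 1])                      # forecast the newest point

    # Slide the window: append the new sample, discard the oldest.
    # AOSVR performs one incremental update (add) and one decremental
    # update (remove) instead of the full retrain above.
    X_win = np.vstack([X_win[1:], X_stream[t:t + 1]])
    y_win = np.append(y_win[1:], y_stream[t])

Each pass through this loop re-solves the full quadratic program over the window; AOSVR instead updates the trained solution in place when a sample is added or removed, producing the identical function at a much lower per-step cost.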

[1] Leonard J. Tashman, et al. Out-of-sample tests of forecasting accuracy: an analysis and review, 2000.

[2] J. Fliege, et al. Constructing approximations to the efficient set of convex quadratic multiobjective problems, 2004.

[3] Stefan Rüping, et al. Incremental Learning with Support Vector Machines, 2001, ICDM.

[4] Yi Li, et al. The Relaxed Online Maximum Margin Algorithm, 1999, Machine Learning.

[5] Chih-Jen Lin, et al. Training ν-Support Vector Regression: Theory and Algorithms, 2002, Neural Computation.

[6] Manfred Opper, et al. Sparse Representation for Gaussian Process Models, 2000, NIPS.

[7] Rodrigo Fernandez. Predicting Time Series with a Local Support Vector Regression Machine, 1999.

[8] Thore Graepel, et al. From Margin to Sparsity, 2000, NIPS.

[9] Jacek Gondzio, et al. Warm start of the primal-dual method applied in the cutting-plane scheme, 1998, Math. Program.

[10] Mario Martín Muñoz. On-line support vector machines for function approximation, 2002.

[11] Jacek Gondzio, et al. Reoptimization With the Primal-Dual Interior Point Method, 2002, SIAM J. Optim.

[12] Vladimir N. Vapnik, et al. The Nature of Statistical Learning Theory, 2000, Statistics for Engineering and Information Science.

[13] F. Tay, et al. Application of support vector machines in financial time series forecasting, 2001.

[14] Vladimir Vapnik, et al. Statistical learning theory, 1998.

[15] Stephen J. Wright, et al. Warm-Start Strategies in Interior-Point Methods for Linear Programming, 2002, SIAM J. Optim.

[16] V. Vapnik, et al. Bounds on Error Expectation for Support Vector Machines, 2000, Neural Computation.

[17] Bernhard Schölkopf, et al. A tutorial on support vector regression, 2004, Stat. Comput.

[18] Gert Cauwenberghs, et al. Incremental and Decremental Support Vector Machine Learning, 2000, NIPS.

[19] Liva Ralaivola, et al. Incremental Support Vector Machine Learning: A Local Approach, 2001, ICANN.

[20] Takahiro Obara, et al. Relativistic electron dynamics in the inner magnetosphere — a review, 2002.

[21] Gunnar Rätsch, et al. Predicting Time Series with Support Vector Machines, 1997, ICANN.

[22] Mark Herbster, et al. Learning Additive Models Online with Fast Evaluating Kernels, 2001, COLT/EuroCOLT.

[23] Alexander J. Smola, et al. Online learning with kernels, 2001, IEEE Transactions on Signal Processing.

[24] Andreas S. Weigend, et al. Time Series Prediction: Forecasting the Future and Understanding the Past, 1994.

[25] Claudio Gentile, et al. A New Approximate Maximal Margin Classification Algorithm, 2002, J. Mach. Learn. Res.

[26] R. Vanderbei. LOQO: an interior point code for quadratic programming, 1999.

[27] L. Glass, et al. Oscillation and chaos in physiological control systems, 1977, Science.

[28] Thorsten Joachims, et al. Estimating the Generalization Performance of an SVM Efficiently, 2000, ICML.

[29] S. Keerthi, et al. Improvements to SMO Algorithm for SVM Regression, 1999.