kernlab - An S4 Package for Kernel Methods in R

kernlab is an extensible package for kernel-based machine learning methods in R. It takes advantage of R's new S4 object model and provides a framework for creating and using kernel-based algorithms. The package contains dot product primitives (kernels), implementations of support vector machines and the relevance vector machine, Gaussian processes, a ranking algorithm, kernel PCA, kernel CCA, and a spectral clustering algorithm. Moreover, it provides a general-purpose quadratic programming solver and an incomplete Cholesky decomposition method.
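
To make the package's scope concrete, the following is a minimal R usage sketch. It assumes the built-in iris data set, and the kernel parameters (sigma, C, number of features, number of centers) are illustrative values chosen for demonstration only; function names follow the kernlab documentation.

library(kernlab)

## Kernel primitive: a Gaussian RBF kernel and the kernel matrix it induces
rbf <- rbfdot(sigma = 0.1)
K   <- kernelMatrix(rbf, as.matrix(iris[, -5]))

## Support vector classification with the same kernel
model <- ksvm(Species ~ ., data = iris,
              kernel = "rbfdot", kpar = list(sigma = 0.1), C = 1)
predict(model, iris[1:5, -5])

## Kernel PCA and spectral clustering on the feature columns
kpc <- kpca(~ ., data = iris[, -5],
            kernel = "rbfdot", kpar = list(sigma = 0.2), features = 2)
head(rotated(kpc))
sc  <- specc(as.matrix(iris[, -5]), centers = 3)

The general-purpose quadratic programming solver (ipop) and the incomplete Cholesky decomposition (inchol) are exposed as standalone functions in the same interface.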
