Coulomb Classifiers: Generalizing Support Vector Machines via an Analogy to Electrostatic Systems

We introduce a family of classifiers based on a physical analogy to an electrostatic system of charged conductors. The family, called Coulomb classifiers, includes the two best-known support vector machines (SVMs), the ν-SVM and the C-SVM. In the electrostatic analogy, a training example corresponds to a charged conductor at a given location in space, the classification function corresponds to the electrostatic potential function, and the training objective function corresponds to the Coulomb energy. The electrostatic framework not only provides a novel interpretation of existing algorithms and their interrelationships, but also suggests a variety of new methods for SVMs, including kernels that bridge the gap between polynomial and radial-basis functions, objective functions that do not require positive-definite kernels, and regularization techniques that allow for the construction of an optimal classifier in Minkowski space. Based on this framework, we propose novel SVMs and perform simulation studies showing that they are comparable or superior to standard SVMs. The experiments include classification tasks on data represented in terms of pairwise proximities, where a Coulomb classifier outperformed standard SVMs.
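The core of the analogy above is that a kernel classifier's decision function has the same form as an electrostatic potential: each training example acts as a point "charge" whose sign and magnitude are set by its label and coefficient, and a test point is classified by the sign of the total potential it feels. The following is a minimal sketch of that view, not the paper's actual algorithm; the Plummer-style kernel, the parameter names `eps` and `d`, and the hand-set charges are illustrative assumptions (in a real SVM the coefficients would come from training).

```python
import numpy as np

def plummer_kernel(x, y, eps=1.0, d=2):
    # Coulomb-like kernel (illustrative): finite at zero distance,
    # decaying with distance like a radial-basis function far away.
    r2 = np.sum((x - y) ** 2)
    return (r2 + eps ** 2) ** (-d / 2)

def potential(x, examples, charges):
    # Decision function read as an electrostatic potential: each
    # training example xi carries a signed charge q (alpha_i * y_i
    # in SVM terms), and contributions superpose linearly.
    return sum(q * plummer_kernel(x, xi) for xi, q in zip(examples, charges))

# Two "conductors" of opposite charge; a test point is assigned the
# class of the sign of the potential at its location.
examples = [np.array([0.0, 0.0]), np.array([4.0, 0.0])]
charges = [+1.0, -1.0]
print(np.sign(potential(np.array([1.0, 0.0]), examples, charges)))  # near the positive charge
print(np.sign(potential(np.array([3.0, 0.0]), examples, charges)))  # near the negative charge
```

Because the kernel only needs pairwise distances, this potential view extends naturally to data given purely as pairwise proximities, which is the setting of the paper's final experiment.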
