From regularization to radial, tensor and additive splines

Poggio and Girosi showed that regularization principles lead to approximation schemes equivalent to networks with one layer of hidden units, called regularization networks. In their 1993 results they show that regularization networks encompass a much broader range of approximation schemes, including many of the general additive models and some neural networks. In particular, additive splines as well as some tensor-product splines can be obtained from appropriate classes of smoothness functionals. The same extension that takes radial basis functions to hyper basis functions takes additive models to ridge approximation models, which contain as special cases Breiman's hinge functions and some forms of projection pursuit regression. The authors propose the term generalized regularization networks for this broad class of approximation schemes that follows from an extension of regularization.
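The core construction behind a regularization network can be sketched concretely. Minimizing a data-fit term plus a smoothness functional yields an expansion f(x) = Σᵢ cᵢ G(‖x − xᵢ‖) with one basis function per data point, and the coefficients solve a regularized linear system. The sketch below is an illustrative implementation under assumed choices (a Gaussian radial basis function with width `sigma` and a ridge parameter `lam`), not the authors' specific scheme:

```python
import numpy as np

def gaussian_kernel(a, b, sigma=1.0):
    """Pairwise Gaussian RBF values between point sets a (n, d) and b (m, d)."""
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def fit_regularization_network(x, y, lam=1e-3, sigma=1.0):
    """Solve (G + lam * I) c = y for the expansion coefficients c.

    lam plays the role of the regularization parameter weighting the
    smoothness functional against the data term (assumed Gaussian basis).
    """
    G = gaussian_kernel(x, x, sigma)
    return np.linalg.solve(G + lam * np.eye(len(x)), y)

def predict(x_train, c, x_new, sigma=1.0):
    """Evaluate f(x) = sum_i c_i G(||x - x_i||) at new points."""
    return gaussian_kernel(x_new, x_train, sigma) @ c

# One-dimensional example: smooth a noisy sine with 40 centers.
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 2.0 * np.pi, (40, 1))
y = np.sin(x[:, 0]) + 0.1 * rng.standard_normal(40)
c = fit_regularization_network(x, y)
y_hat = predict(x, c, x)
```

Increasing `lam` trades fidelity to the samples for smoothness of the recovered function; with `lam → 0` the network approaches exact interpolation of the data, which is the classical radial basis function setting.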
