Sample complexity for learning recurrent perceptron mappings

Recurrent perceptron classifiers generalize the classical perceptron model: they account for the correlations and dependencies among input coordinates that arise from linear digital filtering. This paper provides tight bounds on the sample complexity associated with fitting such models to experimental data.
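The abstract does not reproduce the paper's formal definition, but under that caveat a recurrent perceptron can be sketched as a classical perceptron applied to the output of a linear IIR filter driven by the input sequence. The minimal Python sketch below is illustrative only; the names `recurrent_perceptron`, `w`, and `a` are hypothetical, not taken from the paper.

```python
import numpy as np

def recurrent_perceptron(u, w, a):
    """Classify the input sequence u: run it through a linear IIR filter,
    then apply a sign threshold, as in the classical perceptron.

    w weights the current input; a holds the q feedback coefficients
    applied to the q most recent filter outputs.
    """
    q = len(a)
    a = np.asarray(a)
    y = np.zeros(len(u) + q)            # filter outputs, padded with q zeros
    for t, u_t in enumerate(u):
        # Linear recurrence (the "linear digital filtering" step):
        # current output = weighted input + feedback from past outputs.
        y[t + q] = w * u_t + a @ y[t:t + q][::-1]
    return 1 if y[-1] >= 0 else -1      # perceptron-style sign threshold

# Example: classify a short sequence with fixed, hypothetical parameters.
u = [0.5, -1.2, 0.3, 0.9]
print(recurrent_perceptron(u, w=1.0, a=[0.8, -0.15]))
```

The sample-complexity question the paper studies is then how many labeled sequences are needed to fit the filter and threshold parameters of such a classifier.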
