Neural Systems as Nonlinear Filters

Experimental data show that biological synapses behave quite differently from the static synaptic weights used in common artificial neural network models. Biological synapses are dynamic: their efficacy changes on a short timescale, by several hundred percent, depending on the past input to the synapse. In this article we address the question of how this inherent synaptic dynamics (not to be confused with long-term learning) affects the computational power of a neural network. In particular, we analyze computations on temporal and spatiotemporal patterns, and we give a complete mathematical characterization of the filters that can be approximated by feedforward neural networks with dynamic synapses. It turns out that even with a single hidden layer, such networks can approximate a very rich class of nonlinear filters: all filters that can be characterized by Volterra series. This result is robust to various changes in the model of synaptic dynamics. Our characterization also provides, for every nonlinear filter approximable by a Volterra series, a new complexity hierarchy related to the cost of implementing that filter in a neural system.
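
To make the notion of a dynamic synapse concrete, the following is a minimal sketch of one standard discrete-time model of short-term facilitation and depression (in the spirit of Markram, Wang, and Tsodyks). The function name and parameter values are illustrative assumptions, not taken from the article; the sketch merely shows how a synapse's efficacy can swing by several hundred percent as a function of the recent spike history.

```python
import numpy as np

def synaptic_efficacies(spike_times, U=0.5, tau_rec=0.8, tau_fac=0.05):
    """Relative efficacy of each spike in a presynaptic train for a
    dynamic synapse, using a common facilitation/depression update
    (a sketch after Markram, Wang & Tsodyks; parameters illustrative).

    U        -- baseline utilization of synaptic resources
    tau_rec  -- recovery time constant from depression, in seconds
    tau_fac  -- decay time constant of facilitation, in seconds
    """
    u, r = U, 1.0        # running facilitation / available resources
    eff = [u * r]        # efficacy of the first spike
    for dt in np.diff(spike_times):
        # resources consumed by the previous spike recover toward 1 ...
        r = r * (1.0 - u) * np.exp(-dt / tau_rec) + 1.0 - np.exp(-dt / tau_rec)
        # ... while facilitation decays back toward its baseline U
        u = U + u * (1.0 - U) * np.exp(-dt / tau_fac)
        eff.append(u * r)
    return np.array(eff)

# A regular 40 Hz train: with these depression-dominated parameters,
# the efficacy of later spikes drops to a fraction of the first one.
print(synaptic_efficacies(np.arange(0.0, 0.25, 0.025)))
```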

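The class of filters referred to above also admits a simple discrete-time illustration. A Volterra series generalizes a linear FIR filter by adding higher-order kernels; the sketch below truncates the series at second order with finite memory M. The continuous-time, fading-memory setting of the article is more general, and the function and kernel names here are assumptions for illustration only.

```python
import numpy as np

def volterra2(x, h0, h1, h2):
    """Second-order truncated Volterra filter with memory depth M:
        y[t] = h0 + sum_i h1[i] x[t-i] + sum_{i,j} h2[i,j] x[t-i] x[t-j]
    where h1 has shape (M,) and h2 has shape (M, M)."""
    M = len(h1)
    xp = np.concatenate([np.zeros(M - 1), np.asarray(x, float)])  # pad the past
    y = np.empty(len(x))
    for t in range(len(x)):
        w = xp[t:t + M][::-1]            # window [x[t], x[t-1], ..., x[t-M+1]]
        y[t] = h0 + h1 @ w + w @ h2 @ w  # zeroth-, first-, second-order terms
    return y

# Example: a quadratic filter with 3-step memory applied to a short input.
rng = np.random.default_rng(0)
x = rng.standard_normal(8)
print(volterra2(x, h0=0.1, h1=np.array([0.5, 0.3, 0.1]), h2=0.05 * np.eye(3)))
```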