VC Dimension of an Integrate-and-Fire Neuron Model

We compute the VC dimension of a leaky integrate-and-fire neuron model. The VC dimension quantifies the ability of a function class to partition an input pattern space, and can be considered a measure of computational capacity. In this case, the function class is the class of integrate-and-fire models generated by varying the integration time constant T and the threshold θ; the input space they partition is the space of continuous-time signals; and the binary partition is specified by whether or not the model reaches threshold at some specified time. We show that the VC dimension diverges only logarithmically with the input signal bandwidth N. We also extend this approach to arbitrary passive dendritic trees. The main contributions of this work are (1) a novel treatment of the computational capacity of this class of dynamical systems, and (2) a framework for analyzing the computational capabilities of the dynamical systems defined by networks of spiking neurons.
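The construction above can be sketched in code. The following is a minimal illustration, not the paper's formal model: a discrete-time leaky integrate-and-fire simulation in which a choice of time constant `tau` and threshold `theta` defines a binary function of an input signal (1 if the membrane potential reaches threshold by the check time, 0 otherwise). The function name, the Euler discretization, and the parameter values are all assumptions made for illustration.

```python
def lif_classifier(signal, tau, theta, dt=0.001):
    """Binary function induced by a leaky integrate-and-fire model.

    signal: sampled input current (one value per time step dt)
    tau:    membrane integration time constant (seconds)
    theta:  firing threshold

    Returns 1 if the membrane potential v reaches theta at any point
    during the signal, 0 otherwise. This is the binary partition of
    signal space described in the abstract: each (tau, theta) pair
    labels every input signal with 0 or 1.
    """
    v = 0.0
    for x in signal:
        # Forward-Euler step of the leaky integrator dv/dt = -v/tau + x
        v += dt * (-v / tau + x)
        if v >= theta:
            return 1
    return 0
```

Varying `(tau, theta)` sweeps out the function class whose VC dimension the paper bounds; a set of input signals is shattered if every 0/1 labeling of the set is realized by some parameter choice. Note that for a constant input x the potential saturates near `tau * x`, so thresholds above that value are never reached.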
