Estimating the Entropy Rate of Spike Trains via Lempel-Ziv Complexity

Normalized Lempel-Ziv complexity, which measures the rate at which new patterns appear along a digital sequence, is closely related to important source properties such as entropy and compression ratio but, in contrast to these, is a property of individual sequences. In this article, we propose to exploit this concept to estimate (or, at least, to bound from below) the entropy of neural discharges (spike trains). The main advantages of this method are the fast convergence of the estimator (as supported by numerical simulation) and the fact that the probability law of the process generating the signal need not be known. Furthermore, we present numerical and experimental comparisons of the new method against the standard method based on word frequencies, providing evidence that this new approach is an alternative entropy estimator for binned spike trains.
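
To make the procedure concrete, the sketch below (not the authors' code) shows one way to compute the normalized Lempel-Ziv complexity of a binned spike train. It assumes the train has already been binarized into a string of '0's and '1's (one symbol per time bin), counts the number of phrases c(n) in the Lempel-Ziv (1976) parsing, and takes the normalized complexity c(n) log2(n) / n as the entropy-rate estimate in bits per bin; the function names lz76_complexity and lz_entropy_rate are illustrative, and the random spike train in the example is synthetic.

```python
import math


def lz76_complexity(s: str) -> int:
    """Number of phrases in the Lempel-Ziv (1976) parsing of the string s.

    Each phrase is the shortest block, starting at the current position,
    that has not occurred earlier in the sequence (overlap with all but the
    last symbol of the phrase is allowed, as in the LZ76 definition).
    """
    n = len(s)
    c = 0   # phrase count
    i = 0   # start of the current phrase
    while i < n:
        k = 1
        # grow the phrase while s[i:i+k] still occurs in the preceding text
        while i + k < n and s[i:i + k] in s[:i + k - 1]:
            k += 1
        c += 1
        i += k
    return c


def lz_entropy_rate(binned: str, bin_width_s: float | None = None) -> float:
    """Entropy-rate estimate via normalized LZ complexity.

    Returns bits per bin, or bits per second if bin_width_s is given.
    For a stationary ergodic source, c(n) * log2(n) / n converges to the
    entropy rate as the sequence length n grows.
    """
    n = len(binned)
    if n < 2:
        raise ValueError("sequence too short")
    h_per_bin = lz76_complexity(binned) * math.log2(n) / n
    return h_per_bin if bin_width_s is None else h_per_bin / bin_width_s


if __name__ == "__main__":
    # Illustrative data: a 1 ms binned spike train with ~5% occupied bins.
    import random
    random.seed(0)
    train = "".join("1" if random.random() < 0.05 else "0" for _ in range(10_000))
    print(f"{lz_entropy_rate(train):.4f} bits/bin")
    print(f"{lz_entropy_rate(train, bin_width_s=0.001):.1f} bits/s")
```

Unlike the word-frequency (direct) method, this estimator needs no histogram of word probabilities and therefore no choice of word length; its accuracy on finite data still depends on the bin width and on the length of the recording.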
