From the Entropy to the Statistical Structure of Spike Trains

We use statistical estimates of the entropy rate of spike train data to make inferences about the underlying structure of the spike train itself. We first examine a number of different parametric and nonparametric estimators (some known and some new), including the "plug-in" method, several versions of Lempel-Ziv-based compression algorithms, a maximum-likelihood estimator tailored to renewal processes, and the natural estimator derived from the context-tree weighting (CTW) method. The theoretical properties of these estimators are examined, several new theoretical results are developed, and all estimators are systematically applied to various types of synthetic data and under different conditions. Our main focus is on the performance of these entropy estimators on the (binary) spike trains of 28 neurons recorded simultaneously for a one-hour period from the primary motor and dorsal premotor cortices of a monkey. We show how the entropy estimates can be used to test for the existence of long-term structure in the data, and we construct a hypothesis test for whether the renewal process model is appropriate for these spike trains. Further, by applying the CTW algorithm we derive the maximum a posteriori (MAP) tree model of our empirical data and comment on the underlying structure it reveals.
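For concreteness, the following is a minimal sketch of the simplest of these approaches, the plug-in (maximum-likelihood) entropy rate estimate applied to a binned binary spike train. It is written in Python with NumPy; the function name `plugin_entropy_rate`, the choice of word length, and the synthetic Bernoulli train are illustrative assumptions, not the implementation used in the paper.

```python
import numpy as np
from collections import Counter

def plugin_entropy_rate(spikes, word_length=8):
    """Plug-in (maximum-likelihood) entropy rate estimate, in bits per bin.

    Counts the empirical frequencies of all overlapping binary words of the
    given length and returns the block entropy divided by the word length.
    This estimator is biased downward when the number of observed words is
    small relative to 2**word_length.
    """
    spikes = np.asarray(spikes).astype(int)
    n_words = len(spikes) - word_length + 1
    counts = Counter(
        tuple(spikes[i:i + word_length]) for i in range(n_words)
    )
    probs = np.array(list(counts.values()), dtype=float) / n_words
    block_entropy = -np.sum(probs * np.log2(probs))  # block entropy H_L in bits
    return block_entropy / word_length               # bits per time bin

# Example on a synthetic Bernoulli spike train (firing probability 0.1 per bin)
rng = np.random.default_rng(0)
train = rng.random(100_000) < 0.1
print(plugin_entropy_rate(train, word_length=8))
```

The compression-based and CTW estimators discussed in the paper address the undersampling bias that this naive word-counting approach suffers from at longer word lengths.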
