Entropy and information in neural spike trains: progress on the sampling problem.

The major problem in information-theoretic analysis of neural responses and other biological data is the reliable estimation of entropy-like quantities from small samples. We apply a recently introduced Bayesian entropy estimator to synthetic data inspired by experiments, and to real experimental spike trains. The estimator performs admirably even very deep in the undersampled regime, where other techniques fail. This opens new possibilities for the information-theoretic analysis of experiments, and may be of general interest as an example of learning from limited data.
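The Bayesian estimator itself is not reproduced here. As a minimal sketch of the sampling problem the abstract refers to, the Python snippet below discretizes a spike train into binary "words," computes the naive plug-in entropy estimate, and applies the classical Miller-Madow bias correction; both baselines are known to break down deep in the undersampled regime. All function names and parameters (spikes_to_words, bin_size, word_length) are illustrative assumptions, not taken from the paper.

```python
import numpy as np


def spikes_to_words(spike_times, bin_size, word_length):
    """Discretize a spike train into binary 'words' of consecutive time bins.

    Each bin is 1 if it contains at least one spike, 0 otherwise; words are
    non-overlapping groups of `word_length` bins.
    """
    spike_times = np.asarray(spike_times, dtype=float)
    edges = np.arange(0.0, spike_times.max() + bin_size, bin_size)
    binary = (np.histogram(spike_times, bins=edges)[0] > 0).astype(int)
    n_words = len(binary) // word_length
    return binary[: n_words * word_length].reshape(n_words, word_length)


def plugin_entropy(words):
    """Naive maximum-likelihood ('plug-in') entropy estimate, in bits.

    Systematically underestimates the true entropy when the number of
    samples is comparable to the number of possible words.
    """
    _, counts = np.unique(np.asarray(words), axis=0, return_counts=True)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))


def miller_madow_entropy(words):
    """Plug-in estimate plus the classical Miller-Madow bias correction.

    Adds (K_obs - 1) / (2 N ln 2) bits, with K_obs the number of distinct
    words observed and N the sample size; this removes only the
    leading-order bias and still fails deep in the undersampled regime.
    """
    words = np.asarray(words)
    n, k_obs = len(words), len(np.unique(words, axis=0))
    return plugin_entropy(words) + (k_obs - 1) / (2.0 * n * np.log(2))


if __name__ == "__main__":
    # Synthetic check: 10-bin words with independent Bernoulli(0.3) bins,
    # so the true entropy is 10 * H_2(0.3), yet only 200 samples are drawn
    # from the 2**10 possible words (deeply undersampled).
    rng = np.random.default_rng(0)
    words = (rng.random((200, 10)) < 0.3).astype(int)
    h2 = -(0.3 * np.log2(0.3) + 0.7 * np.log2(0.7))
    print("true     :", 10 * h2)
    print("plug-in  :", plugin_entropy(words))
    print("M-Madow  :", miller_madow_entropy(words))
```

In this synthetic demo the two baseline estimates typically fall noticeably below the true entropy, which is exactly the regime the abstract describes as requiring a better-behaved Bayesian estimator.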
