Population Coding, Bayesian Inference and Information Geometry
This talk focuses on stochastic computation in the brain. The brain represents stimuli from the outer world by the excitation of neurons. Neural firing is stochastic, so these excitation patterns are noisy and fluctuating. How can reliable computation be performed in such a noisy environment? Population coding studies this problem through the neural representation of stimuli in a population of neurons. We first study the statistical theory of population coding. It is believed that the brain keeps and processes information in the form of probability distributions before a final output command is decided, and Bayesian inference is useful for this purpose. We then present a new idea of how the brain integrates stochastic evidence arriving from different modalities; this is the problem of how various probability distributions are combined into a single, more reliable one. Information geometry studies the structure underlying probability distributions using modern differential geometry. We show how information-geometric concepts are useful for studying mathematical neuroscience.
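The combination of probability distributions from different modalities can be illustrated with a minimal sketch. The talk does not specify the form of the distributions, so the example below assumes independent Gaussian evidence, for which the standard Bayesian result is precision-weighted fusion: the combined precision is the sum of the individual precisions, and the combined estimate is always less uncertain than any single cue. The function name and the numerical values are illustrative, not from the talk.

```python
import numpy as np

def combine_gaussian_evidence(means, variances):
    """Fuse independent Gaussian estimates by precision weighting.

    Under the Gaussian assumption, the posterior precision is the sum
    of the individual precisions, and the posterior mean is the
    precision-weighted average of the individual means.
    """
    means = np.asarray(means, dtype=float)
    precisions = 1.0 / np.asarray(variances, dtype=float)
    post_var = 1.0 / precisions.sum()
    post_mean = post_var * (precisions * means).sum()
    return post_mean, post_var

# Hypothetical example: a visual cue (mean 2.0, variance 1.0) and an
# auditory cue (mean 4.0, variance 4.0) about the same stimulus.
mean, var = combine_gaussian_evidence([2.0, 4.0], [1.0, 4.0])
# The fused mean lies closer to the more reliable (visual) cue, and
# the fused variance is smaller than either cue's variance alone.
```

The fused variance here is 0.8, below the best single cue's variance of 1.0, which is the sense in which combining distributions yields a "more reliable" one.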