Population Coding, Bayesian Inference and Information Geometry

The present talk focuses on stochastic computation in the brain. The brain represents stimuli from the outer world by excitation patterns of neurons. Neural firing is stochastic, so these patterns are noisy and fluctuating. How can reliable computation be performed in such a noisy environment? Population coding addresses this problem by studying how a stimulus is represented in the activity of a population of neurons. We first present a statistical theory of population coding. It is believed that the brain keeps and processes information in the form of probability distributions before the final output command is decided, and Bayesian inference is useful for this purpose. We then present a new idea of how the brain integrates stochastic evidence arriving from different modalities; this is the problem of how several probability distributions are combined into a single, more reliable one. Information geometry studies the structure underlying families of probability distributions by means of modern differential geometry. We show how information-geometric concepts are useful in mathematical neuroscience.
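
The following is a minimal sketch, not taken from the talk itself, of the simplest textbook case of combining probability distributions into a more reliable one: two independent Gaussian cues about the same stimulus x, with means and variances mu_i, sigma_i^2 introduced here purely for illustration. Under a flat prior, Bayes' rule combines the cues by multiplying their likelihoods, and the combined estimate has smaller variance than either cue alone.

% Two independent Gaussian cues about the same stimulus x:
%   p_1(x) = N(x; mu_1, sigma_1^2),  p_2(x) = N(x; mu_2, sigma_2^2).
% With a flat prior, the Bayesian posterior is proportional to their
% product, which is again Gaussian:
\[
  p(x \mid \text{cue 1, cue 2}) \;\propto\; p_1(x)\,p_2(x)
  \;=\; \mathcal{N}\!\bigl(x;\,\hat{\mu},\,\hat{\sigma}^2\bigr),
\]
\[
  \hat{\mu} \;=\; \frac{\mu_1/\sigma_1^2 + \mu_2/\sigma_2^2}
                       {1/\sigma_1^2 + 1/\sigma_2^2},
  \qquad
  \frac{1}{\hat{\sigma}^2} \;=\; \frac{1}{\sigma_1^2} + \frac{1}{\sigma_2^2}.
\]
% Since the combined precision 1/\hat{\sigma}^2 exceeds each individual
% precision, \hat{\sigma}^2 < \min(\sigma_1^2, \sigma_2^2): the fused
% estimate is more reliable than either cue on its own.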