Principles of real-time computing with feedback applied to cortical microcircuit models

The network topology of neurons in the brain exhibits an abundance of feedback connections, but the computational function of these feedback connections is largely unknown. We present a computational theory that characterizes the gain in computational power achieved through feedback in dynamical systems with fading memory. It implies that many such systems acquire, through feedback, universal capabilities for analog computing with non-fading memory. In particular, we show that feedback enables such systems to process time-varying input streams in diverse ways, according to rules that are implemented through internal states of the dynamical system. In contrast to previous attractor-based computational models for neural networks, these flexible internal states are high-dimensional attractors of the circuit dynamics that still allow the circuit state to absorb new information from online input streams. In this way one arrives at novel models for working memory, integration of evidence, and reward expectation in cortical circuits. We show that these models are applicable to circuits of conductance-based Hodgkin-Huxley (HH) neurons with high levels of noise that reflect experimental data on in vivo conditions.
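The core mechanism can be illustrated with a toy numerical sketch: a random recurrent rate network whose autonomous dynamics are contracting (spectral radius below one) and therefore have fading memory, augmented with a readout whose output is fed back into the network. Without feedback, the response to a brief input pulse decays away; with feedback, the same pulse switches the circuit into a persistent internal state while the network remains driven by its input. The rate-based units, random weights, gains, and hand-designed readout below are illustrative assumptions only; the paper itself works with circuits of conductance-based Hodgkin-Huxley neurons and trained readouts.

```python
import numpy as np

rng = np.random.default_rng(1)
N, T, leak = 200, 400, 0.3

# Random recurrent weights scaled to spectral radius 0.9, so the autonomous
# dynamics are contracting: without feedback the network has fading memory.
W = rng.normal(0.0, 1.0, (N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

w_fb = rng.normal(0.0, 1.0, N)  # feedback weights from the readout into the network
w_in = w_fb.copy()              # the input pulse drives the same direction the readout
                                # senses (a simplification so the pulse can set the state)

def run(feedback_gain, readout_gain=2.0):
    """Simulate the network and return the readout trace z(t)."""
    x = np.zeros(N)
    trace = []
    for t in range(T):
        u = 2.0 if 20 <= t < 25 else 0.0               # brief input pulse
        z = np.tanh(readout_gain * np.mean(w_fb * x))  # memoryless readout of the state
        drive = W @ x + w_in * u + feedback_gain * w_fb * z
        x = (1.0 - leak) * x + leak * np.tanh(drive)   # leaky rate dynamics
        trace.append(z)
    return np.array(trace)

# Without feedback the pulse response decays back toward 0 (fading memory);
# with feedback the readout latches at a nonzero value long after the pulse.
print("final readout, no feedback:   %+.3f" % run(feedback_gain=0.0)[-1])
print("final readout, with feedback: %+.3f" % run(feedback_gain=2.0)[-1])
```

In this sketch the persistent state is carried by the entire high-dimensional network state sustained through the feedback loop, not by any single unit, which is the sense in which such attractors can coexist with ongoing processing of new input.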
