Learning the Synaptic and Intrinsic Membrane Dynamics Underlying Working Memory in Spiking Neural Network Models

Recurrent neural network (RNN) models trained to perform cognitive tasks are useful computational tools for understanding how cortical circuits execute complex computations. However, these models typically consist of units that interact via continuous signals and omit parameters intrinsic to spiking neurons. Here, we developed a method to directly train not only synaptic-related variables but also membrane-related parameters of a spiking RNN model. Training our model on a wide range of cognitive tasks produced diverse yet task-specific synaptic and membrane parameters. We also show that fast membrane time constants and slow synaptic decay dynamics naturally emerge when the model is trained on tasks associated with working memory (WM). Dissecting the optimized parameters further revealed that fast membrane properties are important for encoding stimuli, while slow synaptic dynamics support WM maintenance. This approach offers a unique window into how connectivity patterns and intrinsic neuronal properties contribute to complex dynamics in neural populations.
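
To make the described setup concrete, below is a minimal sketch, not the authors' released code, of a leaky integrate-and-fire (LIF) recurrent network in which per-neuron membrane time constants (tau_m) and synaptic decay time constants (tau_s) are trained jointly with the synaptic weights. The non-differentiable spike is handled with a surrogate gradient in the backward pass, a standard workaround for training spiking networks. All names, the threshold, and the time-constant initializations are illustrative assumptions.

```python
# Hedged sketch: a spiking RNN with trainable membrane and synaptic
# time constants. Assumed details: threshold = 1.0, fast-sigmoid
# surrogate gradient, log-parameterized time constants.
import torch
import torch.nn as nn


class SurrogateSpike(torch.autograd.Function):
    """Heaviside step in the forward pass, smooth pseudo-derivative backward."""

    @staticmethod
    def forward(ctx, v):
        ctx.save_for_backward(v)
        return (v > 0.0).float()

    @staticmethod
    def backward(ctx, grad_output):
        (v,) = ctx.saved_tensors
        # Pseudo-derivative of the step function (one common surrogate choice).
        return grad_output / (1.0 + 10.0 * v.abs()) ** 2


spike = SurrogateSpike.apply


class SpikingRNN(nn.Module):
    def __init__(self, n_in, n_rec, n_out, dt=1.0):
        super().__init__()
        self.dt = dt
        self.w_in = nn.Parameter(0.1 * torch.randn(n_in, n_rec))
        self.w_rec = nn.Parameter(torch.randn(n_rec, n_rec) / n_rec ** 0.5)
        self.w_out = nn.Parameter(0.1 * torch.randn(n_rec, n_out))
        # Per-neuron time constants, log-parameterized so they stay positive.
        self.log_tau_m = nn.Parameter(torch.log(torch.full((n_rec,), 10.0)))  # ms
        self.log_tau_s = nn.Parameter(torch.log(torch.full((n_rec,), 30.0)))  # ms

    def forward(self, x):                 # x: (T, batch, n_in)
        T, batch, _ = x.shape
        n_rec = self.w_rec.shape[0]
        v = x.new_zeros(batch, n_rec)     # membrane potentials
        s = x.new_zeros(batch, n_rec)     # filtered synaptic currents
        alpha = torch.exp(-self.dt / self.log_tau_m.exp())  # membrane decay
        beta = torch.exp(-self.dt / self.log_tau_s.exp())   # synaptic decay
        outputs = []
        for t in range(T):
            i_t = x[t] @ self.w_in + s            # total input current
            v = alpha * v + (1.0 - alpha) * i_t   # leaky integration
            z = spike(v - 1.0)                    # spike when v crosses 1.0
            v = v * (1.0 - z)                     # reset after a spike
            s = beta * s + z @ self.w_rec         # slow synaptic filtering
            outputs.append(s @ self.w_out)        # linear readout
        return torch.stack(outputs)               # (T, batch, n_out)
```

Under this parameterization, ordinary backpropagation through time with an optimizer such as `torch.optim.Adam(model.parameters(), lr=1e-3)` updates `w_rec`, `log_tau_m`, and `log_tau_s` together from a task loss on the readout. Inspecting the learned tau_m and tau_s distributions after training on a WM task is one way to perform the kind of parameter dissection the abstract describes, e.g. checking whether short membrane and long synaptic time constants emerge.
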
