Incremental parsing in a continuous dynamical system: sentence processing in Gradient Symbolic Computation

Abstract

Any incremental parser must solve two computational problems: (1) maintaining all interpretations consistent with the words that have been processed so far and (2) excluding all globally incoherent interpretations. While these problems are well understood, it is not clear how the dynamic, continuous mechanisms that underlie human language processing solve them. We introduce a Gradient Symbolic Computation (GSC) parser, a continuous-state, continuous-time stochastic dynamical-system model of symbolic processing, which builds up a discrete symbolic structure gradually by dynamically strengthening a discreteness constraint. Online, interactive tutorials with open-source software are presented on a companion website. Our results reveal that the GSC parser can solve the two computational problems by moving to a non-discrete blend state that evolves exclusively to discrete states representing contextually appropriate, globally coherent interpretations. In a simulation study using a simple formal grammar, we show that successful parsing requires appropriate control of the discreteness-constraint strength (a quantization policy). With inappropriate quantization policies, the GSC parser makes mistakes that mirror those made in natural language comprehension (garden-path or local-coherence errors). These findings suggest that the GSC model offers a neurally plausible solution to these two core problems.
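
To make the abstract's description concrete, the following is a minimal, hypothetical sketch of the core GSC idea: a continuous activation state evolves under a grammar-harmony gradient plus noise, while a discreteness ("quantization") term whose strength q(t) is gradually increased pulls the state from a blend toward a discrete interpretation. The toy harmony function, the parameter values, and the linear schedule for q(t) are illustrative assumptions for exposition only; they are not the paper's actual grammar, equations, or software.

# Minimal sketch (not the authors' implementation) of gradient dynamics
# with a gradually strengthened discreteness constraint (quantization policy).
# All functions and parameters below are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

def grammar_harmony_grad(x, W, b):
    # Gradient of a simple quadratic harmony H_G(x) = 0.5 x'Wx + b'x.
    return W @ x + b

def quantization_grad(x):
    # Gradient of Q(x) = -sum_i x_i^2 (1 - x_i)^2, which pushes each
    # activation toward the discrete values 0 or 1.
    return -2.0 * x * (1.0 - x) * (1.0 - 2.0 * x)

def run_gsc(W, b, T=2000, dt=0.01, sigma=0.05, q_max=20.0):
    # Euler-Maruyama integration of the stochastic dynamics
    #   dx = [grad H_G(x) + q(t) * grad Q(x)] dt + sigma dW.
    n = len(b)
    x = rng.uniform(0.4, 0.6, size=n)        # start near a blend state
    for t in range(T):
        q = q_max * t / T                     # linear quantization policy
        drift = grammar_harmony_grad(x, W, b) + q * quantization_grad(x)
        x += drift * dt + sigma * np.sqrt(dt) * rng.standard_normal(n)
        x = np.clip(x, 0.0, 1.0)
    return x

# Two mutually incompatible parse fragments: the harmony rewards choosing
# one and penalizes keeping both active (a stand-in for grammar constraints).
W = np.array([[0.0, -2.0],
              [-2.0, 0.0]])
b = np.array([1.0, 0.8])
print(run_gsc(W, b))    # typically ends near a discrete state such as [1, 0]

In this toy setting, raising q_max too quickly forces a premature commitment to whichever alternative the noise happens to favor early on, a rough analogue of the garden-path and local-coherence errors that the abstract attributes to inappropriate quantization policies.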
