SARDSRN: A Neural Network Shift-Reduce Parser

Simple Recurrent Networks (SRNs) have been widely used in natural language tasks. SARDSRN extends the SRN by explicitly representing the input sequence in a SARDNET self-organizing map. The distributed SRN component provides good generalization and robust cognitive properties, while the SARDNET map provides exact representations of the sentence constituents. This combination allows SARDSRN to learn to parse sentences with more complex structure than the SRN alone can handle, and suggests that the approach could scale up to realistic natural language.
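The SARDNET component referred to above encodes a sequence as a pattern of decaying activations on a self-organizing map: each input activates the nearest still-unassigned map unit, and earlier winners are decayed so that order is preserved in the activation strengths. The following is a minimal illustrative sketch of that encoding step only (training and neighborhood adaptation are omitted, and the function and parameter names are my own, not from the paper):

```python
import numpy as np

def sardnet_encode(sequence, weights, decay=0.9):
    """Encode a sequence of input vectors as a SARDNET-style activation map.

    For each input, the closest *still-unassigned* map unit is activated
    to 1.0 and removed from further competition; all previously activated
    units are first decayed by `decay`. The resulting pattern represents
    the whole sequence, with temporal order encoded in activation levels.
    """
    n_units = weights.shape[0]
    activation = np.zeros(n_units)
    available = set(range(n_units))       # units not yet claimed by an input
    for x in sequence:
        activation *= decay               # decay earlier winners first
        # winner = nearest available unit to the current input vector
        dists = {u: np.linalg.norm(weights[u] - x) for u in available}
        winner = min(dists, key=dists.get)
        activation[winner] = 1.0
        available.remove(winner)
    return activation

# Toy usage: a 16-unit map over 4-dimensional inputs, encoding 3 symbols.
rng = np.random.default_rng(0)
weights = rng.random((16, 4))
seq = [rng.random(4) for _ in range(3)]
act = sardnet_encode(seq, weights)
# Three distinct units are active, at strengths 0.81, 0.9, and 1.0
# (oldest input most decayed, newest at full activation).
```

Because each input claims a distinct unit, the map gives the parser an exact, lossless record of the constituents seen so far, complementing the SRN's compressed distributed state.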
