Proceedings: Proceedings of the AAAI Conference on Artificial Intelligence, 10
Track: Learning: Constructive and Linguistic
Abstract:
In order to be taken seriously, connectionist natural language processing systems must be able to parse syntactically complex sentences. Current connectionist parsers either ignore structure or impose prior restrictions on the structural complexity of the sentences they can process, limiting either the number of phrases or the "depth" of the sentence structure. XERIC networks, presented here, are distributed-representation connectionist parsers that can analyze and represent syntactically varied sentences, including ones with recursive phrase-structure constructs. The architecture places no a priori limits on the depth or length of sentences. XERIC networks use recurrent networks to read words one at a time. RAAM-style reduced descriptions and X-Bar grammar are combined to form an economical syntactic representation scheme, together with a training technique that allows XERIC to use multiple, virtual copies of its RAAM decoder network to learn to parse and represent sentence structure using gradient-descent methods. XERIC networks also perform number-person disambiguation and lexical disambiguation. Results show that the networks train to a few percent error for sentences up to a phrase-nesting depth of ten or more and that this performance generalizes well.
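The abstract's central representational idea, RAAM-style reduced descriptions of recursive phrase structure, can be illustrated with a minimal sketch. The code below is not the paper's XERIC implementation; the vector width, the random toy lexicon, and the untrained weights are illustrative assumptions only. It shows how a single encoder can compress two child constituents into one fixed-width "reduced description" and how a matching decoder can unpack it, which is the mechanism XERIC applies recursively (with virtual decoder copies during gradient-descent training) to represent arbitrarily nested sentences.

```python
# Hedged sketch of a RAAM-style encoder/decoder for binary-branching phrase
# structure. Weights here are random and untrained; in the actual approach the
# encoder/decoder pair is trained by gradient descent to reconstruct its inputs.
import numpy as np

rng = np.random.default_rng(0)
DIM = 16  # width of every constituent vector (assumed, not from the paper)

# Encoder maps two child vectors to one parent vector; decoder inverts it.
W_enc = rng.normal(0, 0.1, (DIM, 2 * DIM))
W_dec = rng.normal(0, 0.1, (2 * DIM, DIM))

def encode(left, right):
    """Compress two child representations into one reduced description."""
    return np.tanh(W_enc @ np.concatenate([left, right]))

def decode(parent):
    """Recover approximations of the two children from a reduced description."""
    out = np.tanh(W_dec @ parent)
    return out[:DIM], out[DIM:]

def encode_tree(tree, lexicon):
    """Recursively encode a nested tuple such as (('the', ('old', 'dog')), 'barks')."""
    if isinstance(tree, str):
        return lexicon[tree]
    left, right = tree
    return encode(encode_tree(left, lexicon), encode_tree(right, lexicon))

# Toy lexicon of random word vectors (assumed; the paper learns its word codes).
lexicon = {w: rng.normal(0, 0.5, DIM) for w in ["the", "old", "dog", "barks"]}

sentence = (("the", ("old", "dog")), "barks")
root = encode_tree(sentence, lexicon)     # one fixed-width vector for the tree
np_part, vp_part = decode(root)           # unpack the top-level constituents
print(root.shape, np_part.shape)          # (16,) (16,)
```

Because the parent vector has the same width as each child, the scheme can nest to any depth, which is why no architectural limit on phrase-nesting depth is needed; reconstruction accuracy, rather than representation size, is what degrades with depth.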