Dynamic Compositionality in Recursive Neural Networks with Structure-Aware Tag Representations

  • Taeuk Kim, Seoul National University
  • Jihun Choi, Seoul National University
  • Daniel Edmiston, University of Chicago
  • Sanghwan Bae, Seoul National University
  • Sang-goo Lee, Seoul National University

Abstract

Most existing recursive neural network (RvNN) architectures utilize only the structure of parse trees, ignoring syntactic tags which are provided as by-products of parsing. We present a novel RvNN architecture that can provide dynamic compositionality by considering comprehensive syntactic information derived from both the structure and linguistic tags. Specifically, we introduce a structure-aware tag representation constructed by a separate tag-level tree-LSTM. With this, we can control the composition function of the existing word-level tree-LSTM by feeding the tag representation as a supplementary input to the gate functions of the tree-LSTM. In extensive experiments, we show that models built upon the proposed architecture obtain superior or competitive performance on several sentence-level tasks, such as sentiment analysis and natural language inference, when compared against previous tree-structured models and other sophisticated neural models.
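The core mechanism in the abstract — feeding a tag representation into the gate functions of a word-level tree-LSTM — can be sketched as below. This is a minimal illustrative sketch, not the paper's actual parameterization: the dimensions, weight initialization, and names (`tag_aware_compose`, `params`) are all assumptions, and the real model derives the tag vector from a separate tag-level tree-LSTM rather than a fixed embedding.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def matvec(W, v):
    # Multiply a matrix (list of rows) by a vector.
    return [sum(w * x for w, x in zip(row, v)) for row in W]

def vadd(a, b):
    return [x + y for x, y in zip(a, b)]

def tag_aware_compose(h_l, c_l, h_r, c_r, tag, params):
    """Binary tree-LSTM composition step whose gates also see a tag vector.

    The input to every gate is the concatenation [h_left; h_right; e_tag],
    so the syntactic tag modulates how the two children are combined.
    """
    x = h_l + h_r + tag  # list concatenation = [h_l; h_r; e_tag]
    i   = [sigmoid(v)   for v in vadd(matvec(params["W_i"],  x), params["b_i"])]
    f_l = [sigmoid(v)   for v in vadd(matvec(params["W_fl"], x), params["b_f"])]
    f_r = [sigmoid(v)   for v in vadd(matvec(params["W_fr"], x), params["b_f"])]
    o   = [sigmoid(v)   for v in vadd(matvec(params["W_o"],  x), params["b_o"])]
    g   = [math.tanh(v) for v in vadd(matvec(params["W_g"],  x), params["b_g"])]
    # Standard tree-LSTM cell update with per-child forget gates.
    c = [ii * gg + fl * cl + fr * cr
         for ii, gg, fl, cl, fr, cr in zip(i, g, f_l, c_l, f_r, c_r)]
    h = [oo * math.tanh(cc) for oo, cc in zip(o, c)]
    return h, c

# Toy demo: hidden size 2, tag-embedding size 2, so gate input size is 6.
d, x_dim = 2, 6
params = {name: [[0.1] * x_dim for _ in range(d)]
          for name in ("W_i", "W_fl", "W_fr", "W_o", "W_g")}
params.update({name: [0.0] * d for name in ("b_i", "b_f", "b_o", "b_g")})

h_l, c_l = [0.5, -0.5], [0.3, 0.3]   # left child state
h_r, c_r = [0.1, 0.2], [0.0, 0.1]    # right child state
tag = [1.0, 0.0]                      # hypothetical tag embedding (e.g. for NP)

h, c = tag_aware_compose(h_l, c_l, h_r, c_r, tag, params)
```

Changing `tag` changes the gate activations and hence the composed parent state, which is the sense in which the composition is "dynamic" with respect to syntax.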

Published
2019-07-17
Section
AAAI Technical Track: Natural Language Processing