Dependency Grammar Induction with a Neural Variational Transition-Based Parser

  • Bowen Li University of Edinburgh
  • Jianpeng Cheng University of Edinburgh
  • Yang Liu University of Edinburgh
  • Frank Keller University of Edinburgh

Abstract

Dependency grammar induction is the task of learning dependency syntax without annotated training data. Traditional graph-based models with global inference achieve state-of-the-art results on this task, but they require O(n³) runtime. Transition-based models enable faster inference with O(n) time complexity, but their performance still lags behind. In this work, we propose a neural transition-based parser for dependency grammar induction, whose inference procedure utilizes rich neural features with O(n) time complexity. We train the parser with an integration of variational inference, posterior regularization and variance reduction techniques. The resulting framework outperforms previous unsupervised transition-based dependency parsers and achieves performance comparable to graph-based models, both on the English Penn Treebank and on the Universal Dependency Treebank. In an empirical comparison, we show that our approach substantially increases parsing speed over graph-based models.
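The O(n) complexity claim follows from the mechanics of transition-based parsing: a shift-reduce parser performs exactly 2n − 1 transitions for a sentence of n words (n shifts and n − 1 arc attachments). The following is a minimal sketch of that loop, assuming a generic arc-standard transition system; the abstract does not specify the paper's transition system, and `score_actions` is a hypothetical callback standing in for its neural scoring model.

```python
from collections import deque

def parse(words, score_actions):
    """Greedy arc-standard parse; returns a list of (head, dependent) arcs.

    `score_actions` is an assumed callback that, given the stack, buffer,
    and the legal actions, picks one of "SHIFT", "LEFT-ARC", "RIGHT-ARC".
    """
    stack = []
    buffer = deque(range(len(words)))   # word indices awaiting a shift
    arcs = []
    while buffer or len(stack) > 1:     # exactly 2n - 1 iterations total
        legal = []
        if buffer:
            legal.append("SHIFT")
        if len(stack) >= 2:
            legal += ["LEFT-ARC", "RIGHT-ARC"]
        action = score_actions(stack, buffer, legal)
        if action == "SHIFT":
            stack.append(buffer.popleft())
        elif action == "LEFT-ARC":      # top of stack heads second-top
            dep = stack.pop(-2)
            arcs.append((stack[-1], dep))
        else:                           # RIGHT-ARC: second-top heads top
            dep = stack.pop()
            arcs.append((stack[-1], dep))
    return arcs
```

Each word is shifted once and attached once, so the number of transitions grows linearly in sentence length, in contrast to graph-based parsers whose global inference over all head-dependent pairs costs O(n³).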

Published
2019-07-17
Section
AAAI Technical Track: Natural Language Processing