Joint Parsing and Generation for Abstractive Summarization

  • Kaiqiang Song University of Central Florida
  • Logan Lebanoff University of Central Florida
  • Qipeng Guo Fudan University
  • Xipeng Qiu Fudan University
  • Xiangyang Xue Fudan University
  • Chen Li Tencent AI Lab
  • Dong Yu Tencent AI Lab
  • Fei Liu University of Central Florida

Abstract

Sentences produced by abstractive summarization systems can be ungrammatical and fail to preserve the original meanings, despite being locally fluent. In this paper we propose to remedy this problem by jointly generating a sentence and its syntactic dependency parse while performing abstraction. If generating a word would introduce an erroneous relation into the summary, that behavior is discouraged. The proposed method thus holds promise for producing grammatical sentences and encouraging the summary to stay true to the original. The contributions of this work are twofold. First, we present a novel neural architecture for abstractive summarization that combines a sequential decoder with a tree-based decoder in a synchronized manner to generate a summary sentence and its syntactic parse. Second, we describe a novel human evaluation protocol to assess whether, and to what extent, a summary remains true to its original meanings. We evaluate our method on a number of summarization datasets and demonstrate competitive results against strong baselines.
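To make the synchronized sequence-plus-tree decoding idea concrete, the sketch below shows one way a decoder step could emit the next token and, at the same time, pick that token's dependency head from the tokens generated so far. This is a minimal illustration under assumed choices (an LSTM cell, a bilinear head scorer, greedy decoding, batch size one); the class name `JointSeqTreeDecoder` and all dimensions are hypothetical and do not reflect the authors' implementation.

```python
# Illustrative sketch of joint token + dependency-head decoding (not the paper's model).
import torch
import torch.nn as nn


class JointSeqTreeDecoder(nn.Module):
    def __init__(self, vocab_size, emb_dim=128, hid_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.rnn = nn.LSTMCell(emb_dim, hid_dim)
        self.vocab_proj = nn.Linear(hid_dim, vocab_size)       # sequential decoder: next-token scores
        self.head_bilinear = nn.Bilinear(hid_dim, hid_dim, 1)  # tree decoder: dependency-head scores

    def forward(self, prev_tokens):
        """Greedily predict the next token and the index of its dependency head.

        prev_tokens: LongTensor [T] of already generated token ids (T >= 1).
        """
        h = torch.zeros(1, self.rnn.hidden_size)
        c = torch.zeros(1, self.rnn.hidden_size)
        states = []  # decoder states of previously generated tokens
        for t in prev_tokens:
            h, c = self.rnn(self.embed(t.view(1)), (h, c))
            states.append(h)

        # Sequential decoder step: choose the next summary token.
        next_token = self.vocab_proj(h).argmax(dim=-1)          # shape [1]

        # Tree decoder step: score each earlier token as the head of the new token
        # (index 0 can be read as an artificial ROOT if one is prepended).
        new_h, _ = self.rnn(self.embed(next_token), (h, c))
        prev = torch.cat(states, dim=0)                          # [T, hid]
        scores = self.head_bilinear(new_h.repeat(prev.size(0), 1), prev).squeeze(-1)
        head_index = scores.argmax().item()
        return next_token.item(), head_index


# Toy usage: given three already generated token ids, predict the next token and its head.
decoder = JointSeqTreeDecoder(vocab_size=1000)
tok, head = decoder(torch.tensor([2, 45, 7]))
print(tok, head)
```

In this toy setup the head prediction could be used to veto a candidate token whose best-scoring dependency relation looks erroneous, which mirrors the abstract's point that such generations should be discouraged; how that check is actually enforced is specified in the paper itself, not here.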

Published
2020-04-03
Section
AAAI Technical Track: Natural Language Processing