Joint Parsing and Generation for Abstractive Summarization

Authors

  • Kaiqiang Song, University of Central Florida
  • Logan Lebanoff, University of Central Florida
  • Qipeng Guo, Fudan University
  • Xipeng Qiu, Fudan University
  • Xiangyang Xue, Fudan University
  • Chen Li, Tencent AI Lab
  • Dong Yu, Tencent AI Lab
  • Fei Liu, University of Central Florida

DOI:

https://doi.org/10.1609/aaai.v34i05.6419

Abstract

Sentences produced by abstractive summarization systems can be ungrammatical and fail to preserve the original meaning, despite being locally fluent. In this paper we propose to remedy this problem by jointly generating a sentence and its syntactic dependency parse while performing abstraction. If generating a word would introduce an erroneous relation into the summary, that behavior is discouraged. The proposed method thus holds promise for producing grammatical sentences and encouraging the summary to stay true to the original. The contributions of this work are twofold. First, we present a novel neural architecture for abstractive summarization that combines a sequential decoder with a tree-based decoder in a synchronized manner to generate a summary sentence and its syntactic parse. Second, we describe a novel human evaluation protocol to assess whether, and to what extent, a summary remains true to its original meaning. We evaluate our method on a number of summarization datasets and demonstrate competitive results against strong baselines.
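To illustrate the core idea of synchronizing a sequential decoder with a tree-based decoder, the following is a minimal PyTorch sketch, not the authors' implementation: at each decoding step it emits the next summary word and, via a pointer over previously generated decoder states, a dependency head for that word. All names (e.g., JointDecoderStep), dimensions, and the greedy decoding loop are hypothetical simplifications for illustration only.

```python
# Hypothetical sketch: joint word + dependency-head generation per step.
# Not the paper's architecture; a toy illustration of synchronized decoding.
import torch
import torch.nn as nn


class JointDecoderStep(nn.Module):
    def __init__(self, vocab_size, emb_dim=128, hid_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.rnn = nn.LSTMCell(emb_dim, hid_dim)        # sequential (word) decoder
        self.word_out = nn.Linear(hid_dim, vocab_size)  # next-word distribution
        self.head_query = nn.Linear(hid_dim, hid_dim)   # pointer query for head selection

    def forward(self, prev_word, state, prev_states):
        """prev_word: (batch,) token ids; state: (h, c); prev_states: (batch, t, hid)."""
        h, c = self.rnn(self.embed(prev_word), state)
        word_logits = self.word_out(h)  # which word to generate next
        # Tree side: point to one earlier step as the new word's head,
        # so a dependency structure grows in sync with the word sequence.
        query = self.head_query(h).unsqueeze(2)                 # (batch, hid, 1)
        head_logits = torch.bmm(prev_states, query).squeeze(2)  # (batch, t)
        new_states = torch.cat([prev_states, h.unsqueeze(1)], dim=1)
        return word_logits, head_logits, (h, c), new_states


# Toy usage: decode three steps greedily, recording (word, head) pairs.
torch.manual_seed(0)
step = JointDecoderStep(vocab_size=50)
batch = 1
state = (torch.zeros(batch, 256), torch.zeros(batch, 256))
prev_states = torch.zeros(batch, 1, 256)     # pseudo-ROOT state
word = torch.zeros(batch, dtype=torch.long)  # start token id 0
for t in range(3):
    word_logits, head_logits, state, prev_states = step(word, state, prev_states)
    word = word_logits.argmax(dim=-1)
    head = head_logits.argmax(dim=-1)
    print(f"step {t}: word id {word.item()}, head position {head.item()}")
```

In this sketch, discouraging erroneous relations would amount to penalizing head choices that conflict with the source parse during training; the paper should be consulted for how the authors actually couple the two decoders.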

Published

2020-04-03

How to Cite

Song, K., Lebanoff, L., Guo, Q., Qiu, X., Xue, X., Li, C., Yu, D., & Liu, F. (2020). Joint Parsing and Generation for Abstractive Summarization. Proceedings of the AAAI Conference on Artificial Intelligence, 34(05), 8894-8901. https://doi.org/10.1609/aaai.v34i05.6419

Section

AAAI Technical Track: Natural Language Processing