Document Summarization with VHTM: Variational Hierarchical Topic-Aware Mechanism

Authors

  • Xiyan Fu, Nankai University
  • Jun Wang, Ludong University
  • Jinghan Zhang, Nankai University
  • Jinmao Wei, Nankai University
  • Zhenglu Yang, Nankai University

DOI:

https://doi.org/10.1609/aaai.v34i05.6277

Abstract

Automatic text summarization focuses on distilling summary information from texts. This research field has been explored considerably over the past decades because of its significant role in many natural language processing tasks; however, two challenging issues block its further development: (1) how to build a summarization model with embedded topic inference rather than extending it with a pre-trained one, and (2) how to merge the latent topics into diverse granularity levels. In this study, we propose a variational hierarchical model, dubbed VHTM, to holistically address both issues. Unlike previous work assisted by a pre-trained single-grained topic model, VHTM is the first attempt to jointly accomplish summarization and topic inference via a variational encoder-decoder, and to merge topics into multi-grained levels through topic embedding and attention. Comprehensive experiments validate the superior performance of VHTM compared with the baselines, along with semantically consistent topics.
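The abstract's idea of merging a latent topic into attention can be illustrated with a minimal sketch. The code below shows additive (Bahdanau-style) attention extended with an extra topic term; all names, dimensions, and the exact scoring form are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def topic_aware_attention(enc_states, dec_state, topic_vec, W_h, W_s, W_z, v):
    """Score each encoder state using the decoder state AND a latent
    topic vector (a simplified sketch of topic-aware attention; the
    parameter names W_h, W_s, W_z, v are hypothetical)."""
    scores = np.array([
        v @ np.tanh(W_h @ h + W_s @ dec_state + W_z @ topic_vec)
        for h in enc_states
    ])
    weights = softmax(scores)          # attention distribution over source states
    context = weights @ enc_states     # topic-influenced context vector
    return weights, context

# Tiny demo with random parameters (dimensions chosen arbitrarily).
rng = np.random.default_rng(0)
d = 4
enc_states = rng.standard_normal((5, d))   # 5 encoder hidden states
dec_state = rng.standard_normal(d)
topic_vec = rng.standard_normal(d)         # latent topic from the variational encoder
weights, context = topic_aware_attention(
    enc_states, dec_state, topic_vec,
    rng.standard_normal((d, d)), rng.standard_normal((d, d)),
    rng.standard_normal((d, d)), rng.standard_normal(d),
)
```

In the full model, `topic_vec` would be sampled from the variational posterior rather than drawn at random, and a separate topic vector could feed each granularity level.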

Published

2020-04-03

How to Cite

Fu, X., Wang, J., Zhang, J., Wei, J., & Yang, Z. (2020). Document Summarization with VHTM: Variational Hierarchical Topic-Aware Mechanism. Proceedings of the AAAI Conference on Artificial Intelligence, 34(05), 7740-7747. https://doi.org/10.1609/aaai.v34i05.6277

Section

AAAI Technical Track: Natural Language Processing