Improving Knowledge-Aware Dialogue Generation via Knowledge Base Question Answering

  • Jian Wang South China University of Technology
  • Junhao Liu Chinese Academy of Sciences
  • Wei Bi Tencent AI Lab
  • Xiaojiang Liu Tencent AI Lab
  • Kejing He South China University of Technology
  • Ruifeng Xu Harbin Institute of Technology (Shenzhen)
  • Min Yang Chinese Academy of Sciences

Abstract

Neural network models usually struggle to incorporate commonsense knowledge into open-domain dialogue systems. In this paper, we propose a novel knowledge-aware dialogue generation model (called TransDG), which transfers question representation and knowledge matching abilities from the knowledge base question answering (KBQA) task to facilitate utterance understanding and factual knowledge selection for dialogue generation. In addition, we propose a response guiding attention and a multi-step decoding strategy to steer our model toward relevant features during response generation. Experiments on two benchmark datasets demonstrate that our model consistently outperforms the compared methods in generating informative and fluent dialogues. Our code is available at https://github.com/siat-nlp/TransDG.
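To make the idea of attending over retrieved facts concrete, the following is a minimal, hypothetical sketch of scoring candidate knowledge embeddings against an utterance representation with scaled dot-product attention; the function name and shapes are illustrative assumptions, not the TransDG implementation (see the repository above for the actual model).

```python
import numpy as np

def knowledge_attention(query, knowledge_embs):
    """Illustrative sketch (not the paper's code): weight candidate
    knowledge-fact embeddings by their scaled dot-product similarity
    to an utterance query vector and return a fused context vector."""
    d = query.shape[-1]
    scores = knowledge_embs @ query / np.sqrt(d)   # (num_facts,)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                       # softmax over facts
    context = weights @ knowledge_embs             # (d,) weighted sum
    return weights, context

# Toy usage: one 4-dim utterance query and three candidate fact embeddings.
rng = np.random.default_rng(0)
query = rng.normal(size=4)
facts = rng.normal(size=(3, 4))
weights, context = knowledge_attention(query, facts)
print(weights, context)
```

In the paper's setting, the query representation would come from the KBQA-pretrained encoder and the resulting knowledge context would condition the decoder at each generation step.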

Published
2020-04-03
Section
AAAI Technical Track: Natural Language Processing