Proceedings:
Proceedings of the AAAI Conference on Artificial Intelligence
Volume:
33
Issue:
No. 1: AAAI-19, IAAI-19, EAAI-20
Track:
AAAI Technical Track: AI and the Web
Abstract:
This paper proposes a novel method for generating compact answers to open-domain why-questions, such as the answer "Because deep learning technologies were introduced" to the question "Why did Google's machine translation service improve so drastically?" Although many works have dealt with why-question answering, most have focused on retrieving, as answers, relatively long text passages consisting of several sentences. Because of their length, such passages are not suitable for being read aloud by spoken dialog systems and smart speakers; hence, we need a method that generates compact answers. We developed a novel neural summarizer for this compact answer generation task. It combines a recurrent neural network-based encoder-decoder model with stacked convolutional neural networks and was designed to effectively exploit background knowledge, in this case a set of causal relations (e.g., "[Microsoft's machine translation has made great progress over the last few years](effect) since [it started to use deep learning](cause)") extracted from a large web data archive (4 billion web pages). Our experimental results show that our method achieved significantly better ROUGE F-scores than existing encoder-decoder models and their variations augmented with query-attention and memory networks, which are used to exploit the background knowledge.
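
To make the described architecture concrete, the following is a minimal, illustrative sketch of the general idea (a GRU encoder-decoder whose decoder is additionally conditioned on a stacked-CNN encoding of retrieved causal-relation text). It is not the authors' implementation; every module name, hyperparameter, and the way the background representation is injected are assumptions made only for illustration.

# Illustrative sketch only: an RNN encoder-decoder combined with a stacked-CNN
# encoder for background causal relations. Not the paper's actual model.
import torch
import torch.nn as nn


class CausalCNNEncoder(nn.Module):
    """Encodes tokenized causal-relation text with stacked 1-D convolutions."""
    def __init__(self, vocab_size, emb_dim=128, hidden=128, layers=3):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        convs, in_ch = [], emb_dim
        for _ in range(layers):
            convs += [nn.Conv1d(in_ch, hidden, kernel_size=3, padding=1), nn.ReLU()]
            in_ch = hidden
        self.convs = nn.Sequential(*convs)

    def forward(self, token_ids):                     # (batch, seq)
        x = self.embed(token_ids).transpose(1, 2)     # (batch, emb, seq)
        h = self.convs(x)                             # (batch, hidden, seq)
        return h.max(dim=2).values                    # pooled: (batch, hidden)


class CompactAnswerSummarizer(nn.Module):
    """GRU encoder-decoder conditioned on the input passage and on a pooled
    representation of retrieved causal relations (background knowledge)."""
    def __init__(self, vocab_size, emb_dim=128, hidden=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        self.encoder = nn.GRU(emb_dim, hidden, batch_first=True)
        self.background = CausalCNNEncoder(vocab_size, emb_dim, hidden)
        self.decoder = nn.GRU(emb_dim + hidden, hidden, batch_first=True)
        self.out = nn.Linear(hidden, vocab_size)

    def forward(self, passage_ids, causal_ids, answer_ids):
        _, h = self.encoder(self.embed(passage_ids))  # (1, batch, hidden)
        bg = self.background(causal_ids)              # (batch, hidden)
        dec_in = self.embed(answer_ids)               # (batch, tgt, emb)
        bg_tiled = bg.unsqueeze(1).expand(-1, dec_in.size(1), -1)
        dec_out, _ = self.decoder(torch.cat([dec_in, bg_tiled], dim=-1), h)
        return self.out(dec_out)                      # (batch, tgt, vocab)


# Toy forward pass with random token ids.
model = CompactAnswerSummarizer(vocab_size=1000)
passage = torch.randint(1, 1000, (2, 40))
causal = torch.randint(1, 1000, (2, 30))
answer = torch.randint(1, 1000, (2, 10))
print(model(passage, causal, answer).shape)           # torch.Size([2, 10, 1000])

In this sketch the background knowledge is simply max-pooled and concatenated to each decoder input step; the paper's query-attention and memory-network baselines, as well as its proposed model, exploit the causal relations in more sophisticated ways.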
DOI:
10.1609/aaai.v33i01.3301142