A Multi-Task Learning Machine Reading Comprehension Model for Noisy Document (Student Abstract)

Authors

  • Zhijing Wu Tsinghua University and BNRist
  • Hua Xu Tsinghua University and BNRist

DOI:

https://doi.org/10.1609/aaai.v34i10.7254

Abstract

Neural models for Machine Reading Comprehension (MRC) have achieved strong performance in recent years. However, these models are fragile and lack robustness to imperceptible adversarial perturbations of the input. In this work, we propose a multi-task learning MRC model with hierarchical knowledge enrichment to improve robustness to noisy documents. Our model follows a typical encode-align-decode framework. Additionally, we apply a hierarchical, coarse-to-fine method of adding background knowledge into the model to enhance the language representations. We further optimize the model by jointly training answer-span prediction and unanswerability prediction, aiming to improve robustness to noise. Experimental results on benchmark datasets confirm the effectiveness of our method, which achieves competitive performance compared with strong baselines.
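The abstract describes jointly training answer-span prediction with an unanswerability classifier. The paper does not provide an implementation, so the following is only a minimal PyTorch sketch of such a multi-task head and joint loss; the class name, the use of the first token as a pooled summary, and the `alpha` weighting between the two objectives are all assumptions for illustration, not details from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiTaskMRCHead(nn.Module):
    """Hypothetical multi-task head: span start/end logits plus an
    unanswerability logit, as sketched from the abstract."""
    def __init__(self, hidden_size: int):
        super().__init__()
        self.span_proj = nn.Linear(hidden_size, 2)  # per-token start/end logits
        self.na_proj = nn.Linear(hidden_size, 1)    # unanswerability logit

    def forward(self, hidden):  # hidden: (batch, seq_len, hidden_size)
        start_logits, end_logits = self.span_proj(hidden).split(1, dim=-1)
        # Assumption: use the first token's state as a pooled summary.
        na_logit = self.na_proj(hidden[:, 0])
        return start_logits.squeeze(-1), end_logits.squeeze(-1), na_logit.squeeze(-1)

def joint_loss(start_logits, end_logits, na_logit,
               start_pos, end_pos, is_unanswerable, alpha=0.5):
    """Joint objective: average span cross-entropy plus a weighted
    binary cross-entropy for unanswerability (alpha is an assumed knob)."""
    span_loss = (F.cross_entropy(start_logits, start_pos)
                 + F.cross_entropy(end_logits, end_pos)) / 2
    na_loss = F.binary_cross_entropy_with_logits(na_logit, is_unanswerable)
    return span_loss + alpha * na_loss
```

In this sketch, both tasks share the same encoder representations, so gradients from the unanswerability objective also shape the span predictor, which is one common way joint training is used to make MRC models less likely to force an answer out of noisy or irrelevant passages.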

Published

2020-04-03

How to Cite

Wu, Z., & Xu, H. (2020). A Multi-Task Learning Machine Reading Comprehension Model for Noisy Document (Student Abstract). Proceedings of the AAAI Conference on Artificial Intelligence, 34(10), 13963-13964. https://doi.org/10.1609/aaai.v34i10.7254

Section

Student Abstract Track