Generating Distractors for Reading Comprehension Questions from Real Examinations

Authors

  • Yifan Gao The Chinese University of Hong Kong
  • Lidong Bing Tencent AI Lab
  • Piji Li Tencent AI Lab
  • Irwin King The Chinese University of Hong Kong
  • Michael R. Lyu The Chinese University of Hong Kong

DOI:

https://doi.org/10.1609/aaai.v33i01.33016423

Abstract

We investigate the task of distractor generation for multiple-choice reading comprehension questions from examinations. In contrast to prior work, we do not aim to prepare word- or short-phrase distractors; instead, we endeavor to generate longer, semantically rich distractors that are closer to the distractors found in real examination reading comprehension. Given a reading comprehension article, a question, and its correct option as input, our goal is to generate several distractors that are somehow related to the answer, consistent with the semantic context of the question, and traceable in the article. We propose a hierarchical encoder-decoder framework with static and dynamic attention mechanisms to tackle this task. Specifically, the dynamic attention combines sentence-level and word-level attention, varying at each recurrent time step, to generate a more readable sequence. The static attention modulates the dynamic attention so that it does not focus on question-irrelevant sentences or on sentences that contribute to the correct option. Our proposed framework outperforms several strong baselines on the first prepared distractor generation dataset of real reading comprehension questions. In human evaluation, compared with distractors generated by the baselines, our generated distractors are more effective at confusing the annotators.
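The interplay of the two attention mechanisms described above can be sketched as follows. This is a minimal, hypothetical NumPy illustration, not the authors' implementation: a static distribution over sentences (computed once from the question and correct option) gates a dynamic, per-decoding-step combination of sentence-level and word-level attention; all scoring functions here are simplified dot products for illustration.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(0)

# Toy article: 3 sentences of 4 words each, hidden size 8
word_h = rng.normal(size=(3, 4, 8))   # word-level encoder states
sent_h = word_h.mean(axis=1)          # sentence-level representations
dec_state = rng.normal(size=(8,))     # decoder state at one time step

# Static attention: a single distribution over sentences, meant to
# down-weight question-irrelevant sentences and sentences supporting
# the correct option (random scores stand in for that computation)
gamma = softmax(rng.normal(size=(3,)))

# Dynamic attention: recomputed at every decoding time step
beta = softmax(sent_h @ dec_state)    # sentence-level attention
alpha = softmax(word_h @ dec_state)   # word-level attention per sentence

# Final word weights: static gate * dynamic sentence attention * word
# attention, renormalized over all words in the article
combined = (gamma * beta)[:, None] * alpha
combined = combined / combined.sum()

# Context vector fed to the decoder at this step
context = (combined[..., None] * word_h).sum(axis=(0, 1))
```

Because `gamma` is fixed across decoding steps while `beta` and `alpha` change with `dec_state`, the model can steer generation toward plausible but answer-irrelevant parts of the article throughout the sequence.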

Published

2019-07-17

How to Cite

Gao, Y., Bing, L., Li, P., King, I., & Lyu, M. R. (2019). Generating Distractors for Reading Comprehension Questions from Real Examinations. Proceedings of the AAAI Conference on Artificial Intelligence, 33(01), 6423-6430. https://doi.org/10.1609/aaai.v33i01.33016423

Section

AAAI Technical Track: Natural Language Processing