Neural Simile Recognition with Cyclic Multitask Learning and Local Attention

Authors

  • Jiali Zeng, Xiamen University
  • Linfeng Song, Tencent
  • Jinsong Su, Xiamen University
  • Jun Xie, Tencent
  • Wei Song, Capital Normal University
  • Jiebo Luo, University of Rochester

DOI:

https://doi.org/10.1609/aaai.v34i05.6496

Abstract

Simile recognition aims to detect simile sentences and to extract simile components, i.e., tenors and vehicles. It involves two subtasks: simile sentence classification and simile component extraction. Recent work has shown that standard multitask learning is effective for Chinese simile recognition, but it remains uncertain whether simple parameter sharing adequately captures the mutual effects between the subtasks. We propose a novel cyclic multitask learning framework for neural simile recognition, which stacks the subtasks and connects the last to the first to form a loop. It iteratively performs each subtask, taking the outputs of the previous subtask as additional inputs to the current one, so that the interdependence between the subtasks can be better explored. Extensive experiments show that our framework significantly outperforms the current state-of-the-art model and our carefully designed baselines, and the gains remain remarkable when using BERT. The source code of this paper is available at https://github.com/DeepLearnXMU/Cyclic.
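The cyclic control flow described in the abstract can be sketched as follows. This is a minimal, hypothetical illustration of the idea only, not the paper's implementation: the two "subtask models" below are toy stand-ins, and all function names and scoring rules are assumptions for demonstration.

```python
# Toy sketch of cyclic multitask learning: the two subtasks form a loop,
# and each iteration feeds the previous subtask's output back as an
# additional input to the next one. All models here are hypothetical.

def classify_sentence(features, component_scores):
    # Subtask 1: simile sentence classification, conditioned on the
    # component-extraction output from the previous step in the cycle.
    score = sum(features) + 0.5 * sum(component_scores)
    return 1 if score > 0 else 0

def extract_components(features, sentence_label):
    # Subtask 2: simile component extraction (tenor/vehicle scores),
    # conditioned on the sentence-level label from the previous subtask.
    return [f + 0.1 * sentence_label for f in features]

def cyclic_recognition(features, n_cycles=3):
    # Start from neutral outputs, then iterate the loop:
    # classification -> extraction -> classification -> ...
    component_scores = [0.0] * len(features)
    label = 0
    for _ in range(n_cycles):
        label = classify_sentence(features, component_scores)
        component_scores = extract_components(features, label)
    return label, component_scores
```

In the actual paper the subtasks are neural modules trained jointly; the point of the sketch is only the looped data flow, where each subtask's prediction becomes extra input for the next.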

Published

2020-04-03

How to Cite

Zeng, J., Song, L., Su, J., Xie, J., Song, W., & Luo, J. (2020). Neural Simile Recognition with Cyclic Multitask Learning and Local Attention. Proceedings of the AAAI Conference on Artificial Intelligence, 34(05), 9515-9522. https://doi.org/10.1609/aaai.v34i05.6496

Section

AAAI Technical Track: Natural Language Processing