Generalize Sentence Representation with Self-Inference

Authors

  • Kai-Chou Yang, National Cheng Kung University
  • Hung-Yu Kao, National Cheng Kung University

DOI:

https://doi.org/10.1609/aaai.v34i05.6481

Abstract

In this paper, we propose the Self Inference Neural Network (SINN), a simple yet efficient sentence encoder that leverages knowledge from recurrent and convolutional neural networks. SINN gathers semantic evidence in an interaction space, which is subsequently fused by a shared vector gate to determine the most relevant mixture of contextual information. We evaluate the proposed method on four benchmarks across three NLP tasks. Experimental results demonstrate that our model sets a new state of the art on MultiNLI and SciTail, and is competitive on the remaining two datasets among all sentence encoding methods. The encoding and inference process in our model is highly interpretable. Through visualizations of the fusion component, we open the black box of our network and explore the applicability of the base encoding methods case by case.
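The abstract describes fusing recurrent and convolutional views of a sentence through a shared vector gate. The sketch below illustrates that general idea only; the layer sizes, the BiLSTM/Conv1d choices, the gate parameterization, and the max-pooling step are assumptions for the example, not the authors' actual SINN architecture or the interaction-space mechanism from the paper.

```python
# Illustrative sketch: gated fusion of RNN and CNN sentence encodings.
# All hyperparameters and layer choices here are hypothetical.
import torch
import torch.nn as nn


class GatedFusionEncoder(nn.Module):
    def __init__(self, vocab_size, emb_dim=300, hidden_dim=300):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Recurrent view of the sentence (order-sensitive context).
        self.rnn = nn.LSTM(emb_dim, hidden_dim, batch_first=True, bidirectional=True)
        # Convolutional view of the sentence (local n-gram features).
        self.cnn = nn.Conv1d(emb_dim, 2 * hidden_dim, kernel_size=3, padding=1)
        # Shared vector gate deciding, per dimension, how to mix the two views.
        self.gate = nn.Linear(4 * hidden_dim, 2 * hidden_dim)

    def forward(self, token_ids):
        x = self.embed(token_ids)                                # (B, T, E)
        rnn_out, _ = self.rnn(x)                                 # (B, T, 2H)
        cnn_out = self.cnn(x.transpose(1, 2)).transpose(1, 2)    # (B, T, 2H)
        g = torch.sigmoid(self.gate(torch.cat([rnn_out, cnn_out], dim=-1)))
        fused = g * rnn_out + (1 - g) * cnn_out                  # gated mixture
        return fused.max(dim=1).values                           # sentence vector


if __name__ == "__main__":
    enc = GatedFusionEncoder(vocab_size=1000)
    sentences = torch.randint(0, 1000, (2, 12))                  # two toy sentences
    print(enc(sentences).shape)                                  # torch.Size([2, 600])
```

The gate outputs a value in (0, 1) per dimension, so each feature of the sentence vector can lean toward whichever encoder (recurrent or convolutional) provides the more relevant contextual evidence, which is the intuition the abstract attributes to the shared vector gate.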

Published

2020-04-03

How to Cite

Yang, K.-C., & Kao, H.-Y. (2020). Generalize Sentence Representation with Self-Inference. Proceedings of the AAAI Conference on Artificial Intelligence, 34(05), 9394-9401. https://doi.org/10.1609/aaai.v34i05.6481

Section

AAAI Technical Track: Natural Language Processing