Simultaneous Learning of Pivots and Representations for Cross-Domain Sentiment Classification

  • Liang Li Tsinghua University
  • Weirui Ye School of Software, BNRist, Tsinghua University
  • Mingsheng Long Tsinghua University
  • Yateng Tang Tencent Inc.
  • Jin Xu Tencent Inc.
  • Jianmin Wang Tsinghua University

Abstract

Cross-domain sentiment classification aims to leverage useful knowledge from a source domain to mitigate supervision sparsity in a target domain. A series of approaches depend on pivot features that behave similarly for polarity prediction in both domains. However, engineering such pivot features remains cumbersome and prevents us from learning disentangled and transferable representations from rich semantic and syntactic information. Towards learning pivots and representations simultaneously, we propose a new Transferable Pivot Transformer (TPT). Our model consists of two networks: a Pivot Selector that learns to detect transferable n-gram pivots from contexts, and a Transferable Transformer that learns to generate domain-invariant representations by modeling the correlation between pivot and non-pivot words. The Pivot Selector and Transferable Transformer are jointly optimized through end-to-end back-propagation. We experiment with real tasks of cross-domain sentiment classification over 20 domain pairs, where our model outperforms prior art.
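To make the notion of a "pivot" concrete: the manual feature engineering the abstract refers to is classically done with a statistical criterion in the style of structural correspondence learning, where a term qualifies as a pivot if it occurs in both domains and has high mutual information with the source-domain sentiment label. The sketch below illustrates that classical criterion only; it is not the paper's learned Pivot Selector, and the function names are illustrative.

```python
from collections import Counter
import math

def term_label_mi(docs, labels, term):
    """Mutual information between presence of `term` and the binary label."""
    n = len(docs)
    joint = Counter()
    for doc, y in zip(docs, labels):
        joint[(term in doc, y)] += 1
    mi = 0.0
    for (present, y), count in joint.items():
        p_xy = count / n
        p_x = sum(c for (p, _), c in joint.items() if p == present) / n
        p_y = sum(c for (_, yy), c in joint.items() if yy == y) / n
        mi += p_xy * math.log(p_xy / (p_x * p_y))
    return mi

def select_pivots(src_docs, src_labels, tgt_docs, k=10, min_tgt_freq=1):
    """Rank terms that appear in both domains by MI with the source label.

    Domain-specific terms (absent from the target) are excluded, since a
    pivot must behave similarly for polarity prediction in both domains.
    """
    tgt_df = Counter(w for d in tgt_docs for w in set(d))
    vocab = {w for d in src_docs for w in d if tgt_df[w] >= min_tgt_freq}
    scores = {w: term_label_mi(src_docs, src_labels, w) for w in vocab}
    return sorted(scores, key=scores.get, reverse=True)[:k]
```

For instance, with source reviews where "great" marks positive and "bad" marks negative labels, and both words also occurring in target-domain text, those two terms are selected, while source-only vocabulary (e.g. "plot") is filtered out regardless of its label correlation. TPT's contribution is to replace this fixed statistic with a selector trained jointly with the representation network.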

Published
2020-04-03
Section
AAAI Technical Track: Natural Language Processing