Monolingual Transfer Learning via Bilingual Translators for Style-Sensitive Paraphrase Generation

  • Tomoyuki Kajiwara Osaka University
  • Biwa Miura AI Samurai Inc.
  • Yuki Arase Osaka University

Abstract

We tackle the low-resource problem in style transfer by employing transfer learning that exploits abundantly available raw corpora. Our method consists of two steps: pre-training learns to generate a sentence that is semantically equivalent to the input with assured grammaticality, and fine-tuning learns to add a desired style. Pre-training has two options: an autoencoding-based and a machine-translation-based method. Pre-training based on an autoencoder is a simple way to learn these abilities from a raw corpus. If machine translators are available, the model can learn more diverse paraphrasing via roundtrip translation. After pre-training, fine-tuning achieves high-quality paraphrase generation even when only 1k sentence pairs of the parallel corpus for style transfer are available. Experimental results on formality style transfer indicate the effectiveness of both pre-training methods, and the method based on roundtrip translation achieves state-of-the-art performance.
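The roundtrip-translation pre-training described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the two "translators" are toy sentence-level lookup tables standing in for real bilingual MT systems, and all sentence pairs are hypothetical. The idea is to pair each raw-corpus sentence with its roundtrip translation, yielding pseudo-paraphrase pairs for pre-training.

```python
# Sketch of building pseudo-paraphrase pairs via roundtrip translation.
# NOTE: to_pivot / from_pivot are toy lookup tables, not real MT models.

def roundtrip(sentence, to_pivot, from_pivot):
    """Translate into a pivot language and back to obtain a paraphrase."""
    pivot = to_pivot[sentence]
    return from_pivot[pivot]

# Hypothetical English -> Japanese -> English translation tables.
to_pivot = {
    "thank you very much": "どうもありがとうございます",
    "i do not know": "わかりません",
}
from_pivot = {
    "どうもありがとうございます": "thanks a lot",
    "わかりません": "i have no idea",
}

raw_corpus = ["thank you very much", "i do not know"]

# Pseudo-parallel pre-training data: (input, roundtrip paraphrase) pairs.
pretraining_pairs = [(s, roundtrip(s, to_pivot, from_pivot)) for s in raw_corpus]
```

A paraphrase generator pre-trained on such pairs can then be fine-tuned on a small style-transfer parallel corpus (e.g., informal-to-formal sentence pairs) to add the desired style.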

Published
2020-04-03
Section
AAAI Technical Track: Natural Language Processing