Monolingual Transfer Learning via Bilingual Translators for Style-Sensitive Paraphrase Generation

Authors

  • Tomoyuki Kajiwara, Osaka University
  • Biwa Miura, AI Samurai Inc.
  • Yuki Arase, Osaka University

DOI

https://doi.org/10.1609/aaai.v34i05.6314

Abstract

We tackle the low-resource problem in style transfer by employing transfer learning that utilizes abundantly available raw corpora. Our method consists of two steps: pre-training learns to generate a sentence that is semantically equivalent to the input while assuring grammaticality, and fine-tuning learns to add the desired style. Pre-training has two options: an auto-encoding method and a machine-translation-based method. Pre-training based on an autoencoder is a simple way to learn semantic equivalence and grammaticality from a raw corpus. If machine translators are available, the model can instead learn more diverse paraphrasing via roundtrip translation. After pre-training, fine-tuning achieves high-quality paraphrase generation even when only 1k sentence pairs of the parallel corpus for style transfer are available. Experimental results on formality style transfer indicate the effectiveness of both pre-training methods; the method based on roundtrip translation achieves state-of-the-art performance.
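As a concrete illustration of the two pre-training options, the sketch below builds pseudo-parallel pairs from a raw corpus: identity pairs for auto-encoding, and roundtrip-translated pairs obtained via bilingual translators. This is a minimal sketch, not the authors' code; the MarianMT models from Hugging Face transformers and the English-French pivot are assumptions made for this example.

```python
# Sketch: constructing pseudo-parallel pre-training data (not the authors' code).
from transformers import MarianMTModel, MarianTokenizer

# Hypothetical pivot language pair; any available bilingual translators work.
EN_FR = "Helsinki-NLP/opus-mt-en-fr"
FR_EN = "Helsinki-NLP/opus-mt-fr-en"

tok_fwd = MarianTokenizer.from_pretrained(EN_FR)
mt_fwd = MarianMTModel.from_pretrained(EN_FR)
tok_bwd = MarianTokenizer.from_pretrained(FR_EN)
mt_bwd = MarianMTModel.from_pretrained(FR_EN)

def translate(sentences, tok, model):
    batch = tok(sentences, return_tensors="pt", padding=True, truncation=True)
    out = model.generate(**batch)
    return tok.batch_decode(out, skip_special_tokens=True)

def roundtrip(sentences):
    """English -> pivot -> English; the output is a pseudo-paraphrase."""
    return translate(translate(sentences, tok_fwd, mt_fwd), tok_bwd, mt_bwd)

raw_corpus = ["I can't make it to the meeting tomorrow."]

# Option 1 (auto-encoding): pair each raw sentence with itself.
ae_pairs = [(s, s) for s in raw_corpus]

# Option 2 (roundtrip translation): pair the pseudo-paraphrase with the
# original, exposing the model to more diverse rewritings than auto-encoding.
rt_pairs = list(zip(roundtrip(raw_corpus), raw_corpus))

# A seq2seq model pre-trained on either pair set is then fine-tuned on the
# small (~1k pairs) parallel style-transfer corpus, e.g. informal -> formal.
```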

Published

2020-04-03

How to Cite

Kajiwara, T., Miura, B., & Arase, Y. (2020). Monolingual Transfer Learning via Bilingual Translators for Style-Sensitive Paraphrase Generation. Proceedings of the AAAI Conference on Artificial Intelligence, 34(05), 8042-8049. https://doi.org/10.1609/aaai.v34i05.6314

Issue

Vol. 34 No. 05

Section

AAAI Technical Track: Natural Language Processing