Neural Machine Translation with Joint Representation

Authors

  • Yanyang Li, Northeastern University
  • Qiang Wang, Northeastern University
  • Tong Xiao, Northeastern University
  • Tongran Liu, CAS
  • Jingbo Zhu, Northeastern University

DOI:

https://doi.org/10.1609/aaai.v34i05.6344

Abstract

Though the early successes of Statistical Machine Translation (SMT) systems are attributed in part to explicit modelling of the interaction between any two source and target units, e.g., alignment, recent Neural Machine Translation (NMT) systems resort to attention, which only partially encodes these interactions, for the sake of efficiency. In this paper, we employ a Joint Representation that fully accounts for each possible interaction. We sidestep the resulting inefficiency by refining representations with the proposed efficient attention operation. The resulting Reformer models offer a new Sequence-to-Sequence modelling paradigm besides the Encoder-Decoder framework and outperform the Transformer baseline on both the small-scale IWSLT14 German-English, English-German, and IWSLT15 Vietnamese-English tasks and the large-scale NIST12 Chinese-English translation task by about 1 BLEU point. We also propose a systematic model scaling approach that allows the Reformer model to beat the state-of-the-art Transformer on IWSLT14 German-English and NIST12 Chinese-English with about 50% fewer parameters. The code is publicly available at https://github.com/lyy1994/reformer.
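The abstract sketches the core idea: a joint tensor holding one representation for every (target position, source position) pair, refined by attention applied along one axis of that tensor at a time so the cost stays close to ordinary attention. The snippet below is a minimal PyTorch sketch of that idea only, assuming a pairwise concatenate-and-project construction and per-axis multi-head attention; the names JointRepresentation and AxisAttention are illustrative and do not mirror the released code at https://github.com/lyy1994/reformer.

import torch
import torch.nn as nn

class JointRepresentation(nn.Module):
    # Illustrative sketch (not the authors' implementation): build a joint
    # tensor with one feature vector per (target i, source j) pair.
    def __init__(self, dim):
        super().__init__()
        self.proj = nn.Linear(2 * dim, dim)

    def forward(self, src, tgt):
        # src: [batch, src_len, dim], tgt: [batch, tgt_len, dim]
        b, s, d = src.shape
        t = tgt.size(1)
        src_exp = src.unsqueeze(1).expand(b, t, s, d)   # broadcast over the target axis
        tgt_exp = tgt.unsqueeze(2).expand(b, t, s, d)   # broadcast over the source axis
        return self.proj(torch.cat([tgt_exp, src_exp], dim=-1))  # [b, t, s, d]

class AxisAttention(nn.Module):
    # Hypothetical "efficient attention": standard multi-head attention applied
    # along a single axis of the joint tensor, so each call attends over one
    # sequence length rather than the full (tgt_len x src_len) grid.
    def __init__(self, dim, heads=4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, joint, axis="source"):
        b, t, s, d = joint.shape
        if axis == "source":
            x = joint.reshape(b * t, s, d)                  # attend over source positions
            out, _ = self.attn(x, x, x)
            return out.reshape(b, t, s, d)
        else:
            x = joint.transpose(1, 2).reshape(b * s, t, d)  # attend over target positions
            out, _ = self.attn(x, x, x)
            return out.reshape(b, s, t, d).transpose(1, 2)

# Toy usage: construct and refine a joint representation.
dim = 32
src = torch.randn(2, 7, dim)   # 2 sentences, 7 source tokens
tgt = torch.randn(2, 5, dim)   # 5 target tokens
joint = JointRepresentation(dim)(src, tgt)          # [2, 5, 7, 32]
refined = AxisAttention(dim)(joint, axis="source")  # same shape, source-wise attention

In the paper, stacks of such per-axis refinements take the place of the usual encoder/decoder split; the toy usage above only illustrates the tensor shapes involved, and a causal mask over the target axis would be needed for autoregressive decoding.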

Published

2020-04-03

How to Cite

Li, Y., Wang, Q., Xiao, T., Liu, T., & Zhu, J. (2020). Neural Machine Translation with Joint Representation. Proceedings of the AAAI Conference on Artificial Intelligence, 34(05), 8285-8292. https://doi.org/10.1609/aaai.v34i05.6344

Section

AAAI Technical Track: Natural Language Processing