TreeGen: A Tree-Based Transformer Architecture for Code Generation

Authors

  • Zeyu Sun, Peking University
  • Qihao Zhu, Peking University
  • Yingfei Xiong, Peking University
  • Yican Sun, Peking University
  • Lili Mou, University of Alberta
  • Lu Zhang, Peking University

DOI

https://doi.org/10.1609/aaai.v34i05.6430

Abstract

A code generation system generates programming language code based on an input natural language description. State-of-the-art approaches rely on neural networks for code generation. However, these code generators suffer from two problems. One is the long-dependency problem, where a code element often depends on another far-away code element; a variable reference, for example, depends on its definition, which may appear many lines earlier. The other problem is structure modeling, as programs contain rich structural information. In this paper, we propose a novel tree-based neural architecture, TreeGen, for code generation. TreeGen uses the attention mechanism of Transformers to alleviate the long-dependency problem and introduces a novel AST reader (encoder) to incorporate grammar rules and AST structures into the network. We evaluated TreeGen on a Python benchmark, HearthStone, and on two semantic parsing benchmarks, ATIS and GEO. TreeGen outperformed the previous state-of-the-art approach by 4.5 percentage points on HearthStone and achieved the best accuracy among neural-network-based approaches on ATIS (89.1%) and GEO (89.6%). We also conducted an ablation study to better understand each component of the model.
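
To make the architecture described in the abstract concrete, the sketch below shows one encoder block of a tree-based Transformer in PyTorch: standard self-attention over AST node embeddings, followed by a simple parent-child fusion step that injects tree structure. This is a minimal illustrative sketch only, not the authors' implementation; the class and parameter names (`ASTReaderBlock`, `tree_conv`, `parent_idx`) and the fusion scheme are hypothetical simplifications of the paper's richer AST reader, which also embeds grammar rules.

```python
import torch
import torch.nn as nn


class ASTReaderBlock(nn.Module):
    """Illustrative encoder block: self-attention + parent-child fusion.

    Hypothetical simplification of a tree-based Transformer layer;
    not TreeGen's actual AST reader.
    """

    def __init__(self, d_model: int = 256, n_heads: int = 8):
        super().__init__()
        self.self_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        # "Tree convolution" stand-in: mixes each node with its parent.
        self.tree_conv = nn.Linear(2 * d_model, d_model)
        self.ffn = nn.Sequential(
            nn.Linear(d_model, 4 * d_model),
            nn.GELU(),
            nn.Linear(4 * d_model, d_model),
        )
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)
        self.norm3 = nn.LayerNorm(d_model)

    def forward(self, nodes: torch.Tensor, parent_idx: torch.Tensor) -> torch.Tensor:
        # nodes: (batch, n_nodes, d_model) embeddings of AST nodes.
        # parent_idx: (batch, n_nodes) index of each node's parent
        #             (the root points to itself).
        attn_out, _ = self.self_attn(nodes, nodes, nodes)
        x = self.norm1(nodes + attn_out)
        # Gather each node's parent representation and fuse the pair,
        # giving the layer direct access to AST structure.
        parents = torch.gather(
            x, 1, parent_idx.unsqueeze(-1).expand(-1, -1, x.size(-1))
        )
        x = self.norm2(x + self.tree_conv(torch.cat([x, parents], dim=-1)))
        return self.norm3(x + self.ffn(x))


# Toy usage: a 4-node AST where node 0 is the root.
block = ASTReaderBlock()
nodes = torch.randn(1, 4, 256)
parent_idx = torch.tensor([[0, 0, 1, 1]])  # parents of nodes 0..3
print(block(nodes, parent_idx).shape)  # torch.Size([1, 4, 256])
```

The intuition matches the abstract: self-attention lets any node attend to far-away nodes (easing long dependencies), while the parent-child fusion encodes the structural information that a flat token sequence would lose.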

Published

2020-04-03

How to Cite

Sun, Z., Zhu, Q., Xiong, Y., Sun, Y., Mou, L., & Zhang, L. (2020). TreeGen: A Tree-Based Transformer Architecture for Code Generation. Proceedings of the AAAI Conference on Artificial Intelligence, 34(05), 8984-8991. https://doi.org/10.1609/aaai.v34i05.6430

Issue

Vol. 34 No. 05 (2020)

Section

AAAI Technical Track: Natural Language Processing