Temporal Pyramid Recurrent Neural Network

Authors

  • Qianli Ma, SCUT
  • Zhenxi Lin, SCUT
  • Enhuan Chen, SCUT
  • Garrison Cottrell, UCSD

DOI:

https://doi.org/10.1609/aaai.v34i04.5947

Abstract

Learning long-term and multi-scale dependencies in sequential data is a challenging task for recurrent neural networks (RNNs). In this paper, we propose a novel RNN structure, the temporal pyramid RNN (TP-RNN), to achieve both goals. TP-RNN has a pyramid-like structure and generally consists of multiple layers. Each layer contains several sub-pyramids connected to the output by a shortcut path, which efficiently aggregates historical information from the hidden states and provides many short gradient feedback paths. This avoids back-propagating through long chains of hidden states as in conventional RNNs. In particular, in the multi-layer structure of TP-RNN, the input sequence of a higher layer is the larger-scale aggregated state sequence produced by the sub-pyramids of the previous layer, rather than the usual sequence of hidden states. In this way, TP-RNN explicitly learns multi-scale dependencies from the multi-scale input sequences of its layers, and shortens both the input sequence and the gradient feedback paths of each layer. This avoids the vanishing gradient problem in deep RNNs and allows the network to learn long-term dependencies efficiently. We evaluate TP-RNN on several sequence modeling tasks, including the masked addition problem, pixel-by-pixel image classification, signal recognition, and speaker identification. Experimental results demonstrate that TP-RNN consistently outperforms existing RNNs at learning long-term and multi-scale dependencies in sequential data.
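
For intuition, below is a minimal sketch in PyTorch of the kind of layer-wise pyramid aggregation the abstract describes: each layer splits its input sequence into sub-pyramids, summarizes each sub-pyramid into one state, and passes the resulting shorter sequence to the next layer. The cell type (GRU), the sub-pyramid width `chunk_size`, aggregation by taking each sub-pyramid's final hidden state, and the mean-pooled readout are illustrative assumptions, not the paper's exact formulation.

```python
# Minimal sketch of pyramid-style aggregation, assuming a GRU cell and
# aggregation by each sub-pyramid's last hidden state (illustrative only).
import torch
import torch.nn as nn


class PyramidLayer(nn.Module):
    """One layer: split the input sequence into sub-pyramids of `chunk_size`
    steps, run a recurrent cell inside each, and emit one aggregated state
    per sub-pyramid, yielding a shorter sequence for the next layer."""

    def __init__(self, input_size, hidden_size, chunk_size):
        super().__init__()
        self.cell = nn.GRUCell(input_size, hidden_size)
        self.hidden_size = hidden_size
        self.chunk_size = chunk_size

    def forward(self, x):                      # x: (batch, time, input_size)
        batch, time, _ = x.shape
        outputs = []
        for start in range(0, time, self.chunk_size):
            h = x.new_zeros(batch, self.hidden_size)  # fresh state per sub-pyramid
            for t in range(start, min(start + self.chunk_size, time)):
                h = self.cell(x[:, t], h)
            outputs.append(h)                  # shortcut: keep only the aggregated state
        return torch.stack(outputs, dim=1)     # (batch, time // chunk_size, hidden)


class TPRNNSketch(nn.Module):
    """Stack of pyramid layers; each layer sees the coarser state sequence
    produced by the previous one, then a linear head reads the pooled states."""

    def __init__(self, input_size, hidden_size, chunk_size, num_layers, num_classes):
        super().__init__()
        layers, in_size = [], input_size
        for _ in range(num_layers):
            layers.append(PyramidLayer(in_size, hidden_size, chunk_size))
            in_size = hidden_size
        self.layers = nn.ModuleList(layers)
        self.head = nn.Linear(hidden_size, num_classes)

    def forward(self, x):
        for layer in self.layers:
            x = layer(x)                       # sequence shrinks at every layer
        return self.head(x.mean(dim=1))        # pool remaining states for the output

# Example: classify 64-step sequences of 8-dim features into 10 classes.
model = TPRNNSketch(input_size=8, hidden_size=32, chunk_size=4, num_layers=2, num_classes=10)
logits = model(torch.randn(16, 64, 8))         # -> (16, 10)
```

In this sketch each layer shortens the sequence by a factor of `chunk_size`, so the top layer reaches the raw input through far fewer recurrent steps than a standard RNN would need, which is the mechanism the abstract credits for shorter gradient feedback paths.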

Published

2020-04-03

How to Cite

Ma, Q., Lin, Z., Chen, E., & Cottrell, G. (2020). Temporal Pyramid Recurrent Neural Network. Proceedings of the AAAI Conference on Artificial Intelligence, 34(04), 5061-5068. https://doi.org/10.1609/aaai.v34i04.5947

Issue

Vol. 34 No. 04 (2020)

Section

AAAI Technical Track: Machine Learning