Bivariate Beta-LSTM

Authors

  • Kyungwoo Song, KAIST
  • JoonHo Jang, KAIST
  • Seung jae Shin, KAIST
  • Il-Chul Moon, KAIST

DOI:

https://doi.org/10.1609/aaai.v34i04.6039

Abstract

Long Short-Term Memory (LSTM) networks infer long-term dependencies through a cell state maintained by the input and forget gate structures, which model a gate output as a value in [0,1] through a sigmoid function. However, owing to the gradual shape of the sigmoid function, the sigmoid gate is not flexible enough to represent multi-modality or skewness. Moreover, previous models do not capture the correlation between the gates, which could serve as a new way to introduce an inductive bias on the relationship between the previous and current inputs. This paper proposes a new gate structure based on the bivariate Beta distribution. The proposed gate structure enables probabilistic modeling of the gates within the LSTM cell, so that modelers can customize the cell state flow with priors and distributions. Moreover, we theoretically show a higher upper bound on the gradient compared to the sigmoid function, and we empirically observe that the bivariate Beta gate structure yields larger gradient values during training. We demonstrate the effectiveness of the bivariate Beta gate structure on sentence classification, image classification, polyphonic music modeling, and image caption generation.
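As one way to see how gates with Beta marginals can be correlated, the classical shared-Gamma construction of a bivariate Beta distribution can be sketched as below. This is a minimal illustration, not the paper's exact model: the function name `bivariate_beta_gates` and the weight parameterization via `softplus` are hypothetical choices for the sketch. With independent G1 ~ Gamma(a), G2 ~ Gamma(b), G3 ~ Gamma(c), the ratios f = G1/(G1+G3) and i = G2/(G2+G3) have Beta(a, c) and Beta(b, c) marginals in (0, 1) and are correlated through the shared G3, so a forget gate and an input gate built this way are no longer independent.

```python
import numpy as np

def softplus(x):
    # Maps real-valued pre-activations to positive Beta/Gamma parameters.
    return np.log1p(np.exp(x))

def bivariate_beta_gates(x, Wa, Wb, Wc, rng):
    """Sample correlated gate pairs with Beta marginals (sketch).

    Shared-Gamma construction: f = G1/(G1+G3) ~ Beta(a, c) and
    i = G2/(G2+G3) ~ Beta(b, c) are positively correlated via G3.
    """
    a = softplus(x @ Wa) + 1e-4  # small offset keeps parameters strictly positive
    b = softplus(x @ Wb) + 1e-4
    c = softplus(x @ Wc) + 1e-4
    g1, g2, g3 = rng.gamma(a), rng.gamma(b), rng.gamma(c)
    f = g1 / (g1 + g3)  # forget-gate sample in (0, 1)
    i = g2 / (g2 + g3)  # input-gate sample in (0, 1)
    return f, i

# Toy usage: batch of 4 inputs of dimension 8, gate dimension 3.
rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8))
Wa, Wb, Wc = (rng.standard_normal((8, 3)) for _ in range(3))
f, i = bivariate_beta_gates(x, Wa, Wb, Wc, rng)
```

Unlike a sigmoid gate, these samples come from full distributions, so they can express skewness and, with suitable parameters, multi-modality across samples; the paper's actual construction additionally supports gradient-based training of the gate parameters.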

Published

2020-04-03

How to Cite

Song, K., Jang, J., Shin, S. jae, & Moon, I.-C. (2020). Bivariate Beta-LSTM. Proceedings of the AAAI Conference on Artificial Intelligence, 34(04), 5818-5825. https://doi.org/10.1609/aaai.v34i04.6039

Section

AAAI Technical Track: Machine Learning