Differentiable Algorithm for Marginalising Changepoints

Authors

  • Hyoungjin Lim, KAIST
  • Gwonsoo Che, KAIST
  • Wonyeol Lee, KAIST
  • Hongseok Yang, KAIST

DOI

https://doi.org/10.1609/aaai.v34i04.5918

Abstract

We present an algorithm for marginalising changepoints in time-series models that assume a fixed number of unknown changepoints. Our algorithm is differentiable with respect to its inputs, namely the values of the latent random variables other than the changepoints. It also runs in O(mn) time, where n is the number of time steps and m the number of changepoints, an improvement over naive marginalisation, which takes O(n^m) time. We derive the algorithm by identifying quantities related to this marginalisation problem, showing that these quantities satisfy recursive relationships, and transforming the relationships into an algorithm via dynamic programming. Because our algorithm is differentiable, it can be used to convert a model that is non-differentiable due to changepoints into a differentiable one, so that the resulting model can be analysed with gradient-based inference or learning techniques. We empirically demonstrate the effectiveness of our algorithm in this application by tackling posterior inference on synthetic and real-world data.
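The O(mn) recursion described in the abstract can be pictured as a forward pass over an (m+1)-row lattice of segment indices, in the spirit of an HMM forward algorithm. The sketch below is an illustrative reconstruction of that idea, not the authors' code: it assumes a model that factorises into per-step log-likelihoods log_emit[k, t] for the observation at time t under the parameters of segment k, and the function and array names are ours.

```python
# Illustrative sketch only (assumed model structure, not the paper's code):
# marginalise the positions of m ordered changepoints over n time steps
# in O(mn) time via a differentiable log-space dynamic programme.
import jax
import jax.numpy as jnp

def log_marginal(log_emit):
    """log_emit: (m+1, n) array of per-step segment log-likelihoods.

    Returns the log of the sum, over all placements of m ordered
    changepoints (m+1 non-empty segments), of the joint likelihood.
    """
    K, n = log_emit.shape  # K = m + 1 segments

    # alpha[k] = log-sum of likelihoods of prefixes currently in segment k.
    alpha0 = jnp.full((K,), -jnp.inf).at[0].set(log_emit[0, 0])

    def step(alpha, log_e):
        # At each step, either stay in the current segment or cross
        # exactly one changepoint into the next segment.
        stay = alpha
        advance = jnp.concatenate([jnp.array([-jnp.inf]), alpha[:-1]])
        return jnp.logaddexp(stay, advance) + log_e, None

    alpha, _ = jax.lax.scan(step, alpha0, log_emit[:, 1:].T)
    return alpha[-1]  # paths that used all m changepoints
```

Because every operation here is smooth, jax.grad(log_marginal) gives exact gradients of the marginal with respect to the per-step scores, illustrating how summing the changepoints out restores differentiability for gradient-based inference.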

Published

2020-04-03

How to Cite

Lim, H., Che, G., Lee, W., & Yang, H. (2020). Differentiable Algorithm for Marginalising Changepoints. Proceedings of the AAAI Conference on Artificial Intelligence, 34(04), 4828-4835. https://doi.org/10.1609/aaai.v34i04.5918

Section

AAAI Technical Track: Machine Learning