Variational Inference for Sparse Gaussian Process Modulated Hawkes Process

Authors

  • Rui Zhang, Australian National University
  • Christian Walder, Australian National University
  • Marian-Andrei Rizoiu, University of Technology Sydney

DOI:

https://doi.org/10.1609/aaai.v34i04.6160

Abstract

The Hawkes process (HP) has been widely applied to modeling self-exciting events, including neuron spikes, earthquakes, and tweets. To avoid hand-designing a parametric triggering kernel and to quantify prediction confidence, non-parametric Bayesian HPs have been proposed; however, inference in such models suffers from poor scalability or slow convergence. In this paper, we address both problems. First, we propose a new non-parametric Bayesian HP in which the triggering kernel is modeled as a squared sparse Gaussian process. Second, we propose a novel variational inference scheme for model optimization: we exploit the branching structure of the HP so that maximizing the evidence lower bound (ELBO) becomes tractable via the expectation-maximization algorithm, and we derive a tighter ELBO that improves fitting performance. Further, we accelerate this variational inference scheme to linear time complexity by leveraging the stationarity of the triggering kernel, achieving higher efficiency than prior acceleration methods. Finally, we evaluate our method on synthetic data and two large social media datasets, showing that it outperforms state-of-the-art non-parametric frequentist and Bayesian methods. We also validate the efficiency of our accelerated variational inference scheme and the practical utility of our tighter ELBO, which outperforms the common ELBO for model selection.

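For context, the following is a standard formulation of the Hawkes process intensity with the triggering kernel modulated by a squared Gaussian process, as described in the abstract; the notation is illustrative and not taken verbatim from the paper:

\[
\lambda(t) = \mu + \sum_{t_i < t} \phi(t - t_i), \qquad \phi(\tau) = f(\tau)^2, \quad f \sim \mathcal{GP}(0, k),
\]

where \(\mu > 0\) is the background intensity, the sum runs over past event times \(t_i\), and squaring the Gaussian process sample \(f\) keeps the triggering kernel \(\phi\) non-negative.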

Published

2020-04-03

How to Cite

Zhang, R., Walder, C., & Rizoiu, M.-A. (2020). Variational Inference for Sparse Gaussian Process Modulated Hawkes Process. Proceedings of the AAAI Conference on Artificial Intelligence, 34(04), 6803-6810. https://doi.org/10.1609/aaai.v34i04.6160


Section

AAAI Technical Track: Machine Learning