Beyond Unfolding: Exact Recovery of Latent Convex Tensor Decomposition Under Reshuffling

Authors

  • Chao Li, RIKEN
  • Mohammad Emtiyaz Khan, RIKEN
  • Zhun Sun, RIKEN
  • Gang Niu, RIKEN
  • Bo Han, RIKEN
  • Shengli Xie, Guangdong University of Technology
  • Qibin Zhao, RIKEN

DOI:

https://doi.org/10.1609/aaai.v34i04.5890

Abstract

Exact recovery is a desirable property of tensor decomposition (TD) methods in both unsupervised learning and scientific data analysis. The numerical defects of TD methods, however, limit their practical application to real-world data. As an alternative, convex tensor decomposition (CTD) was proposed to alleviate these problems, but its exact-recovery property has not been properly addressed so far. To this end, we focus on latent convex tensor decomposition (LCTD), a CTD model widely used in practice, and rigorously prove a sufficient condition for its exact-recovery property. Furthermore, we show that this property can also be achieved by a model more general than LCTD. In the new model, we generalize classic tensor (un-)folding into a reshuffling operation, a more flexible mapping that relocates the entries of a matrix into a tensor. Armed with the reshuffling operation and the exact-recovery property, we explore a novel application for (generalized) LCTD, namely image steganography. Experimental results on synthetic data validate our theory, and results on image steganography show that our method outperforms state-of-the-art methods.
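To make the reshuffling idea concrete, the following is a minimal sketch (not the authors' implementation, whose details are not given in this abstract): it treats reshuffling as an arbitrary bijective relocation of matrix entries into a tensor, with classic mode-k unfolding recovered as the special case where the bijection is the standard axis-rolling reshape. The function names and the permutation-based signature are illustrative assumptions.

```python
import numpy as np

def reshuffle(matrix, tensor_shape, perm):
    """Relocate the entries of `matrix` into a tensor of shape `tensor_shape`
    via a bijective index map `perm` (a permutation of matrix.size).
    Hypothetical interface illustrating the reshuffling operation."""
    flat = matrix.reshape(-1)
    return flat[perm].reshape(tensor_shape)

def unfold(tensor, mode):
    """Classic mode-`mode` unfolding: the special case of reshuffling whose
    index map is the standard reshape after moving `mode` to the front."""
    return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)

# Usage: fold a 4x6 matrix into a 2x3x4 tensor with a random bijection.
rng = np.random.default_rng(0)
M = rng.standard_normal((4, 6))
perm = rng.permutation(M.size)
T = reshuffle(M, (2, 3, 4), perm)
assert sorted(T.reshape(-1)) == sorted(M.reshape(-1))  # same multiset of entries
```

The point of the generalization is that any such bijection, not only the fixed reshape used by (un-)folding, can connect the matrix and tensor views, which is what enables the image-steganography application described above.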

Published

2020-04-03

How to Cite

Li, C., Khan, M. E., Sun, Z., Niu, G., Han, B., Xie, S., & Zhao, Q. (2020). Beyond Unfolding: Exact Recovery of Latent Convex Tensor Decomposition Under Reshuffling. Proceedings of the AAAI Conference on Artificial Intelligence, 34(04), 4602-4609. https://doi.org/10.1609/aaai.v34i04.5890

Issue

Vol. 34 No. 04 (2020)

Section

AAAI Technical Track: Machine Learning