Polynomial Matrix Completion for Missing Data Imputation and Transductive Learning

Authors

  • Jicong Fan, Cornell University
  • Yuqian Zhang, Cornell University
  • Madeleine Udell, Cornell University

DOI:

https://doi.org/10.1609/aaai.v34i04.5796

Abstract

This paper develops new methods to recover the missing entries of a high-rank or even full-rank matrix when the intrinsic dimension of the data is low compared to the ambient dimension. Specifically, we assume that the columns of a matrix are generated by polynomials acting on a low-dimensional intrinsic variable, and wish to recover the missing entries under this assumption. We show that we can identify the complete matrix of minimum intrinsic dimension by minimizing the rank of the matrix in a high-dimensional feature space. We develop a new formulation of the resulting problem using the kernel trick together with a new relaxation of the rank objective, and propose an efficient optimization method. We also show how to use our methods to complete data drawn from multiple nonlinear manifolds. Comparative studies on synthetic data, subspace clustering with missing data, motion capture data recovery, and transductive learning verify the superiority of our methods over the state of the art.
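As a rough illustration of the idea described in the abstract (not the authors' exact algorithm), the sketch below fills missing entries by minimizing a smooth surrogate of the rank of a polynomial kernel matrix computed over the columns. The function names (poly_kernel, complete), the log-det rank surrogate, and all hyperparameters (degree, gamma, coef0, eps, lr, iters) are illustrative assumptions; the paper's actual relaxation of the rank objective and its optimization method differ.

# Minimal sketch of kernelized matrix completion with a polynomial kernel.
# Assumptions: columns lie near a low-dimensional algebraic variety, the rank
# surrogate is log det of the kernel matrix, and plain gradient descent on the
# missing entries suffices. All names and hyperparameters are illustrative.
import torch

def poly_kernel(X, degree=2, gamma=None, coef0=1.0):
    """Polynomial kernel over columns of X: K_ij = (gamma <x_i, x_j> + coef0)^degree."""
    if gamma is None:
        gamma = 1.0 / X.shape[0]
    return (gamma * (X.T @ X) + coef0) ** degree

def complete(X_obs, mask, degree=2, eps=1e-3, lr=0.1, iters=2000):
    """Fill missing entries (mask == 0) by minimizing log det(K + eps I),
    a smooth surrogate for the rank of the polynomial kernel matrix."""
    X_obs = torch.nan_to_num(torch.as_tensor(X_obs, dtype=torch.float64))
    mask = torch.as_tensor(mask, dtype=torch.float64)
    # Free variables for the missing entries only (observed entries stay fixed).
    Z = torch.zeros_like(X_obs, requires_grad=True)
    opt = torch.optim.Adam([Z], lr=lr)
    eye = torch.eye(X_obs.shape[1], dtype=torch.float64)
    for _ in range(iters):
        opt.zero_grad()
        X = mask * X_obs + (1 - mask) * Z
        K = poly_kernel(X, degree=degree)
        loss = torch.logdet(K + eps * eye)  # sum of log eigenvalues: rank surrogate
        loss.backward()
        opt.step()
    with torch.no_grad():
        return (mask * X_obs + (1 - mask) * Z).numpy()

Usage (hypothetical): given a data matrix X_obs (features by samples) with NaNs at missing positions and a binary mask of the same shape marking observed entries with 1, X_hat = complete(X_obs, mask, degree=2) returns a completed matrix; a higher degree allows the columns to follow more curved (higher-order polynomial) structure at the cost of a harder optimization.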

Published

2020-04-03

How to Cite

Fan, J., Zhang, Y., & Udell, M. (2020). Polynomial Matrix Completion for Missing Data Imputation and Transductive Learning. Proceedings of the AAAI Conference on Artificial Intelligence, 34(04), 3842-3849. https://doi.org/10.1609/aaai.v34i04.5796

Section

AAAI Technical Track: Machine Learning