Proceedings: Proceedings of the AAAI Conference on Artificial Intelligence
Volume: 33
Issue: No. 1: AAAI-19, IAAI-19, EAAI-20
Track: AAAI Technical Track: Machine Learning
Abstract:
We show how to solve a number of problems in numerical linear algebra, such as least squares regression, ℓ_p-regression for any p ≥ 1, low rank approximation, and kernel regression, in time T(A) · poly(log(nd)), where for a given input matrix A ∈ R^{n×d}, T(A) is the time needed to compute A · y for an arbitrary vector y ∈ R^d. Since T(A) ≤ O(nnz(A)), where nnz(A) denotes the number of non-zero entries of A, the time is no worse, up to polylogarithmic factors, than that of all the recent advances for such problems that run in input-sparsity time. However, for many applications, T(A) can be much smaller than nnz(A), yielding significantly sublinear time algorithms. For example, in the overconstrained (1+ε)-approximate polynomial interpolation problem, A is a Vandermonde matrix and T(A) = O(n log n); in this case our running time is n · poly(log n) + poly(d/ε) and we recover the results of Avron, Sindhwani, and Woodruff (2013) as a special case. For overconstrained autoregression, which is a common problem arising in dynamical systems, T(A) = O(n log n), and we immediately obtain n · poly(log n) + poly(d/ε) time. For kernel autoregression, we significantly improve the running time of prior algorithms for general kernels. For the important case of autoregression with the polynomial kernel and arbitrary target vector b ∈ R^n, we obtain even faster algorithms. Our algorithms show that, perhaps surprisingly, most of these optimization problems do not require much more time than that of a polylogarithmic number of matrix-vector multiplications.
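To make the matrix-vector-product viewpoint concrete, here is a minimal Python sketch (not the paper's algorithm, which relies on sketching techniques): for overconstrained autoregression the design matrix A is Toeplitz, so A · y and Aᵀ · x can each be applied in O(n log n) time with an FFT-based circulant embedding, and an iterative solver such as LSQR that touches A only through such products then costs roughly (number of iterations) × T(A). The helper names (toeplitz_matvec, ar_design), the AR order d = 8, and the random-walk example data are our own choices, for illustration only.

import numpy as np
from scipy.sparse.linalg import LinearOperator, lsqr


def toeplitz_matvec(col, row, y):
    """Multiply the Toeplitz matrix with first column `col` and first row `row`
    by the vector `y` in O(n log n) time via circulant embedding and the FFT."""
    nrows = len(col)
    # First column of the (nrows + ncols - 1)-dimensional circulant embedding.
    circ = np.concatenate([col, row[:0:-1]])
    y_pad = np.concatenate([y, np.zeros(nrows - 1)])
    # Circulant times vector = inverse FFT of the elementwise product of FFTs.
    return np.fft.ifft(np.fft.fft(circ) * np.fft.fft(y_pad)).real[:nrows]


def ar_design(series, d):
    """Order-d autoregression design matrix for `series`, as a LinearOperator
    whose matvec and rmatvec each cost O(n log n) instead of O(nd)."""
    n = len(series) - d                 # number of targets / rows of A
    col = series[d - 1 : d - 1 + n]     # A[:, 0]: lag-1 values
    row = series[d - 1 :: -1]           # A[0, :]: lags 1..d of the first target
    b = series[d:]                      # right-hand side: values to predict
    A = LinearOperator(
        (n, d),
        matvec=lambda y: toeplitz_matvec(col, row, y),
        rmatvec=lambda x: toeplitz_matvec(row, col, x),
        dtype=float,
    )
    return A, b


# Usage: fit AR(8) coefficients to a random walk.  LSQR touches A only through
# matrix-vector products, so the total cost is (#iterations) x O(n log n).
rng = np.random.default_rng(0)
x = np.cumsum(rng.standard_normal(4096))
A, b = ar_design(x, d=8)
coeffs = lsqr(A, b, atol=1e-10, btol=1e-10)[0]
print(coeffs)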
DOI:
10.1609/aaai.v33i01.33014918