Divide-and-Conquer Learning with Nyström: Optimal Rate and Algorithm

Authors

  • Rong Yin Chinese Academy of Sciences
  • Yong Liu Chinese Academy of Sciences
  • Lijing Lu Chinese Academy of Sciences
  • Weiping Wang Chinese Academy of Sciences
  • Dan Meng Chinese Academy of Sciences

DOI:

https://doi.org/10.1609/aaai.v34i04.6147

Abstract

Kernel Regularized Least Squares (KRLS) is a fundamental learner in machine learning. However, its high time and space requirements make it impractical for large-scale scenarios. We therefore propose DC-NY, a novel algorithm that combines the divide-and-conquer method, Nyström approximation, conjugate gradient, and preconditioning to scale up KRLS; it matches the accuracy of exact KRLS while attaining the lowest time and space complexity among state-of-the-art approximate KRLS estimators. We present a theoretical analysis of DC-NY, including a novel error decomposition with optimal statistical accuracy guarantees. Extensive experimental results on several real-world large-scale datasets containing up to 1M data points show that DC-NY significantly outperforms state-of-the-art approximate KRLS estimators.
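The abstract's recipe can be illustrated with a minimal sketch: partition the data, fit a Nyström-approximated KRLS model on each partition, and average the partition predictors. This is an illustrative reconstruction, not the authors' implementation; all function names, the RBF kernel choice, and the closed-form solve (standing in for the paper's preconditioned conjugate gradient solver) are assumptions for the sketch.

```python
import numpy as np

def rbf(X, Y, gamma=1.0):
    # Gaussian (RBF) kernel matrix between rows of X and rows of Y.
    d = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-gamma * d)

def nystrom_krls_fit(X, y, m=50, lam=1e-3, gamma=1.0, seed=None):
    # Nystrom-approximate KRLS on one partition: sample m landmark
    # points and solve the reduced m x m regularized system instead
    # of the full n x n kernel system.
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X), size=min(m, len(X)), replace=False)
    Z = X[idx]                      # landmark points
    Knm = rbf(X, Z, gamma)          # n x m cross-kernel
    Kmm = rbf(Z, Z, gamma)          # m x m landmark kernel
    A = Knm.T @ Knm + lam * len(X) * Kmm
    # Tiny jitter keeps the reduced system numerically invertible.
    alpha = np.linalg.solve(A + 1e-10 * np.eye(len(Z)), Knm.T @ y)
    return Z, alpha

def dc_nystrom_predict(X, y, Xtest, p=4, gamma=1.0, **kw):
    # Divide-and-conquer: fit an independent Nystrom-KRLS model on each
    # of p random partitions and average the partition predictions.
    rng = np.random.default_rng(kw.pop("seed", None))
    preds = []
    for part in np.array_split(rng.permutation(len(X)), p):
        Z, alpha = nystrom_krls_fit(X[part], y[part], gamma=gamma, **kw)
        preds.append(rbf(Xtest, Z, gamma) @ alpha)
    return np.mean(preds, axis=0)
```

Each partition costs roughly O((n/p)·m²) time and O((n/p)·m) space for the reduced solve, which is the source of the savings over exact KRLS's O(n³)/O(n²); the paper's accuracy guarantees govern how small m and how large p may be while preserving the optimal rate.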

Published

2020-04-03

How to Cite

Yin, R., Liu, Y., Lu, L., Wang, W., & Meng, D. (2020). Divide-and-Conquer Learning with Nyström: Optimal Rate and Algorithm. Proceedings of the AAAI Conference on Artificial Intelligence, 34(04), 6696-6703. https://doi.org/10.1609/aaai.v34i04.6147

Section

AAAI Technical Track: Machine Learning