Projective Quadratic Regression for Online Learning

Authors

  • Wenye Ma, Tencent

DOI:

https://doi.org/10.1609/aaai.v34i04.5951

Abstract

This paper considers online convex optimization (OCO) problems, the paramount framework for online learning algorithm design. In the OCO setting the loss function of the learning task is defined over streaming data, which makes OCO a powerful tool for modeling large-scale applications such as online recommender systems. Meanwhile, real-world data are usually extremely high-dimensional due to modern feature engineering techniques, so quadratic regression is impractical. Factorization Machines and their variants are efficient models that capture feature interactions with a low-rank matrix, but they cannot fulfill the OCO setting because of their non-convexity. In this paper, we propose a projective quadratic regression (PQR) model. First, it can capture the important second-order feature information. Second, it is a convex model, so the requirements of OCO are fulfilled and the globally optimal solution can be achieved. Moreover, existing modern online optimization methods such as Online Gradient Descent (OGD) and Follow-The-Regularized-Leader (FTRL) can be applied directly. In addition, by choosing a proper hyper-parameter, we show that it has the same order of space and time complexity as the linear model and can thus handle high-dimensional data. Experimental results demonstrate the accuracy and efficiency of the proposed PQR model in comparison with state-of-the-art methods.
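The abstract does not spell out the PQR construction itself, but it names the OCO setting it plugs into: a convex per-round loss over streaming examples, updated with OGD or FTRL. The following minimal Python sketch illustrates that setting only. The quadratic feature it uses, built from a fixed random projection P of dimension k, is a hypothetical stand-in for the paper's projective quadratic term, not the method itself; it merely shows how a k-dimensional projection keeps per-round cost at O(d + k), the same order as a linear model.

import numpy as np

# Minimal OGD sketch for the OCO setting described in the abstract.
# NOTE: the quadratic feature below (model sees [x, (P x)^2] for a fixed
# random projection P) is a hypothetical stand-in, NOT the paper's exact
# PQR construction.

rng = np.random.default_rng(0)
d, k, T = 1000, 10, 500                        # input dim, projection dim, rounds
P = rng.standard_normal((k, d)) / np.sqrt(d)   # fixed projection matrix
w = np.zeros(d + k)                            # weights; the loss below is convex in w

def features(x):
    z = P @ x
    return np.concatenate([x, z * z])          # linear part + projected quadratic part

for t in range(1, T + 1):
    x = rng.standard_normal(d)                 # streaming example
    y = np.sign(x[:5].sum()) or 1.0            # synthetic label in {-1, +1}
    phi = features(x)
    margin = y * (w @ phi)
    # logistic loss is convex in w, so OGD's standard regret bound applies
    grad = -y * phi / (1.0 + np.exp(margin))
    eta = 1.0 / np.sqrt(t)                     # standard O(1/sqrt(t)) step size
    w -= eta * grad

Because the model is convex in w, any OCO algorithm (OGD as above, or FTRL) can replace the update step without changing the rest of the loop.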

Published

2020-04-03

How to Cite

Ma, W. (2020). Projective Quadratic Regression for Online Learning. Proceedings of the AAAI Conference on Artificial Intelligence, 34(04), 5093-5100. https://doi.org/10.1609/aaai.v34i04.5951

Section

AAAI Technical Track: Machine Learning