AAAI Publications, Thirty-First AAAI Conference on Artificial Intelligence

Efficient Stochastic Optimization for Low-Rank Distance Metric Learning
Jie Zhang, Lijun Zhang

Last modified: 2017-02-12

Abstract


Although distance metric learning has been successfully applied to many real-world applications, learning a distance metric from large-scale, high-dimensional data remains a challenging problem. Due to the PSD constraint, the per-iteration computational complexity of previous algorithms is at least O(d^2), where d is the dimensionality of the data. In this paper, we develop an efficient stochastic algorithm for a class of distance metric learning problems with nuclear norm regularization, referred to as low-rank DML. By exploiting the low-rank structure of the intermediate solutions and of the stochastic gradients, the complexity of our algorithm has a linear dependence on the dimensionality d. The key idea is to maintain all the iterates in factorized representations and to construct stochastic gradients that are low-rank. In this way, the projection onto the PSD cone can be implemented efficiently by incremental SVD. Experimental results on several data sets validate the effectiveness and efficiency of our method.
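To make the key idea concrete, below is a minimal, hypothetical NumPy sketch of the kind of step the abstract describes, not the paper's exact algorithm. It assumes the PSD iterate is kept factorized as M = U diag(s) U^T, that the stochastic gradient is supplied as thin factors A, B (e.g., built from a mini-batch of sample differences), and that the gradient step, the nuclear-norm shrinkage, and the PSD projection are all carried out on a small eigenproblem so the cost is linear in d. The function name low_rank_psd_update and the gradient parameterization are illustrative assumptions; the paper itself implements the projection via incremental SVD.

import numpy as np

def low_rank_psd_update(U, s, A, B, eta, lam):
    """One illustrative stochastic step for low-rank DML (a sketch, not the paper's algorithm).

    The PSD iterate is kept factorized as M = U @ diag(s) @ U.T (U: d x r), and the
    stochastic gradient is assumed to be the low-rank symmetric matrix
    G = 0.5 * (A @ B.T + B @ A.T) with thin factors A, B (d x b). The gradient step,
    the eigenvalue shrinkage induced by the nuclear norm regularizer, and the
    projection onto the PSD cone are all performed on a small (<= r + 2b)-dimensional
    eigenproblem, so the cost is O(d (r + 2b)^2), i.e., linear in d.
    """
    # Orthonormal basis for the column span of the updated iterate M - eta * G.
    Q, _ = np.linalg.qr(np.hstack([U, A, B]))              # d x k, with k <= r + 2b
    # Represent M - eta * G as Q @ C @ Q.T with a small k x k core matrix C.
    QU, QA, QB = Q.T @ U, Q.T @ A, Q.T @ B
    C = (QU * s) @ QU.T - 0.5 * eta * (QA @ QB.T + QB @ QA.T)
    # Eigen-decompose the small core, shrink eigenvalues by eta * lam (the prox of
    # the nuclear norm restricted to the PSD cone), and keep only the positive part.
    w, V = np.linalg.eigh(0.5 * (C + C.T))
    w = np.maximum(w - eta * lam, 0.0)
    keep = w > 0
    return Q @ V[:, keep], w[keep]                         # new factors of M

# Tiny usage example with assumed shapes: a rank-5 iterate in d = 10000 dimensions
# updated with a rank-3 mini-batch gradient.
d = 10000
U = np.linalg.qr(np.random.randn(d, 5))[0]
s = np.abs(np.random.randn(5))
A, B = np.random.randn(d, 3), np.random.randn(d, 3)
U, s = low_rank_psd_update(U, s, A, B, eta=0.1, lam=0.01)

The cost is dominated by the thin QR factorization of a d x (r + 2b) matrix; the eigendecomposition involves only a small matrix, which is what keeps the per-iteration complexity linear in d.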

Keywords


Stochastic Optimization; Low-Rank DML; Incremental SVD
