AAAI Publications, Thirty-First AAAI Conference on Artificial Intelligence

Fast Online Incremental Learning on Mixture Streaming Data
Yi Wang, Xin Fan, Zhongxuan Luo, Tianzhu Wang, Maomao Min, Jiebo Luo

Last modified: 2017-02-13

Abstract


The explosion of streaming data poses challenges to feature learning methods such as linear discriminant analysis (LDA): many existing LDA algorithms cannot efficiently update incrementally as samples arrive sequentially in various manners. We first propose a fast batch LDA algorithm (FLDA/QR) that uses the cluster centers to solve a lower triangular system, optimized via Cholesky factorization. Exploiting the intrinsically incremental structure of this factorization, we further develop an exact incremental algorithm (IFLDA/QR). The Gram-Schmidt process with reorthogonalization in IFLDA/QR saves significant space and time compared with the rank-one QR updating used by most existing methods. IFLDA/QR handles streaming data containing 1) new labeled samples in existing classes, 2) samples of an entirely new (novel) class, and, more significantly, 3) a chunk of examples mixing 1) and 2). Both theoretical analysis and numerical experiments demonstrate much lower space and time costs (2 to 10 times faster) than the state of the art, with comparable classification accuracy.
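To make the incremental mechanism concrete, the following is a minimal NumPy sketch of the kind of update the abstract refers to: growing an orthonormal basis one vector at a time with classical Gram-Schmidt plus one reorthogonalization pass, instead of recomputing a full QR factorization. The function name and all details are illustrative assumptions, not the authors' IFLDA/QR implementation.

```python
import numpy as np

def gs_append(Q, v):
    """Append vector v to a basis Q with orthonormal columns using
    classical Gram-Schmidt with one reorthogonalization pass
    ("twice is enough"). Illustrative sketch, not the paper's code."""
    r = Q.T @ v                  # project v onto the current basis
    w = v - Q @ r                # residual after the first pass
    dr = Q.T @ w                 # reorthogonalization pass
    w = w - Q @ dr
    r = r + dr
    rho = np.linalg.norm(w)
    q_new = w / rho              # new orthonormal direction
    return np.column_stack([Q, q_new]), np.append(r, rho)

# Usage: grow a QR factorization one column at a time as data streams in.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 1))
Q, _ = np.linalg.qr(A)           # start from a one-column basis
Q, r = gs_append(Q, rng.standard_normal(50))
```

Each such update costs O(nk) for an n-dimensional basis of k columns, which is where the claimed savings over repeated full decompositions come from.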

Keywords


Incremental linear discriminant analysis (ILDA); linear discriminant analysis (LDA); online learning; QR decomposition
