AAAI Publications, Thirty-First AAAI Conference on Artificial Intelligence

Generalization Analysis for Ranking Using Integral Operator
Yong Liu, Shizhong Liao, Hailun Lin, Yinliang Yue, Weiping Wang

Last modified: 2017-02-13


The study of the generalization performance of ranking algorithms is a fundamental issue in ranking learning theory. Although several generalization bounds have been proposed based on different complexity measures, the convergence rates of existing bounds are usually at most O(1/√n), where n is the size of the data set. In this paper, we derive novel generalization bounds for regularized ranking in a reproducing kernel Hilbert space via the integral operator of the kernel function, and prove that the rates of our bounds are much faster than O(1/√n). Specifically, we first introduce a notion of local Rademacher complexity for ranking, called local ranking Rademacher complexity, which measures the complexity of the space of loss functions of the ranking. We then use the local ranking Rademacher complexity to obtain a basic generalization bound. Finally, we establish the relationship between the local ranking Rademacher complexity and the eigenvalues of the integral operator, and thereby derive sharp generalization bounds with faster convergence rates.
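The eigenvalues of the kernel's integral operator, which drive the sharp bounds described above, are not directly observable but are commonly estimated by the eigenvalues of the normalized Gram matrix K/n. The sketch below illustrates this standard empirical estimate for a Gaussian kernel on synthetic data; the kernel choice, bandwidth, and data are illustrative assumptions, not part of the paper.

```python
import numpy as np

def gaussian_kernel(X, sigma=1.0):
    """Gaussian (RBF) kernel matrix for rows of X (illustrative choice)."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def empirical_operator_eigenvalues(X, sigma=1.0):
    """Eigenvalues of K/n, the standard empirical estimate of the
    integral operator's eigenvalues, sorted in descending order."""
    n = X.shape[0]
    K = gaussian_kernel(X, sigma)
    eigvals = np.linalg.eigvalsh(K / n)  # symmetric PSD matrix
    return np.sort(eigvals)[::-1]

# Hypothetical data: 200 points in R^2
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
lam = empirical_operator_eigenvalues(X)
print(lam[:5])  # leading eigenvalues; fast decay yields faster rates
```

Because the Gaussian kernel has K_ii = 1, the eigenvalues of K/n sum to 1, and their decay profile is what determines how far below O(1/√n) the local-complexity bound can go.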


Generalization Analysis; Ranking; Integral Operator
