Generalization Analysis for Ranking Using Integral Operator
Published: 2017-02-13
Issue: 1
Volume: 31
ISSN: 2374-3468
Container-title: Proceedings of the AAAI Conference on Artificial Intelligence
Short-container-title: AAAI
Author: Liu Yong, Liao Shizhong, Lin Hailun, Yue Yinliang, Wang Weiping
Abstract
The study of the generalization performance of ranking algorithms is one of the fundamental issues in ranking learning theory. Although several generalization bounds have been proposed based on different measures, the convergence rates of the existing bounds are usually at most O(1/√n), where n is the size of the data set. In this paper, we derive novel generalization bounds for regularized ranking in a reproducing kernel Hilbert space via the integral operator of the kernel function. We prove that the rates of our bounds are much faster than O(1/√n). Specifically, we first introduce a notion of local Rademacher complexity for ranking, called local ranking Rademacher complexity, which is used to measure the complexity of the space of loss functions of the ranking. Then, we use the local ranking Rademacher complexity to obtain a basic generalization bound. Finally, we establish the relationship between the local Rademacher complexity and the eigenvalues of the integral operator, and further derive sharp generalization bounds with a faster convergence rate.
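As a rough illustration of the objects the abstract refers to (standard formulations are assumed here; the paper's exact notation and constants may differ), the regularized ranking estimator in an RKHS H_K, the localized Rademacher complexity of the associated loss class G, and a standard Mendelson-type bound linking that complexity to the eigenvalues of the integral operator L_K can be sketched in LaTeX as:

% Regularized pairwise ranking in the RKHS H_K (assumed standard form)
f_z = \arg\min_{f \in H_K} \frac{1}{n(n-1)} \sum_{i \neq j} \ell\big(f(x_i) - f(x_j),\, y_i - y_j\big) + \lambda \|f\|_{H_K}^2

% Local Rademacher complexity of the loss class G, restricted to radius r
R_n(r) = \mathbb{E}_\sigma \Big[ \sup_{g \in G,\; \mathbb{E}[g^2] \le r} \frac{1}{n} \sum_{i=1}^{n} \sigma_i\, g(z_i) \Big]

% Standard eigenvalue bound for kernel classes, with \mu_1 \ge \mu_2 \ge \dots the eigenvalues of L_K
R_n(r) \le \sqrt{\frac{2}{n} \sum_{j \ge 1} \min\{r, \mu_j\}}

Balancing the radius r against the tail of the eigenvalue sum is the standard route to rates faster than O(1/√n) when the eigenvalues of L_K decay quickly; the paper develops this program for the pairwise ranking setting via the local ranking Rademacher complexity.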
Publisher
Association for the Advancement of Artificial Intelligence (AAAI)
Cited by
1 article.
1. Learning Rates for Nonconvex Pairwise Learning. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2023-08.