Abstract:
The selection of the kernel function, which determines the mapping between the input space and the feature space, is of crucial importance to kernel methods. Existing kernel selection approaches commonly use some measure of generalization error, which is usually difficult to estimate and has a slow convergence rate. In this paper, we propose a novel measure, called the eigenvalues ratio (ER), of the tight bound of generalization error for kernel selection. ER is the ratio between the sum of the main eigenvalues and that of the tail eigenvalues of the kernel matrix. Different from most existing measures, ER is defined on the kernel matrix, so it can be estimated easily from the available training data, which makes it usable for kernel selection. We establish tight ER-based generalization error bounds of order $O(\frac{1}{n})$ for several kernel-based methods under certain general conditions, whereas for most existing measures the convergence rate is at most $O(\frac{1}{\sqrt{n}})$. Finally, to guarantee good generalization performance, we propose a novel kernel selection criterion by minimizing the derived tight generalization error bounds. Theoretical analysis and experimental results demonstrate that our kernel selection criterion is a good choice for kernel selection.
DOI:
10.1609/aaai.v29i1.9554
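The abstract defines ER as the ratio between the sum of the main (largest) eigenvalues and the sum of the tail eigenvalues of the kernel matrix. A minimal sketch of that computation is below; the cutoff `t` separating main from tail eigenvalues, the RBF kernel, and its bandwidth are assumptions for illustration, not taken from the paper.

```python
import numpy as np

def eigenvalues_ratio(K, t):
    """Eigenvalues ratio (ER) of a kernel matrix K: the sum of the t
    largest ("main") eigenvalues divided by the sum of the remaining
    ("tail") eigenvalues. The cutoff t is an assumed parameter here."""
    # K is symmetric PSD, so eigvalsh returns real eigenvalues.
    eigs = np.sort(np.linalg.eigvalsh(K))[::-1]  # descending order
    main = eigs[:t].sum()
    tail = eigs[t:].sum()
    return main / tail

# Toy example: RBF kernel matrix (bandwidth 1, assumed) on random 1-D points.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 1))
K = np.exp(-0.5 * (X - X.T) ** 2)
er = eigenvalues_ratio(K, t=5)
```

Since ER is computed directly from the kernel matrix built on the training data, evaluating it for several candidate kernels and comparing the resulting values is straightforward, which is what makes such a matrix-defined measure practical for kernel selection.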