Publication
NeurIPS 2002
Conference paper

Effective Dimension and Generalization of Kernel Learning

Abstract

We investigate the generalization performance of some learning problems in Hilbert function spaces. We introduce a concept of scale-sensitive effective data dimension, and show that it characterizes the convergence rate of the underlying learning problem. Using this concept, we can naturally extend results for parametric estimation problems in finite-dimensional spaces to non-parametric kernel learning methods. We derive upper bounds on the generalization performance and show that the resulting convergence rates are optimal under various circumstances.
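
As a rough illustration of the kind of quantity the abstract refers to (not necessarily the paper's exact definition), a common scale-sensitive effective dimension of a kernel Gram matrix K at regularization scale λ is d(λ) = tr(K(K + λI)⁻¹) = Σᵢ μᵢ/(μᵢ + λ), where μᵢ are the eigenvalues of K. The sketch below computes this from the kernel spectrum; the rbf_kernel helper, the data, and the chosen scales are hypothetical choices for illustration only.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Gram matrix of the Gaussian (RBF) kernel on rows of X.
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)
    return np.exp(-gamma * np.maximum(d2, 0.0))

def effective_dimension(K, lam):
    # Scale-sensitive effective dimension at scale lam:
    # d(lam) = trace(K (K + lam I)^{-1}) = sum_i mu_i / (mu_i + lam),
    # where mu_i are the eigenvalues of the Gram matrix K.
    mu = np.linalg.eigvalsh(K)
    return float(np.sum(mu / (mu + lam)))

# Example: the effective dimension shrinks as the scale lam grows,
# which is what makes the notion scale-sensitive.
X = np.random.default_rng(0).normal(size=(200, 5))
K = rbf_kernel(X)
for lam in (0.01, 1.0, 100.0):
    print(f"lam={lam:<7g} d(lam)={effective_dimension(K, lam):.2f}")
```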
