Risk Bounds for Regularized Least-Squares Algorithm with Operator-Valued Kernels
Author(s)
De Vito, Ernesto; Caponnetto, Andrea
Abstract
We show that recent results in [3] on risk bounds for regularized least-squares on reproducing kernel Hilbert spaces extend straightforwardly to the vector-valued regression setting. We first briefly introduce the central concepts of operator-valued kernels. Then we show how the risk bounds can be expressed in terms of a generalization of the effective dimension.
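The setting described in the abstract can be illustrated with a small numerical sketch. The code below is not the paper's method, only a common instance of it: regularized least-squares with a separable operator-valued kernel K(x, x') = k(x, x')·B, where k is a scalar Gaussian kernel and B is a positive semidefinite matrix coupling the output components, plus the empirical effective dimension N(λ) = Tr[T(T + λ)⁻¹] computed from the Gram matrix. All function names, the choice of Gaussian k, and the separable form of K are illustrative assumptions, not taken from the report.

```python
import numpy as np

def gaussian_gram(X, sigma=1.0):
    """Scalar Gaussian kernel Gram matrix for inputs X of shape (n, d)."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma ** 2))

def fit_vector_rls(X, Y, B, lam=0.1, sigma=1.0):
    """Vector-valued RLS with separable kernel K(x,x') = k(x,x') * B.

    Solves the representer-theorem system (G kron B + n*lam*I) c = vec(Y)
    for the stacked coefficient vectors c_1, ..., c_n (one per sample).
    """
    n, T = Y.shape
    G = gaussian_gram(X, sigma)
    A = np.kron(G, B) + n * lam * np.eye(n * T)
    c = np.linalg.solve(A, Y.reshape(-1))
    return c.reshape(n, T)

def predict(Xtr, C, B, Xte, sigma=1.0):
    """Evaluate f(x) = sum_i k(x, x_i) * B @ c_i at the test points."""
    sq_tr = np.sum(Xtr ** 2, axis=1)
    sq_te = np.sum(Xte ** 2, axis=1)
    d2 = sq_te[:, None] + sq_tr[None, :] - 2.0 * Xte @ Xtr.T
    Kte = np.exp(-d2 / (2.0 * sigma ** 2))
    return Kte @ C @ B.T

def effective_dimension(G, lam):
    """Empirical effective dimension N(lam) = trace(T (T + lam I)^-1),
    with T = G/n playing the role of the (empirical) covariance operator."""
    s = np.linalg.eigvalsh(G) / G.shape[0]
    return float(np.sum(s / (s + lam)))
```

With B equal to the identity the T output components decouple and each reduces to scalar kernel ridge regression; N(λ) interpolates between 0 (large λ) and n (λ → 0), which is the quantity the risk bounds are stated in terms of.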
Date issued
2005-05-16
Other identifiers
MIT-CSAIL-TR-2005-031
AIM-2005-015
CBCL-249
Series/Report no.
Massachusetts Institute of Technology Computer Science and Artificial Intelligence Laboratory
Keywords
AI, optimal rates, reproducing kernel Hilbert space, effective dimension