Risk Bounds for Regularized Least-Squares Algorithm with Operator-Valued Kernels
Author(s): De Vito, Ernesto; Caponnetto, Andrea
We show that recent results on risk bounds for the regularized least-squares algorithm on reproducing kernel Hilbert spaces extend straightforwardly to the vector-valued regression setting. We first briefly introduce the central concepts concerning operator-valued kernels, and then show how the risk bounds can be expressed in terms of a generalization of the effective dimension.
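As an illustrative aside, the (empirical) effective dimension for a scalar-valued kernel is commonly defined as N(λ) = tr[T(T + λI)⁻¹], where T is the empirical covariance operator with the same nonzero spectrum as K/n for an n×n kernel matrix K. The sketch below, which assumes a Gaussian kernel and synthetic data (both chosen here for illustration, not taken from the paper), computes this quantity; the paper's contribution concerns the generalization of such quantities to operator-valued kernels.

```python
import numpy as np

def gaussian_kernel(X, sigma=1.0):
    # Pairwise squared Euclidean distances, then the Gaussian kernel.
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def effective_dimension(K, lam):
    """Empirical effective dimension N(lam) = tr[T (T + lam I)^{-1}],
    computed via the eigenvalues of T = K / n."""
    n = K.shape[0]
    eigvals = np.linalg.eigvalsh(K) / n
    eigvals = np.clip(eigvals, 0.0, None)  # guard against tiny negatives
    return float(np.sum(eigvals / (eigvals + lam)))

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 3))
K = gaussian_kernel(X)

n_small_lam = effective_dimension(K, 1e-3)
n_large_lam = effective_dimension(K, 1.0)
# Stronger regularization yields a smaller effective dimension,
# which is always bounded by the sample size n.
```

The monotone decrease of N(λ) in λ is what lets risk bounds trade regularization strength against the "capacity" of the hypothesis space.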
Massachusetts Institute of Technology Computer Science and Artificial Intelligence Laboratory
AI, optimal rates, reproducing kernel Hilbert space, effective dimension