dc.contributor.author: Caponnetto, Andrea
dc.contributor.author: Vito, Ernesto De
dc.date.accessioned: 2005-12-22T02:28:27Z
dc.date.available: 2005-12-22T02:28:27Z
dc.date.issued: 2005-04-14
dc.identifier.other: MIT-CSAIL-TR-2005-027
dc.identifier.other: AIM-2005-013
dc.identifier.other: CBCL-248
dc.identifier.uri: http://hdl.handle.net/1721.1/30539
dc.description.abstract: We develop a theoretical analysis of the generalization performance of regularized least-squares on reproducing kernel Hilbert spaces for supervised learning. We show that the concept of effective dimension of an integral operator plays a central role in the definition of a criterion for the choice of the regularization parameter as a function of the number of samples. In fact, a minimax analysis is performed which shows asymptotic optimality of the above-mentioned criterion.
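The algorithm the abstract refers to can be sketched in a few lines. The following is a minimal illustration (not the paper's implementation): kernel ridge regression, i.e. regularized least-squares in an RKHS with a Gaussian kernel, where the regularization parameter is chosen as a decaying function of the sample size n. The specific rate lam = n**-0.5, the kernel width, and all function names here are illustrative assumptions, not the criterion derived in the report.

```python
import numpy as np

def gaussian_kernel(X, Y, sigma=1.0):
    # Gram matrix K[i, j] = exp(-||x_i - y_j||^2 / (2 sigma^2))
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma**2))

def rls_fit(X, y, lam):
    # Regularized least-squares: solve (K + n*lam*I) alpha = y
    # for the coefficient vector alpha of f = sum_i alpha_i k(x_i, .).
    n = len(X)
    K = gaussian_kernel(X, X)
    return np.linalg.solve(K + n * lam * np.eye(n), y)

def rls_predict(X_train, alpha, X_test):
    # Evaluate f(x) = sum_i alpha_i k(x_i, x) at the test points.
    return gaussian_kernel(X_test, X_train) @ alpha

def effective_dimension(K, n, lam):
    # Effective dimension of the (empirical) integral operator:
    # N(lam) = trace(K (K + n*lam*I)^{-1}), the quantity the abstract
    # identifies as central to choosing lam.
    return np.trace(K @ np.linalg.inv(K + n * lam * np.eye(n)))

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(50, 1))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.standard_normal(50)
lam = len(X) ** -0.5              # illustrative rate, decaying with n
alpha = rls_fit(X, y, lam)
mse = np.mean((rls_predict(X, alpha, X) - y) ** 2)
N_lam = effective_dimension(gaussian_kernel(X, X), len(X), lam)
```

As lam shrinks with growing n, the effective dimension N(lam) grows; the report's analysis balances this growth against the approximation error to obtain the minimax-optimal choice of lam.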
dc.format.extent: 25 p.
dc.format.extent: 16130108 bytes
dc.format.extent: 833989 bytes
dc.format.mimetype: application/postscript
dc.format.mimetype: application/pdf
dc.language.iso: en_US
dc.relation.ispartofseries: Massachusetts Institute of Technology Computer Science and Artificial Intelligence Laboratory
dc.subject: AI
dc.subject: optimal rates
dc.subject: regularized least-squares
dc.subject: reproducing kernel Hilbert space
dc.subject: effective dimension
dc.title: Fast Rates for Regularized Least-squares Algorithm

