Show simple item record

dc.contributor.author | Pontil, Massimiliano | en_US
dc.contributor.author | Mukherjee, Sayan | en_US
dc.contributor.author | Girosi, Federico | en_US
dc.date.accessioned | 2004-10-20T21:04:28Z
dc.date.available | 2004-10-20T21:04:28Z
dc.date.issued | 1998-10-01 | en_US
dc.identifier.other | AIM-1651 | en_US
dc.identifier.other | CBCL-168 | en_US
dc.identifier.uri | http://hdl.handle.net/1721.1/7259
dc.description.abstract | Support Vector Machines Regression (SVMR) is a regression technique recently introduced by V. Vapnik and his collaborators (Vapnik, 1995; Vapnik, Golowich and Smola, 1996). In SVMR the goodness of fit is measured not by the usual quadratic loss function (the mean square error) but by a different loss function, Vapnik's $\epsilon$-insensitive loss function, which is similar to the "robust" loss functions introduced by Huber (Huber, 1981). The quadratic loss function is well justified under the assumption of Gaussian additive noise. However, the noise model underlying the choice of Vapnik's loss function is less clear. In this paper the use of Vapnik's loss function is shown to be equivalent to a model of additive Gaussian noise in which the variance and mean of the Gaussian are themselves random variables; the probability distributions for the variance and mean are stated explicitly. While this work is presented in the framework of SVMR, it can be extended to justify non-quadratic loss functions in any Maximum Likelihood or Maximum A Posteriori approach. It applies not only to Vapnik's loss function but to a much broader class of loss functions. | en_US
dc.format.extent | 2520205 bytes
dc.format.extent | 186978 bytes
dc.format.mimetype | application/postscript
dc.format.mimetype | application/pdf
dc.language.iso | en_US
dc.relation.ispartofseries | AIM-1651 | en_US
dc.relation.ispartofseries | CBCL-168 | en_US
dc.title | On the Noise Model of Support Vector Machine Regression | en_US
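The abstract contrasts the usual quadratic loss with Vapnik's $\epsilon$-insensitive loss, L_eps(y, f(x)) = max(0, |y - f(x)| - eps), which ignores residuals inside an eps-wide tube and grows linearly outside it (Vapnik, 1995). A minimal sketch of the two losses, with illustrative function names and an arbitrary eps, not taken from the paper itself:

```python
import numpy as np

def epsilon_insensitive_loss(y_true, y_pred, eps=0.1):
    """Vapnik's epsilon-insensitive loss: zero inside the eps-tube,
    linear in the residual outside it."""
    return np.maximum(0.0, np.abs(y_true - y_pred) - eps)

def quadratic_loss(y_true, y_pred):
    """The usual mean-square-error loss, shown for comparison."""
    return (y_true - y_pred) ** 2

# Residuals inside the tube (|r| <= eps) incur no penalty;
# larger residuals are penalized linearly rather than quadratically.
residuals = np.array([0.05, 0.1, 0.5, 2.0])
print(epsilon_insensitive_loss(residuals, 0.0, eps=0.1))  # [0.  0.  0.4 1.9]
print(quadratic_loss(residuals, 0.0))                     # [0.0025 0.01 0.25 4.]
```

The linear (rather than quadratic) growth outside the tube is what makes the loss robust to outliers, and is the behavior whose underlying noise model the paper derives.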

