Nonparametric Sparsity and Regularization
Author(s)
Rosasco, Lorenzo Andrea; Villa, Silvia; Mosci, Sofia; Santoro, Matteo; Verri, Alessandro
Download: Rosasco_Nonparametric-sparsity.pdf (587.1 KB)
Terms of use
Publisher Policy: Article is made available in accordance with the publisher's policy and may be subject to US copyright law. Please refer to the publisher's site for terms of use.
Abstract
In this work we are interested in the problems of supervised learning and variable selection when the input-output dependence is described by a nonlinear function of a few variables. Our goal is to consider a sparse nonparametric model, hence avoiding linear or additive models. The key idea is to measure the importance of each variable in the model via partial derivatives. Based on this intuition, we propose a new notion of nonparametric sparsity and a corresponding least squares regularization scheme. Using concepts and results from the theory of reproducing kernel Hilbert spaces and proximal methods, we show that the proposed learning algorithm corresponds to a minimization problem which can be provably solved by an iterative procedure. The consistency properties of the obtained estimator are studied both in terms of prediction and selection performance. An extensive empirical analysis shows that the proposed method compares favorably with state-of-the-art methods.
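To make the derivative-based notion of variable importance concrete, here is a minimal Python sketch. It is not the authors' algorithm: the paper builds the partial-derivative norms into the regularizer and solves the resulting problem with a proximal iteration, whereas this sketch fits a plain kernel ridge regression model and only reads off each variable's importance a posteriori, as the empirical norm of the corresponding partial derivative of the fitted function. The function names, the toy data, and the parameters sigma and lam are illustrative assumptions.

```python
import numpy as np

def gaussian_kernel(X, Z, sigma):
    """Gaussian kernel matrix k(x, z) = exp(-||x - z||^2 / (2 sigma^2))."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def partial_derivative_norms(X, c, sigma):
    """Empirical norm of df/dx_a over the sample, for the kernel expansion
    f(x) = sum_i c_i k(x, x_i); a small norm suggests an irrelevant variable."""
    n, d = X.shape
    K = gaussian_kernel(X, X, sigma)
    norms = np.zeros(d)
    for a in range(d):
        diff = X[:, None, a] - X[None, :, a]      # (x_j)_a - (x_i)_a
        dK = -(diff / sigma ** 2) * K             # d k(x_j, x_i) / d x_a
        grad_a = dK @ c                           # df/dx_a at each sample point x_j
        norms[a] = np.sqrt(np.mean(grad_a ** 2))  # empirical L2 norm
    return norms

# Toy example (illustrative): only the first two of five variables matter.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(200, 5))
y = np.sin(np.pi * X[:, 0]) * X[:, 1]

sigma, lam = 0.5, 1e-2
K = gaussian_kernel(X, X, sigma)
c = np.linalg.solve(K + len(X) * lam * np.eye(len(X)), y)  # kernel ridge fit

print(partial_derivative_norms(X, c, sigma))  # first two norms should dominate
```

In the paper the same derivative norms enter the penalty itself, so sparsity across variables is enforced during training rather than measured afterwards; the sketch illustrates only the importance measure underlying that penalty.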
Date issued
2013-07
Department
Massachusetts Institute of Technology. Department of Brain and Cognitive Sciences
Journal
Journal of Machine Learning Research
Publisher
JMLR, Inc. and Microtome Publishing
Citation
Rosasco, Lorenzo et al. “Nonparametric Sparsity and Regularization.” Journal of Machine Learning Research 14 (2013): 1665–1714.
Version: Final published version
ISSN
1532-4435 (print)
1533-7928 (online)