dc.contributor.author: Rosasco, Lorenzo Andrea
dc.contributor.author: Villa, Silvia
dc.contributor.author: Mosci, Sofia
dc.contributor.author: Santoro, Matteo
dc.contributor.author: Verri, Alessandro
dc.date.accessioned: 2013-10-18T13:09:05Z
dc.date.available: 2013-10-18T13:09:05Z
dc.date.issued: 2013-07
dc.date.submitted: 2012-08
dc.identifier.issn: 1532-4435
dc.identifier.issn: 1533-7928
dc.identifier.uri: http://hdl.handle.net/1721.1/81424
dc.description.abstract: In this work we are interested in the problems of supervised learning and variable selection when the input-output dependence is described by a nonlinear function depending on a few variables. Our goal is to consider a sparse nonparametric model, hence avoiding linear or additive models. The key idea is to measure the importance of each variable in the model by making use of partial derivatives. Based on this intuition we propose a new notion of nonparametric sparsity and a corresponding least squares regularization scheme. Using concepts and results from the theory of reproducing kernel Hilbert spaces and proximal methods, we show that the proposed learning algorithm corresponds to a minimization problem which can be provably solved by an iterative procedure. The consistency properties of the obtained estimator are studied both in terms of prediction and selection performance. An extensive empirical analysis shows that the proposed method performs favorably with respect to state-of-the-art methods. [en_US]
dc.description.sponsorship: United States. Defense Advanced Research Projects Agency. Information Processing Techniques Office [en_US]
dc.description.sponsorship: United States. Defense Advanced Research Projects Agency. System Science Division. Defense Sciences Office [en_US]
dc.description.sponsorship: National Science Foundation (U.S.) (Grant NSF-0640097) [en_US]
dc.description.sponsorship: National Science Foundation (U.S.) (Grant NSF-0827427) [en_US]
dc.description.sponsorship: Adobe Systems [en_US]
dc.description.sponsorship: Honda Research Institute USA, Inc. [en_US]
dc.description.sponsorship: Eugene McDermott Foundation [en_US]
dc.description.sponsorship: Sony Corporation [en_US]
dc.description.sponsorship: NEC [en_US]
dc.language.iso: en_US
dc.publisher: Association for Computing Machinery (ACM) [en_US]
dc.relation.isversionof: http://jmlr.org/papers/volume14/rosasco13a/rosasco13a.pdf [en_US]
dc.rights: Article is made available in accordance with the publisher's policy and may be subject to US copyright law. Please refer to the publisher's site for terms of use. [en_US]
dc.source: MIT Press [en_US]
dc.title: Nonparametric Sparsity and Regularization [en_US]
dc.type: Article [en_US]
dc.identifier.citation: Rosasco, Lorenzo et al. “Nonparametric Sparsity and Regularization.” Journal of Machine Learning Research 14 (2013): 1665–1714. [en_US]
dc.contributor.department: Massachusetts Institute of Technology. Department of Brain and Cognitive Sciences [en_US]
dc.contributor.mitauthor: Rosasco, Lorenzo Andrea [en_US]
dc.relation.journal: Journal of Machine Learning Research [en_US]
dc.eprint.version: Final published version [en_US]
dc.type.uri: http://purl.org/eprint/type/JournalArticle [en_US]
eprint.status: http://purl.org/eprint/status/PeerReviewed [en_US]
dspace.orderedauthors: Rosasco, Lorenzo; Villa, Silvia; Mosci, Sofia; Santoro, Matteo; Verri, Alessandro [en_US]
dc.identifier.orcid: https://orcid.org/0000-0001-6376-4786
mit.license: PUBLISHER_POLICY [en_US]
mit.metadata.status: Complete
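
The abstract above describes ranking input variables by the partial derivatives of a kernel-based estimator. The following is a minimal illustrative sketch of that idea only, not the authors' algorithm: instead of the paper's derivative-penalized regularization scheme solved by a proximal iteration, it fits a plain Gaussian kernel ridge regressor and scores each variable by the empirical norm of the fitted function's partial derivative. The synthetic data and the hyperparameters sigma and lam are assumptions chosen for the demo.

```python
# Illustrative sketch (not the paper's algorithm): fit a Gaussian kernel
# ridge regressor and rank variables by the empirical norm of the partial
# derivatives of the fitted function.
import numpy as np

def gaussian_kernel(A, B, sigma):
    # Pairwise Gaussian kernel: K[i, j] = exp(-||A_i - B_j||^2 / (2 sigma^2)).
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq / (2 * sigma ** 2))

def fit_and_rank(X, y, sigma=1.0, lam=1e-2):
    n, d = X.shape
    K = gaussian_kernel(X, X, sigma)
    # Kernel ridge regression: f(x) = sum_i alpha_i k(x_i, x).
    alpha = np.linalg.solve(K + lam * n * np.eye(n), y)
    # Partial derivative at each training point x_j w.r.t. variable a:
    # df/dx^a(x_j) = sum_i alpha_i k(x_i, x_j) (x_i^a - x_j^a) / sigma^2.
    diffs = (X[:, None, :] - X[None, :, :]) / sigma ** 2   # shape (i, j, a)
    grads = np.einsum("i,ij,ija->ja", alpha, K, diffs)     # shape (j, a)
    # Importance of variable a: empirical L2 norm of its partial derivative.
    return np.sqrt((grads ** 2).mean(axis=0))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.uniform(-1, 1, size=(200, 6))
    y = np.sin(np.pi * X[:, 0]) * X[:, 1]   # only the first two variables matter
    print(np.round(fit_and_rank(X, y), 3))  # first two scores should dominate
```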

