Show simple item record

dc.contributor.author: Belloni, Alexandre
dc.contributor.author: Wang, Lie
dc.contributor.author: Chernozhukov, Victor V.
dc.date.accessioned: 2015-01-29T15:56:22Z
dc.date.available: 2015-01-29T15:56:22Z
dc.date.issued: 2014-04
dc.date.submitted: 2013-12
dc.identifier.issn: 0090-5364
dc.identifier.uri: http://hdl.handle.net/1721.1/93187
dc.description.abstract: We propose a self-tuning √Lasso method that simultaneously resolves three important practical problems in high-dimensional regression analysis, namely, it handles the unknown scale, heteroscedasticity and (drastic) non-Gaussianity of the noise. In addition, our analysis allows for badly behaved designs, for example, perfectly collinear regressors, and generates sharp bounds even in extreme cases, such as the infinite variance case and the noiseless case, in contrast to Lasso. We establish various nonasymptotic bounds for √Lasso, including the prediction norm rate and sparsity. Our analysis is based on new impact factors that are tailored for bounding the prediction norm. In order to cover heteroscedastic non-Gaussian noise, we rely on moderate deviation theory for self-normalized sums to achieve Gaussian-like results under weak conditions. Moreover, we derive bounds on the performance of ordinary least squares (OLS) applied to the model selected by √Lasso, accounting for possible misspecification of the selected model. Under mild conditions, the rate of convergence of OLS post √Lasso is as good as √Lasso's rate. As an application, we consider the use of √Lasso and OLS post √Lasso as estimators of nuisance parameters in a generic semiparametric problem (nonlinear moment condition or Z-problem), resulting in a construction of √n-consistent and asymptotically normal estimators of the main parameters.
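
The √Lasso estimator described in the abstract minimizes the square root of the average squared residual plus an ℓ1 penalty, with a pivotal penalty level that does not depend on the noise scale. Below is a minimal sketch, assuming a generic convex solver (cvxpy), simulated data, and the common penalty choice λ = c·√n·Φ⁻¹(1 − α/(2p)); the data, constants, and solver are illustrative assumptions, not the authors' code.

# Minimal sketch of a square-root Lasso fit:
#   minimize  sqrt( (1/n) * sum_i (y_i - x_i'beta)^2 )  +  (lam / n) * ||beta||_1
# Simulated design and penalty constant c = 1.1 are illustrative assumptions.
import numpy as np
import cvxpy as cp
from scipy.stats import norm

rng = np.random.default_rng(0)
n, p, s = 100, 200, 5                        # samples, regressors, true sparsity
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:s] = 1.0
y = X @ beta_true + rng.standard_normal(n)   # noise scale need not be known

# Pivotal penalty level: does not require knowing or estimating sigma.
alpha, c = 0.05, 1.1
lam = c * np.sqrt(n) * norm.ppf(1 - alpha / (2 * p))

beta = cp.Variable(p)
sqrt_mse = cp.norm(y - X @ beta, 2) / np.sqrt(n)   # sqrt of average squared residual
objective = cp.Minimize(sqrt_mse + (lam / n) * cp.norm(beta, 1))
cp.Problem(objective).solve()                # a second-order cone program

print("selected support:", np.flatnonzero(np.abs(beta.value) > 1e-4))

Because the loss term is a norm rather than a squared norm, the penalty level above can be set without any estimate of the noise level, which is the "pivotal" property referred to in the title.
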
dc.description.sponsorship: National Science Foundation (U.S.)
dc.language.iso: en_US
dc.publisher: Institute of Mathematical Statistics
dc.relation.isversionof: http://dx.doi.org/10.1214/14-AOS1204
dc.rights: Creative Commons Attribution-Noncommercial-Share Alike
dc.rights.uri: http://creativecommons.org/licenses/by-nc-sa/4.0/
dc.source: arXiv
dc.title: Pivotal estimation via square-root Lasso in nonparametric regression
dc.type: Article
dc.identifier.citation: Belloni, Alexandre, Victor Chernozhukov, and Lie Wang. “Pivotal Estimation via Square-Root Lasso in Nonparametric Regression.” Ann. Statist. 42, no. 2 (April 2014): 757–788.
dc.contributor.department: Massachusetts Institute of Technology. Department of Economics
dc.contributor.department: Massachusetts Institute of Technology. Department of Mathematics
dc.contributor.mitauthor: Wang, Lie
dc.contributor.mitauthor: Chernozhukov, Victor V.
dc.relation.journal: Annals of Statistics
dc.eprint.version: Author's final manuscript
dc.type.uri: http://purl.org/eprint/type/JournalArticle
eprint.status: http://purl.org/eprint/status/PeerReviewed
dspace.orderedauthors: Belloni, Alexandre; Chernozhukov, Victor; Wang, Lie
dc.identifier.orcid: https://orcid.org/0000-0003-3582-8898
dc.identifier.orcid: https://orcid.org/0000-0002-3250-6714
mit.license: OPEN_ACCESS_POLICY
mit.metadata.status: Complete

