Notice
This is not the latest version of this item. The latest version can be found at: https://dspace.mit.edu/handle/1721.1/137333.2
High dimensional linear regression using lattice basis reduction
| Field | Value | Language |
| --- | --- | --- |
| dc.date.accessioned | 2021-11-04T14:46:45Z | |
| dc.date.available | 2021-11-04T14:46:45Z | |
| dc.date.issued | 2018-12 | |
| dc.identifier.uri | https://hdl.handle.net/1721.1/137333 | |
| dc.description.abstract | © 2018 Curran Associates Inc. All rights reserved. We consider a high-dimensional linear regression problem where the goal is to efficiently recover an unknown vector β∗ from n noisy linear observations Y = Xβ∗ + W ∈ R^n, for known X ∈ R^{n×p} and unknown W ∈ R^n. Unlike most of the literature on this model, we make no sparsity assumption on β∗. Instead, we adopt a regularization based on assuming that the underlying vectors β∗ have rational entries with the same denominator Q ∈ Z_{>0}. We call this the Q-rationality assumption. We propose a new polynomial-time algorithm for this task, based on the seminal Lenstra-Lenstra-Lovász (LLL) lattice basis reduction algorithm. We establish that under the Q-rationality assumption, our algorithm recovers the vector β∗ exactly for a large class of distributions for the i.i.d. entries of X and non-zero noise W. We prove that it is successful under small noise, even when the learner has access to only one observation (n = 1). Furthermore, we prove that in the case of Gaussian white noise for W, n = o(p/log p), and Q sufficiently large, our algorithm tolerates a nearly optimal information-theoretic level of noise. | en_US |
| dc.language.iso | en | |
| dc.relation.isversionof | https://papers.nips.cc/paper/2018/hash/ccc0aa1b81bf81e16c676ddb977c5881-Abstract.html | en_US |
| dc.rights | Article is made available in accordance with the publisher's policy and may be subject to US copyright law. Please refer to the publisher's site for terms of use. | en_US |
| dc.source | Neural Information Processing Systems (NIPS) | en_US |
| dc.title | High dimensional linear regression using lattice basis reduction | en_US |
| dc.type | Article | en_US |
| dc.identifier.citation | Gamarnik, D and Zadik, I. 2018. "High dimensional linear regression using lattice basis reduction." Advances in Neural Information Processing Systems, 2018-December. | |
| dc.relation.journal | Advances in Neural Information Processing Systems | en_US |
| dc.eprint.version | Final published version | en_US |
| dc.type.uri | http://purl.org/eprint/type/ConferencePaper | en_US |
| eprint.status | http://purl.org/eprint/status/NonPeerReviewed | en_US |
| dc.date.updated | 2021-03-30T13:56:27Z | |
| dspace.orderedauthors | Gamarnik, D; Zadik, I | en_US |
| dspace.date.submission | 2021-03-30T13:56:28Z | |
| mit.journal.volume | 2018-December | en_US |
| mit.license | PUBLISHER_POLICY | |
| mit.metadata.status | Authority Work and Publication Information Needed | en_US |
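
The abstract above reduces exact recovery to finding an unusually short vector in an integer lattice. Below is a minimal, illustrative sketch of that idea in Python. It is not the paper's actual algorithm: it covers only the noiseless case with integer coefficients (Q = 1) and a single observation (n = 1), using a textbook subset-sum-style lattice and a simple exact-arithmetic LLL. All names (`lll`, `basis`, the penalty weight `N`) and the toy numbers are hypothetical.

```python
from fractions import Fraction

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def gram_schmidt(B):
    """Exact Gram-Schmidt orthogonalization; returns B* and the mu coefficients."""
    Bs, mu = [], [[Fraction(0)] * len(B) for _ in B]
    for i, b in enumerate(B):
        v = list(b)
        for j in range(i):
            mu[i][j] = dot(b, Bs[j]) / dot(Bs[j], Bs[j])
            v = [vi - mu[i][j] * wj for vi, wj in zip(v, Bs[j])]
        Bs.append(v)
    return Bs, mu

def lll(B, delta=Fraction(3, 4)):
    """Textbook LLL; recomputes Gram-Schmidt after every change (slow but simple)."""
    B = [[Fraction(x) for x in row] for row in B]
    Bs, mu = gram_schmidt(B)
    k = 1
    while k < len(B):
        for j in range(k - 1, -1, -1):          # size reduction
            q = round(mu[k][j])
            if q:
                B[k] = [bk - q * bj for bk, bj in zip(B[k], B[j])]
                Bs, mu = gram_schmidt(B)
        if dot(Bs[k], Bs[k]) >= (delta - mu[k][k - 1] ** 2) * dot(Bs[k - 1], Bs[k - 1]):
            k += 1                               # Lovász condition holds
        else:
            B[k - 1], B[k] = B[k], B[k - 1]      # swap and step back
            Bs, mu = gram_schmidt(B)
            k = max(k - 1, 1)
    return B

# Toy instance (hypothetical numbers): one noiseless observation y = <x, beta>.
p = 4
x = [314159, 271828, 161803, 141421]  # known "generic" integer measurements
beta = [2, -1, 3, 1]                  # hidden integer coefficients (Q = 1)
y = dot(x, beta)

# Basis rows (e_i, N*x_i) plus (0, ..., 0, N*y): the integer combination
# sum_i beta_i * row_i - row_{p+1} equals (beta, 0), an unusually short lattice
# vector, because the large weight N penalizes any vector whose last
# coordinate is non-zero.
N = 10**6
basis = [[int(j == i) for j in range(p)] + [N * x[i]] for i in range(p)]
basis.append([0] * p + [N * y])

for row in lll(basis):
    if row[-1] == 0 and any(row[:p]):
        print([int(c) for c in row[:p]])  # expected: [2, -1, 3, 1] up to sign
        break
```

In the noisy, rational-coefficient setting the paper actually treats, the construction is more delicate: observations are truncated to finite precision, the Q-rationality assumption is built into the lattice, and exact recovery holds only under the distributional and noise-level conditions stated in the abstract.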
