
dc.contributor.author: Do Ba, Khanh
dc.contributor.author: Indyk, Piotr
dc.date.accessioned: 2012-10-12T18:12:50Z
dc.date.available: 2012-10-12T18:12:50Z
dc.date.issued: 2011-08
dc.date.submitted: 2011-08
dc.identifier.isbn: 978-3-642-22934-3
dc.identifier.uri: http://hdl.handle.net/1721.1/73940
dc.description: 14th International Workshop, APPROX 2011, and 15th International Workshop, RANDOM 2011, Princeton, NJ, USA, August 17-19, 2011. Proceedings [en_US]
dc.description.abstract: The goal of sparse recovery is to recover the (approximately) best k-sparse approximation $\hat{x}$ of an n-dimensional vector x from linear measurements Ax of x. We consider a variant of the problem which takes into account partial knowledge about the signal. In particular, we focus on the scenario where, after the measurements are taken, we are given a set S of size s that is supposed to contain most of the “large” coefficients of x. The goal is then to find $\hat{x}$ such that $\|x - \hat{x}\|_p \le C \cdot \min_{k\text{-sparse } x',\ \mathrm{supp}(x') \subseteq S} \|x - x'\|_q$. We refer to this formulation as the sparse recovery with partial support knowledge problem (SRPSK). We show that SRPSK can be solved, up to an approximation factor of C = 1 + ε, using O((k/ε) log(s/k)) measurements, for p = q = 2. Moreover, this bound is tight as long as s = O(εn / log(n/ε)). This completely resolves the asymptotic measurement complexity of the problem except for a very small range of the parameter s. To the best of our knowledge, this is the first variant of (1 + ε)-approximate sparse recovery for which the asymptotic measurement complexity has been determined. [en_US]
dc.description.sponsorship: Space and Naval Warfare Systems Center San Diego (U.S.) (Contract N66001-11-C-4092) [en_US]
dc.description.sponsorship: David & Lucile Packard Foundation (Fellowship) [en_US]
dc.description.sponsorship: Center for Massive Data Algorithmics (MADALGO) [en_US]
dc.description.sponsorship: National Science Foundation (U.S.) (Grant CCF-0728645) [en_US]
dc.description.sponsorship: National Science Foundation (U.S.) (Grant CCF-1065125) [en_US]
dc.language.iso: en_US
dc.publisher: Springer Berlin / Heidelberg [en_US]
dc.relation.isversionof: http://dx.doi.org/10.1007/978-3-642-22935-0_3 [en_US]
dc.rights: Creative Commons Attribution-Noncommercial-Share Alike 3.0 [en_US]
dc.rights.uri: http://creativecommons.org/licenses/by-nc-sa/3.0/ [en_US]
dc.source: MIT web domain [en_US]
dc.title: Sparse recovery with partial support knowledge [en_US]
dc.type: Article [en_US]
dc.identifier.citation: Ba, Khanh Do, and Piotr Indyk. “Sparse Recovery with Partial Support Knowledge.” Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques. Ed. Leslie Ann Goldberg et al. LNCS Vol. 6845. Berlin, Heidelberg: Springer Berlin Heidelberg, 2011. 26–37. [en_US]
dc.contributor.department: Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science [en_US]
dc.contributor.mitauthor: Do Ba, Khanh
dc.contributor.mitauthor: Indyk, Piotr
dc.relation.journal: Approximation, Randomization, and Combinatorial Optimization. Algorithms and Techniques [en_US]
dc.eprint.version: Author's final manuscript [en_US]
dc.type.uri: http://purl.org/eprint/type/ConferencePaper [en_US]
dspace.orderedauthors: Ba, Khanh Do; Indyk, Piotr [en]
dc.identifier.orcid: https://orcid.org/0000-0002-7983-9524
mit.license: OPEN_ACCESS_POLICY [en_US]
mit.metadata.status: Complete
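
The abstract above defines the SRPSK guarantee; as a quick illustration (not part of this record or of the paper itself), the following is a minimal NumPy sketch that evaluates that error criterion for the p = q = 2 case. The function names (best_k_sparse_in_S_error, satisfies_srpsk_guarantee) and the toy parameters are assumptions made here for illustration only; the sketch checks a candidate recovery against the benchmark and does not implement the paper's O((k/ε) log(s/k)) measurement scheme. The benchmark can be computed directly because the best k-sparse approximation of x supported inside S simply keeps the k largest-magnitude coordinates of x that lie in S.

    import numpy as np

    def best_k_sparse_in_S_error(x, S, k):
        """l2 error of the best k-sparse approximation of x supported inside S.

        The minimizer keeps the k largest-magnitude coordinates of x that lie
        in S and zeroes out everything else.
        """
        S = np.asarray(sorted(S))
        order = np.argsort(-np.abs(x[S]))      # coordinates of S by decreasing |x_i|
        keep = S[order[:k]]
        x_best = np.zeros_like(x)
        x_best[keep] = x[keep]
        return np.linalg.norm(x - x_best)

    def satisfies_srpsk_guarantee(x, x_hat, S, k, C):
        """Check ||x - x_hat||_2 <= C * min over k-sparse x', supp(x') in S, of ||x - x'||_2."""
        return np.linalg.norm(x - x_hat) <= C * best_k_sparse_in_S_error(x, S, k)

    # Toy usage: k large coefficients, small noise elsewhere, and a support
    # hint S of size s that contains all the large coefficients.
    rng = np.random.default_rng(0)
    n, k, s = 100, 5, 20
    x = rng.normal(scale=0.1, size=n)          # small "noise" coordinates
    large = rng.choice(n, size=k, replace=False)
    x[large] = rng.normal(scale=10.0, size=k)  # k large coefficients
    S = list(large) + [i for i in range(n) if i not in large][: s - k]

    x_hat = np.zeros(n)
    x_hat[large] = x[large]                    # a candidate recovery of the large entries
    print(satisfies_srpsk_guarantee(x, x_hat, S, k, C=1.1))   # expected: True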

