DSpace@MIT

Scaling law for recovering the sparsest element in a subspace

Author(s)
Demanet, Laurent; Hand, Paul
Download
1310.1654.pdf (195.1 KB)

Open Access Policy

Terms of use
Creative Commons Attribution-Noncommercial-Share Alike http://creativecommons.org/licenses/by-nc-sa/4.0/
Abstract
We address the problem of recovering a sparse n-vector within a given subspace. This problem is a subtask of some approaches to dictionary learning and sparse principal component analysis. Hence, if we can prove scaling laws for recovery of sparse vectors, it will be easier to derive and prove recovery results in these applications. In this paper, we present a scaling law for recovering the sparse vector from a subspace that is spanned by the sparse vector and k random vectors. We prove that the sparse vector will be the output of one of n linear programs with high probability if its support size s satisfies s ≲ n/(√k log n). The scaling law still holds when the desired vector is approximately sparse. To get a single estimate for the sparse vector from the n linear programs, we must select which output is the sparsest. This selection process can be based on any proxy for sparsity, and the specific proxy has the potential to improve or worsen the scaling law. If sparsity is interpreted in an ℓ1/ℓ∞ sense, then the scaling law cannot be better than s ≲ n/√k. Computer simulations show that selecting the sparsest output in the ℓ1/ℓ2 or thresholded-ℓ0 senses can lead to a larger parameter range for successful recovery than that given by the ℓ1/ℓ∞ sense.
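
The abstract describes a concrete two-step recipe: solve n ℓ1-minimization linear programs, one for each coordinate pinned to 1, then keep the output that is sparsest under some proxy. The sketch below is a minimal illustration of that recipe under assumptions of my own, not the authors' code: the problem sizes (n, k, s), the thresholded-ℓ0 cutoff, the normalization, and the use of scipy.optimize.linprog are all illustrative choices.

```python
# Illustrative sketch (not the authors' implementation): plant a sparse vector
# in a subspace spanned together with k Gaussian vectors, solve the n
# "pin one coordinate to 1" l1-minimization LPs, and select the sparsest
# output under three different sparsity proxies. Sizes/thresholds are arbitrary.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, k, s = 60, 4, 3               # ambient dimension, random spanning vectors, support size

# Planted s-sparse vector x0 and a basis B of the (k+1)-dimensional subspace.
x0 = np.zeros(n)
x0[rng.choice(n, size=s, replace=False)] = rng.standard_normal(s)
B = np.column_stack([x0, rng.standard_normal((n, k))])    # n x (k+1)

def l1_min_with_pin(B, i):
    """Solve  min ||B c||_1  subject to  (B c)_i = 1  as an LP in (c, t)."""
    m, d = B.shape
    cost = np.concatenate([np.zeros(d), np.ones(m)])       # minimize sum_j t_j
    A_ub = np.block([[ B, -np.eye(m)],                     #  (B c)_j - t_j <= 0
                     [-B, -np.eye(m)]])                    # -(B c)_j - t_j <= 0
    A_eq = np.concatenate([B[i], np.zeros(m)])[None, :]    # (B c)_i = 1
    res = linprog(cost, A_ub=A_ub, b_ub=np.zeros(2 * m), A_eq=A_eq, b_eq=[1.0],
                  bounds=[(None, None)] * d + [(0, None)] * m, method="highs")
    return B @ res.x[:d] if res.success else None

# Step 1: one LP per pinned coordinate.
outputs = [x for x in (l1_min_with_pin(B, i) for i in range(n)) if x is not None]

# Step 2: pick the "sparsest" output under a chosen proxy (names are mine).
def l1_over_linf(x): return np.abs(x).sum() / np.abs(x).max()
def l1_over_l2(x):   return np.abs(x).sum() / np.linalg.norm(x)
def thresholded_l0(x, tau=1e-3):                           # tau is an arbitrary choice
    return np.count_nonzero(np.abs(x) > tau * np.abs(x).max())

def normalize(v):                                          # scale so the largest entry is +1
    return v / v[np.argmax(np.abs(v))]

for name, proxy in [("l1/linf", l1_over_linf), ("l1/l2", l1_over_l2),
                    ("thresholded-l0", thresholded_l0)]:
    best = min(outputs, key=proxy)
    err = np.linalg.norm(normalize(best) - normalize(x0))
    print(f"{name:>14} selection: distance to planted vector = {err:.2e}")
```

Each LP is independent of the others, so the loop over pinned coordinates parallelizes trivially; the proxy only enters in the final selection step, which is where the abstract says the scaling law can improve or degrade.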
Date issued
2014-07
URI
http://hdl.handle.net/1721.1/115483
Department
Massachusetts Institute of Technology. Department of Mathematics
Journal
Information and Inference
Publisher
Oxford University Press (OUP)
Citation
Demanet, L., and P. Hand. “Scaling Law for Recovering the Sparsest Element in a Subspace.” Information and Inference 3, no. 4 (July 2014): 295–309. © 2014 The Authors.
Version: Author's final manuscript
ISSN
2049-8764
2049-8772

Collections
  • MIT Open Access Articles
