Show simple item record

dc.contributor.author: Varshney, Kush R.
dc.contributor.author: Willsky, Alan S.
dc.date.accessioned: 2013-09-26T20:31:19Z
dc.date.available: 2013-09-26T20:31:19Z
dc.date.issued: 2011-06
dc.date.submitted: 2010-12
dc.identifier.issn: 1053-587X
dc.identifier.issn: 1941-0476
dc.identifier.uri: http://hdl.handle.net/1721.1/81207
dc.description.abstract: Low-dimensional statistics of measurements play an important role in detection problems, including those encountered in sensor networks. In this work, we focus on learning low-dimensional linear statistics of high-dimensional measurement data, along with decision rules defined in the low-dimensional space, when the probability density of the measurements and class labels is not given but a training set of samples from this distribution is available. We pose a joint optimization problem for linear dimensionality reduction and margin-based classification, and develop a coordinate descent algorithm on the Stiefel manifold for its solution. Although the coordinate descent is not guaranteed to find the globally optimal solution, crucially, its alternating structure enables us to extend it for sensor networks with a message-passing approach requiring little communication. Linear dimensionality reduction prevents overfitting when learning from finite training data. In the sensor network setting, dimensionality reduction not only prevents overfitting but also reduces power consumption due to communication. The learned reduced-dimensional space and decision rule are shown to be consistent, and their Rademacher complexity is characterized. Experimental results are presented for a variety of datasets, including those from existing sensor networks, demonstrating the potential of our methodology in comparison with other dimensionality reduction approaches. [en_US]
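
The abstract describes an alternating coordinate descent: with the linear projection fixed, fit a margin-based classifier in the reduced-dimensional space; with the classifier fixed, update the projection subject to the Stiefel-manifold (orthonormal-columns) constraint. The Python sketch below illustrates that alternation under assumed choices (a logistic margin loss, a QR retraction onto the Stiefel manifold, and illustrative names such as fit_joint); it is a sketch of the idea, not the authors' implementation.

import numpy as np

def qr_retraction(A):
    # Return a D x d matrix with orthonormal columns (a point on the Stiefel manifold).
    Q, R = np.linalg.qr(A)
    return Q * np.sign(np.diag(R))  # fix column signs so the retraction is deterministic

def fit_joint(X, y, d, n_outer=50, n_inner=200, lr_w=0.1, lr_A=0.01, seed=0):
    # X: (n, D) measurements, y: (n,) labels in {-1, +1}, d: reduced dimension.
    rng = np.random.default_rng(seed)
    n, D = X.shape
    A = qr_retraction(rng.standard_normal((D, d)))  # projection with A^T A = I
    w, b = np.zeros(d), 0.0                         # linear decision rule in the reduced space
    for _ in range(n_outer):
        # Coordinate 1: with A fixed, fit a margin-based (logistic-loss) classifier on Z = X A.
        Z = X @ A
        for _ in range(n_inner):
            g = -y / (1.0 + np.exp(y * (Z @ w + b)))  # per-sample loss derivative w.r.t. w^T z + b
            w -= lr_w * (Z.T @ g) / n
            b -= lr_w * g.mean()
        # Coordinate 2: with (w, b) fixed, take a Euclidean gradient step in A,
        # then retract back onto the Stiefel manifold.
        g = -y / (1.0 + np.exp(y * (X @ A @ w + b)))
        grad_A = (X.T @ np.outer(g, w)) / n           # (D, d) gradient of the loss in A
        A = qr_retraction(A - lr_A * grad_A)
    return A, w, b

Given training data, fit_joint returns the projection A and the decision rule (w, b); a new measurement x would be classified by the sign of x @ A @ w + b.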
dc.description.sponsorship: National Science Foundation (U.S.). Graduate Research Fellowship Program [en_US]
dc.description.sponsorship: United States. Army Research Office (MURI funded through ARO Grant W911NF-06-1-0076) [en_US]
dc.description.sponsorship: United States. Air Force Office of Scientific Research (Award FA9550-06-1-0324) [en_US]
dc.description.sponsorship: Shell International Exploration and Production B.V. [en_US]
dc.language.iso: en_US
dc.publisher: Institute of Electrical and Electronics Engineers [en_US]
dc.relation.isversionof: http://dx.doi.org/10.1109/tsp.2011.2123891 [en_US]
dc.rights: Creative Commons Attribution-Noncommercial-Share Alike 3.0 [en_US]
dc.rights.uri: http://creativecommons.org/licenses/by-nc-sa/3.0/ [en_US]
dc.source: Willsky via Amy Stout [en_US]
dc.title: Linear Dimensionality Reduction for Margin-Based Classification: High-Dimensional Data and Sensor Networks [en_US]
dc.type: Article [en_US]
dc.identifier.citation: Varshney, Kush R., and Alan S. Willsky. "Linear Dimensionality Reduction for Margin-Based Classification: High-Dimensional Data and Sensor Networks." IEEE Transactions on Signal Processing 59, no. 6 (June 2011): 2496-2512. [en_US]
dc.contributor.department: Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science [en_US]
dc.contributor.department: Massachusetts Institute of Technology. Laboratory for Information and Decision Systems [en_US]
dc.contributor.mitauthor: Willsky, Alan [en_US]
dc.contributor.mitauthor: Varshney, Kush R. [en_US]
dc.relation.journal: IEEE Transactions on Signal Processing [en_US]
dc.eprint.version: Author's final manuscript [en_US]
dc.type.uri: http://purl.org/eprint/type/JournalArticle [en_US]
eprint.status: http://purl.org/eprint/status/PeerReviewed [en_US]
dspace.orderedauthors: Varshney, Kush R.; Willsky, Alan S. [en_US]
dc.identifier.orcid: https://orcid.org/0000-0003-0149-5888
mit.license: OPEN_ACCESS_POLICY [en_US]
mit.metadata.status: Complete

