
dc.contributor.author: Slavov, Nikolai G
dc.contributor.author: Malioutov, Dmitry M., 1981-
dc.date.accessioned: 2015-05-05T16:57:18Z
dc.date.available: 2015-05-05T16:57:18Z
dc.date.issued: 2014-06
dc.identifier.issn: 1533-7928
dc.identifier.uri: http://hdl.handle.net/1721.1/96914
dc.description.abstract: We study the total least squares (TLS) problem, which generalizes least squares regression by allowing measurement errors in both dependent and independent variables. TLS is widely used in applied fields including computer vision, system identification, and econometrics. The special case when all dependent and independent variables have the same level of uncorrelated Gaussian noise, known as ordinary TLS, can be solved by singular value decomposition (SVD). However, SVD cannot solve many important practical TLS problems with realistic noise structure, such as varying measurement noise, known structure on the errors, or large outliers requiring robust error norms. To solve such problems, we develop convex relaxation approaches for a general class of structured TLS (STLS). We show, both theoretically and experimentally, that while the plain nuclear norm relaxation incurs large approximation errors for STLS, the re-weighted nuclear norm approach is very effective and achieves better accuracy on challenging STLS problems than popular non-convex solvers. We describe a fast solution based on an augmented Lagrangian formulation, and apply our approach to an important class of biological problems that use population-average measurements to infer cell-type- and physiological-state-specific expression levels that are very hard to measure directly.
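
As a concrete illustration of the ordinary TLS case mentioned in the abstract, the sketch below solves A x ≈ b with the same i.i.d. Gaussian noise level in both A and b via the SVD of the augmented matrix [A | b]. This is a minimal, assumed Python/NumPy example (the function name tls_svd and the toy data are hypothetical), not the paper's structured-TLS or re-weighted nuclear norm method.

import numpy as np

def tls_svd(A, b):
    # Ordinary total least squares for A @ x ~= b, assuming the same i.i.d.
    # Gaussian noise level in every entry of A and b; solved via the SVD of [A | b].
    n = A.shape[1]
    C = np.column_stack([A, b])        # augmented data matrix [A | b]
    _, _, Vt = np.linalg.svd(C)        # singular values sorted in descending order
    v = Vt[-1]                         # right singular vector of the smallest singular value
    if np.isclose(v[n], 0.0):
        raise np.linalg.LinAlgError("ordinary TLS solution does not exist")
    return -v[:n] / v[n]               # closed-form TLS estimate (Golub & Van Loan)

# Toy usage: noise in both the regressors and the response.
rng = np.random.default_rng(0)
x_true = np.array([2.0, -1.0])
A_clean = rng.normal(size=(200, 2))
A_noisy = A_clean + 0.05 * rng.normal(size=A_clean.shape)
b_noisy = A_clean @ x_true + 0.05 * rng.normal(size=200)
print(tls_svd(A_noisy, b_noisy))       # approximately [2, -1]
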
dc.language.iso: en_US
dc.publisher: Association for Computing Machinery (ACM)
dc.relation.isversionof: http://jmlr.org/proceedings/papers/v32/malioutov14.html
dc.rights: Article is made available in accordance with the publisher's policy and may be subject to US copyright law. Please refer to the publisher's site for terms of use.
dc.source: Slavov
dc.title: Convex Total Least Squares
dc.type: Article
dc.identifier.citation: Malioutov, Dmitry, and Nikolai Slavov. "Convex Total Least Squares." Proceedings of the 31st International Conference on Machine Learning, Beijing, China, 2014. JMLR: W&CP volume 32.
dc.contributor.department: Massachusetts Institute of Technology. Department of Biology
dc.contributor.department: Massachusetts Institute of Technology. Department of Physics
dc.contributor.approver: Slavov, Nikolai
dc.contributor.mitauthor: Slavov, Nikolai
dc.relation.journal: Journal of Machine Learning Research
dc.eprint.version: Final published version
dc.type.uri: http://purl.org/eprint/type/JournalArticle
eprint.status: http://purl.org/eprint/status/NonPeerReviewed
dspace.orderedauthors: Malioutov, Dmitry; Slavov, Nikolai
dc.identifier.orcid: https://orcid.org/0000-0003-2035-1820
mit.license: PUBLISHER_POLICY
mit.metadata.status: Complete

