dc.contributor.advisor   Piotr Indyk.   en_US
dc.contributor.author   Schmidt, Ludwig, Ph. D. Massachusetts Institute of Technology   en_US
dc.contributor.other   Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science.   en_US
dc.date.accessioned   2018-09-17T15:57:26Z
dc.date.available   2018-09-17T15:57:26Z
dc.date.copyright   2018   en_US
dc.date.issued   2018   en_US
dc.identifier.uri   http://hdl.handle.net/1721.1/118098
dc.description   Thesis: Ph. D., Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science, 2018.   en_US
dc.description   Cataloged from PDF version of thesis.   en_US
dc.description   Includes bibliographical references (pages 281-297).   en_US
dc.description.abstract   Many success stories in the data sciences share an intriguing computational phenomenon. While the core algorithmic problems might seem intractable at first, simple heuristics or approximation algorithms often perform surprisingly well in practice. Common examples include optimizing non-convex functions and optimizing over non-convex sets. In theory, such problems are usually NP-hard. But in practice, they are often solved sufficiently well for applications in machine learning and statistics. Even when a problem is convex, we often settle for sub-optimal solutions returned by inexact methods like stochastic gradient descent. And in nearest neighbor search, a variety of approximation algorithms works remarkably well despite the "curse of dimensionality". In this thesis, we study this phenomenon in the context of three fundamental algorithmic problems arising in the data sciences:
   * In constrained optimization, we show that it is possible to optimize over a wide range of non-convex sets up to the statistical noise floor.
   * In unconstrained optimization, we prove that important convex problems already require approximation if we want to find a solution quickly.
   * In nearest neighbor search, we show that approximation guarantees can explain much of the good performance observed in practice.
   The overarching theme is that the computational hardness of many problems emerges only below the inherent "noise floor" of real data. Hence the computational hardness of these problems does not prevent us from finding answers that perform well from a statistical perspective. This offers an explanation for why algorithmic problems in the data sciences often turn out to be easier than expected.   en_US
dc.description.statementofresponsibility   by Ludwig Schmidt.   en_US
dc.format.extent   297 pages   en_US
dc.language.iso   eng   en_US
dc.publisher   Massachusetts Institute of Technology   en_US
dc.rights   MIT theses are protected by copyright. They may be viewed, downloaded, or printed from this source, but further reproduction or distribution in any format is prohibited without written permission.   en_US
dc.rights.uri   http://dspace.mit.edu/handle/1721.1/7582   en_US
dc.subject   Electrical Engineering and Computer Science.   en_US
dc.title   Algorithms above the noise floor   en_US
dc.type   Thesis   en_US
dc.description.degree   Ph. D.   en_US
dc.contributor.department   Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
dc.identifier.oclc   1052124168   en_US
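
The abstract's observation that even convex problems are often solved only approximately, for instance by stochastic gradient descent, can be made concrete with a short sketch. The Python snippet below is purely illustrative and not taken from the thesis: the least-squares model, all constants, and the sigma * sqrt(d/n) noise-floor estimate are assumptions for a simple Gaussian setup.

    # Illustrative sketch (not from the thesis): SGD on noisy least squares.
    # The point, echoing the abstract: once the parameter error reaches the
    # statistical noise floor of the data, further optimization accuracy
    # buys little, so an inexact method suffices.
    import numpy as np

    rng = np.random.default_rng(0)
    n, d, sigma = 2000, 20, 0.5                  # samples, dimension, noise level (assumed)
    w_true = rng.normal(size=d)
    X = rng.normal(size=(n, d))
    y = X @ w_true + sigma * rng.normal(size=n)  # labels with Gaussian noise

    w = np.zeros(d)
    step = 0.01
    for t in range(20000):
        i = rng.integers(n)                      # one random sample per step
        grad = (X[i] @ w - y[i]) * X[i]          # stochastic gradient of 0.5 * (x.w - y)^2
        w -= step * grad

    print("parameter error:", np.linalg.norm(w - w_true))
    print("rough noise floor, sigma * sqrt(d/n):", sigma * np.sqrt(d / n))

With a constant step size, the SGD iterates settle into a neighborhood of the optimum rather than converging exactly; when that neighborhood is comparable to the statistical error inherent in the noisy labels, solving the optimization problem exactly would not improve the estimate in any meaningful way. This is one sense in which hardness "below the noise floor" is harmless.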