
dc.contributor.advisor	Constantinos Daskalakis.	en_US
dc.contributor.author	Zampetakis, Emmanouil	en_US
dc.contributor.other	Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science.	en_US
dc.date.accessioned	2017-05-11T19:58:35Z
dc.date.available	2017-05-11T19:58:35Z
dc.date.copyright	2017	en_US
dc.date.issued	2017	en_US
dc.identifier.uri	http://hdl.handle.net/1721.1/108973
dc.description	Thesis: S.M., Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science, 2017.	en_US
dc.description	Cataloged from PDF version of thesis.	en_US
dc.description	Includes bibliographical references (pages 101-107).	en_US
dc.description.abstract	The increasing interest of the scientific community, and especially of machine learning, in non-convex problems has made non-convex optimization one of the most important and challenging areas of our time. Despite this increasing interest, little is known from a theoretical point of view. The main reason is that the existing and well-understood techniques used to analyze convex optimization problems are not applicable, or not meaningful, in the non-convex case. The purpose of this thesis is to take a step toward building a toolbox rich enough to analyze non-convex optimization. Contraction maps and Banach's fixed point theorem are important tools for bounding the running time of a large class of iterative algorithms used to solve non-convex problems. However, when we use the natural distance metric of the spaces we work in, the applicability of Banach's fixed point theorem becomes limited, because only a few functions have the contraction property with respect to the natural metrics. We explore how generally we can apply Banach's fixed point theorem to establish the convergence of iterative methods when pairing it with carefully designed metrics. Our first result is a strong converse of Banach's theorem, showing that it is a universal analysis tool for establishing the uniqueness of fixed points and the convergence of iterative maps to a unique solution. We next consider the computational complexity of Banach's fixed point theorem. Making the proof of our converse theorem constructive, we show that computing the fixed point guaranteed by Banach's fixed point theorem is CLS-complete, answering a question left open in the work of Daskalakis and Papadimitriou [23]. Finally, we turn to applications, proving global convergence guarantees for one of the most celebrated inference algorithms in statistics, the EM algorithm. Proposed in the 1970s [26], the EM algorithm is an iterative method for maximum likelihood estimation whose behavior has largely remained elusive. We show that it converges to the true optimum for balanced mixtures of two Gaussians. (An illustrative sketch of contraction-map iteration follows this record.)	en_US
dc.description.statementofresponsibility	by Emmanouil Zampetakis.	en_US
dc.format.extent	107 pages	en_US
dc.language.iso	eng	en_US
dc.publisher	Massachusetts Institute of Technology	en_US
dc.rights	MIT theses are protected by copyright. They may be viewed, downloaded, or printed from this source but further reproduction or distribution in any format is prohibited without written permission.	en_US
dc.rights.uri	http://dspace.mit.edu/handle/1721.1/7582	en_US
dc.subject	Electrical Engineering and Computer Science.	en_US
dc.title	Contraction maps and applications to the analysis of iterative algorithms	en_US
dc.type	Thesis	en_US
dc.description.degree	S.M.	en_US
dc.contributor.department	Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
dc.identifier.oclc	986497232	en_US
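
The abstract above centers on iterating a contraction map to its unique fixed point. As a minimal illustration (not drawn from the thesis itself), the Python sketch below runs the iteration x_{k+1} = f(x_k) for a map assumed to be an L-contraction and stops using the standard a-posteriori bound |x_{k+1} - x*| <= L/(1-L) * |x_{k+1} - x_k| implied by Banach's fixed point theorem. The helper name banach_iterate and the example map cos on [0, 1] (a contraction there with modulus sin(1) < 1) are illustrative assumptions.

```python
import math

def banach_iterate(f, x0, L, tol=1e-10, max_iter=10_000):
    """Iterate x_{k+1} = f(x_k), assuming f is an L-contraction (L < 1)
    on a complete space containing x0. Banach's fixed point theorem then
    guarantees a unique fixed point x* and the a-posteriori error bound
    |x_{k+1} - x*| <= L / (1 - L) * |x_{k+1} - x_k|."""
    x = x0
    for k in range(max_iter):
        x_next = f(x)
        # Bound on the remaining distance to the fixed point x*.
        err_bound = L / (1.0 - L) * abs(x_next - x)
        if err_bound < tol:
            return x_next, k + 1, err_bound
        x = x_next
    return x, max_iter, err_bound

if __name__ == "__main__":
    # cos maps [0, 1] into itself and is a contraction there with
    # modulus L = sin(1) < 1, so the iteration converges to its
    # unique fixed point (~0.739).
    x_star, iters, bound = banach_iterate(math.cos, 0.5, math.sin(1.0))
    print(f"fixed point ~ {x_star:.10f} after {iters} iterations (bound {bound:.2e})")
```

The geometric convergence rate L^k is what makes the contraction property useful for bounding the running time of iterative algorithms; the thesis's point is that obtaining such a bound may require a carefully designed metric rather than the natural one.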

