| dc.contributor.advisor | Suvrit Sra and Ali Jadbabaie. | en_US |
| dc.contributor.author | Zhang, Jingzhao, S.M. Massachusetts Institute of Technology. | en_US |
| dc.contributor.other | Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science. | en_US |
| dc.date.accessioned | 2019-11-12T17:41:40Z | |
| dc.date.available | 2019-11-12T17:41:40Z | |
| dc.date.copyright | 2019 | en_US |
| dc.date.issued | 2019 | en_US |
| dc.identifier.uri | https://hdl.handle.net/1721.1/122886 | |
| dc.description | Thesis: S.M., Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science, 2019 | en_US |
| dc.description | Cataloged from PDF version of thesis. | en_US |
| dc.description | Includes bibliographical references (pages 85-87). | en_US |
| dc.description.abstract | Gradient-based optimization algorithms are among the most fundamental algorithms in optimization and machine learning, yet they suffer from slow convergence. Consequently, accelerating gradient-based methods has become an important recent topic of study. In this thesis, we focus on explaining and understanding these acceleration results. In particular, we aim to provide insights into the acceleration phenomenon and to develop new algorithms based on this interpretation. To do so, we follow the line of work on continuous ordinary differential equation (ODE) representations of momentum-based acceleration methods. We start by proving that acceleration can be achieved by stable discretization of ODEs using standard Runge-Kutta integrators when the function is sufficiently smooth and convex. We then extend this idea to develop a distributed algorithm for solving convex finite-sum problems over networks; the proposed algorithm achieves acceleration without resorting to Nesterov's momentum approach. Finally, we generalize the result to functions that are quasi-strongly convex but not necessarily convex, showing that acceleration can be achieved in a nontrivial neighborhood of the optimal solution. In particular, this neighborhood can grow larger as the condition number of the function increases. Together, these results provide a systematic way to prove nonasymptotic convergence rates for algorithms derived from ODE discretization. | en_US |
| dc.description.statementofresponsibility | by Jingzhao Zhang. | en_US |
| dc.format.extent | 87 pages | en_US |
| dc.language.iso | eng | en_US |
| dc.publisher | Massachusetts Institute of Technology | en_US |
| dc.rights | MIT theses are protected by copyright. They may be viewed, downloaded, or printed from this source but further reproduction or distribution in any format is prohibited without written permission. | en_US |
| dc.rights.uri | http://dspace.mit.edu/handle/1721.1/7582 | en_US |
| dc.subject | Electrical Engineering and Computer Science. | en_US |
| dc.title | Dynamical systems view of acceleration in first order optimization | en_US |
| dc.type | Thesis | en_US |
| dc.description.degree | S.M. | en_US |
| dc.contributor.department | Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science | en_US |
| dc.identifier.oclc | 1126787253 | en_US |
| dc.description.collection | S.M. Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science | en_US |
| dspace.imported | 2019-11-12T17:41:39Z | en_US |
| mit.thesis.degree | Master | en_US |
| mit.thesis.department | EECS | en_US |