Dynamical systems view of acceleration in first order optimization

Author(s)
Zhang, Jingzhao, S.M., Massachusetts Institute of Technology
Download
1126787253-MIT.pdf (4.252 MB)
Other Contributors
Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science.
Advisor
Suvrit Sra and Ali Jadbabaie.
Terms of use
MIT theses are protected by copyright. They may be viewed, downloaded, or printed from this source but further reproduction or distribution in any format is prohibited without written permission. http://dspace.mit.edu/handle/1721.1/7582
Abstract
Gradient-based optimization algorithms are among the most fundamental algorithms in optimization and machine learning, yet they suffer from slow convergence. Consequently, accelerating gradient-based methods has become an important recent topic of study. In this thesis, we focus on explaining and understanding acceleration results. In particular, we aim to provide insight into the acceleration phenomenon and to develop new algorithms based on this interpretation. To do so, we follow the line of work on continuous ordinary differential equation (ODE) representations of momentum-based acceleration methods. We start by proving that acceleration can be achieved by stable discretization of ODEs using standard Runge-Kutta integrators when the function is sufficiently smooth and convex. We then extend this idea and develop a distributed algorithm for solving convex finite-sum problems over networks; the proposed algorithm achieves acceleration without resorting to Nesterov's momentum approach. Finally, we generalize the result to functions that are quasi-strongly convex but not necessarily convex, and show that acceleration can be achieved in a nontrivial neighborhood of the optimal solution; in particular, this neighborhood can grow as the condition number of the function increases. Altogether, the results provide a systematic way to prove nonasymptotic convergence rates for algorithms derived from ODE discretization.
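The first result described in the abstract, acceleration via stable Runge-Kutta discretization of an acceleration ODE, can be illustrated concretely. Below is a minimal sketch, not the thesis's actual construction: it assumes the Su-Boyd-Candes ODE x''(t) + (3/t) x'(t) + grad f(x(t)) = 0 associated with Nesterov's method, a hypothetical quadratic objective, and an ad hoc step size. The thesis itself derives the smoothness conditions and step sizes under which such discretizations provably achieve accelerated nonasymptotic rates.

```python
import numpy as np

# Toy objective (an assumption for this sketch): a smooth convex quadratic
# f(x) = 0.5 * x^T A x, so grad f(x) = A x and the minimizer is x = 0.
A = np.diag([1.0, 10.0])

def grad_f(x):
    return A @ x

# The Su-Boyd-Candes ODE associated with Nesterov's method,
#   x''(t) + (3/t) x'(t) + grad_f(x(t)) = 0,
# rewritten as a first-order system in z = (x, v) with v = x'.
def ode(t, z):
    x, v = z
    return np.array([v, -(3.0 / t) * v - grad_f(x)])

def rk4_step(t, z, h):
    # One classical fourth-order Runge-Kutta step.
    k1 = ode(t, z)
    k2 = ode(t + h / 2, z + (h / 2) * k1)
    k3 = ode(t + h / 2, z + (h / 2) * k2)
    k4 = ode(t + h, z + h * k3)
    return z + (h / 6) * (k1 + 2 * k2 + 2 * k3 + k4)

# Integrate from t0 > 0 (the ODE is singular at t = 0); the step size
# here is chosen ad hoc, not by the theoretically justified rule.
t, h = 1.0, 0.1
z = np.array([[5.0, -3.0], [0.0, 0.0]])  # initial position and velocity
for _ in range(200):
    z = rk4_step(t, z, h)
    t += h
x = z[0]
print("f(x) after 200 RK4 steps:", 0.5 * x @ A @ x)
```

Classical RK4 is used here only because it is the most familiar Runge-Kutta scheme; the analysis in the thesis concerns standard explicit Runge-Kutta integrators more generally.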
Description
Thesis: S.M., Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science, 2019
 
Cataloged from PDF version of thesis.
 
Includes bibliographical references (pages 85-87).
 
Date issued
2019
URI
https://hdl.handle.net/1721.1/122886
Department
Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
Publisher
Massachusetts Institute of Technology
Keywords
Electrical Engineering and Computer Science.

Collections
  • Graduate Theses
