From Proximal Point Method to Accelerated Methods on Riemannian Manifolds
Author(s)
Ahn, Kwangjun
Thesis PDF (902.9 KB)
Advisor
Sra, Suvrit
Abstract
Recently, there has been significant effort to generalize successful ideas in Euclidean optimization to Riemannian optimization. However, one landmark result of Euclidean optimization has eluded the Riemannian setting: namely, a Riemannian analog of Nesterov's accelerated gradient method (AGM). In this thesis, we establish the first globally accelerated gradient method for Riemannian manifolds.
Toward establishing our result, the first part of the thesis revisits Nesterov's AGM and develops a conceptually simple understanding of it based on the proximal point method (PPM). The main observation is that AGM is in fact an approximation of PPM, which results in simple derivations and analyses of different versions of AGM.
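The following is a minimal numerical sketch of the Euclidean observation above, not the thesis's construction: on a simple strongly convex quadratic, both the proximal point method (which solves a regularized subproblem exactly at each step) and a constant-momentum variant of Nesterov's AGM (which can be viewed as approximating that subproblem with a gradient step at an extrapolated point) drive the objective to its minimum. The quadratic, step sizes, and iteration counts are illustrative choices.

```python
import numpy as np

# Illustrative strongly convex quadratic f(x) = 0.5 * x^T A x, minimized at 0.
A = np.diag([1.0, 10.0])  # smoothness L = 10, strong convexity mu = 1

def f(x):
    return 0.5 * x @ A @ x

def grad(x):
    return A @ x

def ppm(x0, lam=1.0, iters=100):
    """Proximal point method: x_{k+1} = argmin_x f(x) + ||x - x_k||^2 / (2*lam).
    For a quadratic, each prox subproblem has the closed-form solution below."""
    x = x0.copy()
    I = np.eye(len(x0))
    for _ in range(iters):
        x = np.linalg.solve(I + lam * A, x)
    return x

def agm(x0, L=10.0, mu=1.0, iters=100):
    """Nesterov's AGM (constant momentum for mu-strongly convex, L-smooth f).
    The gradient step at the extrapolated point y approximates the prox step."""
    q = np.sqrt(mu / L)
    beta = (1 - q) / (1 + q)  # momentum coefficient
    x, y = x0.copy(), x0.copy()
    for _ in range(iters):
        x_next = y - grad(y) / L           # approximate prox via gradient at y
        y = x_next + beta * (x_next - x)   # extrapolation (momentum)
        x = x_next
    return x

x0 = np.array([5.0, 5.0])
# Both methods converge to the minimizer; AGM does so with cheap
# gradient steps rather than exact subproblem solves.
```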
The second part of the thesis then extends our simple approach to the Riemannian case. In our extension, we handle a technical hurdle inherent to the Riemannian case by introducing an appropriate notion of "metric distortion." We control this distortion via a novel geometric inequality, which enables us to formulate and analyze global Riemannian acceleration.
Date issued
2021-06
Department
Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
Publisher
Massachusetts Institute of Technology