Factor-√2 Acceleration of Accelerated Gradient Methods
Author(s)
Park, Chanwoo; Park, Jisun; Ryu, Ernest K.
Publisher Policy
Article is made available in accordance with the publisher's policy and may be subject to US copyright law. Please refer to the publisher's site for terms of use.
Abstract
The optimized gradient method (OGM) provides a factor-$\sqrt{2}$ speedup upon Nesterov's celebrated accelerated gradient method in the convex (but non-strongly convex) setup. However, this improved acceleration mechanism has not been well understood; prior analyses of OGM relied on a computer-assisted proof methodology, so the proofs were opaque to humans despite being verifiable and correct. In this work, we present a new analysis of OGM based on a Lyapunov function and linear coupling. These analyses are developed and presented without the assistance of computers and are understandable by humans. Furthermore, we generalize OGM's acceleration mechanism and obtain a factor-$\sqrt{2}$ speedup in other setups: acceleration with a simpler rational stepsize, the strongly convex setup, and the mirror descent setup.
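For readers unfamiliar with the method the abstract analyzes, the following is a minimal NumPy sketch of the standard OGM recursion as commonly stated in the literature (Kim & Fessler, 2016), not code from this paper; the gradient oracle grad, the quadratic example, and all parameter names are illustrative assumptions.

import numpy as np

def ogm(grad, x0, L, N):
    """Optimized Gradient Method for an L-smooth convex objective.
    grad: gradient oracle; x0: initial point; L: smoothness constant;
    N: total number of iterations. (Sketch; parameter names are ours.)"""
    x = np.array(x0, dtype=float)
    y = x.copy()
    theta = 1.0
    for k in range(N):
        y_next = x - grad(x) / L  # plain gradient step
        # The momentum parameter uses a modified rule on the final
        # iteration (factor 8 instead of 4), as in Kim & Fessler.
        if k == N - 1:
            theta_next = (1 + np.sqrt(1 + 8 * theta**2)) / 2
        else:
            theta_next = (1 + np.sqrt(1 + 4 * theta**2)) / 2
        # OGM adds the extra term (theta/theta_next)*(y_next - x) on top
        # of Nesterov-style momentum; this term underlies the
        # factor-sqrt(2) speedup discussed in the abstract.
        x = (y_next
             + (theta - 1) / theta_next * (y_next - y)
             + theta / theta_next * (y_next - x))
        y, theta = y_next, theta_next
    return x

# Hypothetical usage: minimize f(x) = 0.5 * x^T A x, whose smoothness
# constant L is the largest eigenvalue of A.
A = np.diag([1.0, 10.0])
x_star = ogm(lambda x: A @ x, x0=np.ones(2), L=10.0, N=50)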
Date issued
2023-08-23
Department
Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science; Massachusetts Institute of Technology. Laboratory for Information and Decision Systems
Publisher
Springer US
Citation
Applied Mathematics & Optimization. 2023 Aug 23;88(3):77
Version: Author's final manuscript