Acceleration by stepsize hedging: Silver Stepsize Schedule for smooth convex optimization
Author(s)
Altschuler, Jason M.; Parrilo, Pablo A.
Terms of use
Creative Commons Attribution
Abstract
We provide a concise, self-contained proof that the Silver Stepsize Schedule proposed in our companion paper directly applies to smooth (non-strongly) convex optimization. Specifically, we show that with these stepsizes, gradient descent computes an ε-minimizer in O(ε^{−log_ρ 2}) = O(ε^{−0.7864}) iterations, where ρ = 1 + √2 is the silver ratio. This is intermediate between the textbook unaccelerated rate O(ε^{−1}) and the accelerated rate O(ε^{−1/2}) due to Nesterov in 1983. The Silver Stepsize Schedule is a simple explicit fractal: the i-th stepsize is 1 + ρ^{ν(i)−1}, where ν(i) is the 2-adic valuation of i. The design and analysis are conceptually identical to the strongly convex setting in our companion paper but simplify remarkably in this specific setting.
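The schedule described in the abstract is fully explicit, so it can be sketched in a few lines. The snippet below is a minimal illustration (not the authors' code): it computes the 2-adic valuation ν(i) and the i-th silver stepsize 1 + ρ^{ν(i)−1}; the convention that these stepsizes are normalized by the smoothness constant L (i.e., the step taken is h_i/L) is an assumption consistent with standard gradient-descent scaling, not stated in this abstract.

```python
import math

RHO = 1 + math.sqrt(2)  # the silver ratio

def nu(i: int) -> int:
    """2-adic valuation of i >= 1: exponent of the largest power of 2 dividing i."""
    v = 0
    while i % 2 == 0:
        i //= 2
        v += 1
    return v

def silver_stepsize(i: int) -> float:
    """i-th stepsize of the Silver Stepsize Schedule: 1 + rho^(nu(i) - 1).

    Assumed to be in units of 1/L, where L is the smoothness constant.
    """
    return 1 + RHO ** (nu(i) - 1)

# The fractal pattern for i = 1..8:
# 1 + 1/rho, 2, 1 + 1/rho, 1 + rho, 1 + 1/rho, 2, 1 + 1/rho, 1 + rho^2
print([round(silver_stepsize(i), 4) for i in range(1, 9)])
# → [1.4142, 2.0, 1.4142, 3.4142, 1.4142, 2.0, 1.4142, 6.8284]
```

Note the self-similar structure: odd indices always get the short step 1 + 1/ρ = √2, and the step at index 2k is determined by the step at index k.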
Date issued
2024-11-25
Department
Massachusetts Institute of Technology. Laboratory for Information and Decision Systems
Journal
Mathematical Programming
Publisher
Springer Berlin Heidelberg
Citation
Altschuler, J.M., Parrilo, P.A. Acceleration by stepsize hedging: Silver Stepsize Schedule for smooth convex optimization. Math. Program. (2024).
Version: Final published version