Random shuffling beats SGD after finite epochs
Author(s)
HaoChen, Jeff; Sra, Suvrit
Publisher Policy
Article is made available in accordance with the publisher's policy and may be subject to US copyright law. Please refer to the publisher's site for terms of use.
Abstract
A long-standing problem in optimization is proving that RandomShuffle, the without-replacement version of SGD, converges faster than (the usual) with-replacement SGD. Building upon (Gürbüzbalaban et al., 2015b), we present the first non-asymptotic results for this problem, proving that after a reasonable number of epochs RandomShuffle converges faster than SGD. Specifically, we prove that for strongly convex, second-order smooth functions, the iterates of RandomShuffle converge to the optimal solution as O(1/T² + n³/T³), where n is the number of components in the objective and T is the number of iterations. This result implies that after O(√n) epochs, RandomShuffle is strictly better than SGD (which converges as O(1/T)). The key step toward showing this better dependence on T is the introduction of n into the bound; as our analysis shows, in general a dependence on n is unavoidable without further changes. To understand how RandomShuffle works in practice, we further explore two valuable settings: data sparsity and over-parameterization. For sparse data, RandomShuffle has the rate Õ(1/T²), again strictly better than SGD. Under a setting closely related to over-parameterization, RandomShuffle is shown to converge faster than SGD after any arbitrary number of iterations. Finally, we extend the analysis of RandomShuffle to smooth convex and some non-convex functions.
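The distinction between the two schemes is purely in how component indices are sampled: with-replacement SGD draws an index independently at each step, while RandomShuffle draws a fresh permutation each epoch and visits every component exactly once. The following minimal sketch (not the authors' code; the least-squares objective, step size, and epoch count are illustrative assumptions) contrasts the two on a strongly convex problem:

```python
import numpy as np

# Illustrative setup (assumed, not from the paper): strongly convex
# least-squares objective f(x) = (1/n) * sum_i (a_i^T x - b_i)^2 / 2.
rng = np.random.default_rng(0)
n, d = 50, 5
A = rng.standard_normal((n, d))
b = rng.standard_normal(n)
x_star = np.linalg.lstsq(A, b, rcond=None)[0]  # optimum, for measuring error

def grad_i(x, i):
    """Gradient of the i-th component f_i(x) = (a_i^T x - b_i)^2 / 2."""
    return A[i] * (A[i] @ x - b[i])

def run(shuffle, epochs=200, lr=0.01):
    x = np.zeros(d)
    for _ in range(epochs):
        if shuffle:
            # RandomShuffle: a fresh permutation; each point used once per epoch.
            order = rng.permutation(n)
        else:
            # Plain SGD: n independent draws with replacement per epoch.
            order = rng.integers(0, n, size=n)
        for i in order:
            x -= lr * grad_i(x, i)
    return np.linalg.norm(x - x_star)

err_sgd = run(shuffle=False)
err_rs = run(shuffle=True)
print(err_sgd, err_rs)
```

Both variants perform the same number of gradient evaluations per epoch; the paper's result is that after roughly O(√n) epochs the without-replacement iterates provably sit closer to the optimum, matching what such small experiments tend to show.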
Date issued
2019-06
Department
Massachusetts Institute of Technology. Institute for Data, Systems, and Society; Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
Journal
36th International Conference on Machine Learning, ICML 2019
Citation
HaoChen, Jeff and Sra, Suvrit. 2019. "Random shuffling beats SGD after finite epochs." 36th International Conference on Machine Learning, ICML 2019, 2019-June.
Version: Final published version