Global Convergence Rate of Proximal Incremental Aggregated Gradient Methods
Author(s)
Vanli, Nuri Denizcan; Gurbuzbalaban, Mert; Ozdaglar, Asuman E
Abstract
We focus on the problem of minimizing the sum of smooth component functions (where the sum is strongly convex) and a nonsmooth convex function, which arises in regularized empirical risk minimization in machine learning and distributed constrained optimization in wireless sensor networks and smart grids. We consider solving this problem using the proximal incremental aggregated gradient (PIAG) method, which at each iteration moves along an aggregated gradient (formed by incrementally updating gradients of component functions according to a deterministic order) and takes a proximal step with respect to the nonsmooth function. While the convergence properties of this method with randomized orders (in updating gradients of component functions) have been investigated, this paper, to the best of our knowledge, is the first study that establishes the convergence rate properties of the PIAG method for any deterministic order. In particular, we show that the PIAG algorithm is globally convergent with a linear rate provided that the step size is sufficiently small. We explicitly identify the rate of convergence and the corresponding step size to achieve this convergence rate. Our results improve upon the best known condition number and gradient delay bound dependence of the convergence rate of the incremental aggregated gradient methods used for minimizing a sum of smooth functions.

Key words: convex optimization, nonsmooth optimization, proximal incremental aggregated gradient method
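The iteration described above can be illustrated with a minimal sketch. This is not the paper's code: the cyclic update order, the quadratic components f_i(x) = (x - a_i)^2 / 2, the l1 regularizer, and the step size below are all illustrative assumptions; the abstract only specifies that one component gradient is refreshed per iteration (in a deterministic order) and that a proximal step follows.

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def piag(grads, prox, x0, step, n_iters):
    """PIAG-style sketch: refresh one component gradient per iteration
    in a fixed cyclic order, move along the aggregate of the (possibly
    stale) component gradients, then apply the proximal operator of the
    nonsmooth term."""
    n = len(grads)
    x = np.array(x0, dtype=float)
    # table of the most recently evaluated component gradients
    g = [grads[i](x) for i in range(n)]
    agg = sum(g)
    for k in range(n_iters):
        i = k % n                          # deterministic (cyclic) order
        agg = agg - g[i]
        g[i] = grads[i](x)                 # refresh component i at the current iterate
        agg = agg + g[i]
        x = prox(x - step * agg, step)     # proximal step w.r.t. the nonsmooth function
    return x
```

As a usage sketch, minimizing sum_i (x - a_i)^2 / 2 + lam * |x| with a = (1, 2, 3) and lam = 0.3 has the closed-form solution soft_threshold(mean(a), lam / 3) = 1.9, which the iteration approaches linearly when the step size is small, consistent with the abstract's claim.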
Date issued
2018-05-18
Department
Massachusetts Institute of Technology. Laboratory for Information and Decision Systems; Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
Publisher
Society for Industrial & Applied Mathematics (SIAM)
Citation
Vanli, N. D., et al. "Global Convergence Rate of Proximal Incremental Aggregated Gradient Methods." SIAM Journal on Optimization 28, no. 2 (January 2018): 1282–1300. © 2018 Society for Industrial and Applied Mathematics
Version: Final published version
ISSN
1052-6234
1095-7189