Conditional gradient methods via stochastic path-integrated differential estimator
Author(s)
Yurtsever, Alp; Sra, Suvrit; Cevher, Volkan
Terms of use
Publisher Policy: Article is made available in accordance with the publisher's policy and may be subject to US copyright law. Please refer to the publisher's site for terms of use.
Abstract
We propose a class of novel variance-reduced stochastic conditional gradient methods. By adapting the recent stochastic path-integrated differential estimator (SPIDER) technique of Fang et al. (2018) to the classical Frank-Wolfe (FW) method, we introduce SPIDER-FW for finite-sum minimization as well as the more general expectation minimization problems. SPIDER-FW enjoys superior complexity guarantees in the non-convex setting, while matching the best known FW variants in the convex case. We also extend our framework à la the conditional gradient sliding (CGS) method of Lan & Zhou (2016), and propose SPIDER-CGS.
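As context for the abstract, the following is a minimal Python sketch of the core idea: a Frank-Wolfe loop driven by a SPIDER-style recursive gradient estimator. The problem instance (least squares over an l1-ball), the function names, and the step-size and epoch-length choices are illustrative assumptions, not the paper's exact algorithm or parameters.

# Minimal SPIDER-FW sketch (illustrative assumptions; not the paper's exact method).
# Problem: min_x (1/n) * sum_i 0.5 * (a_i^T x - b_i)^2 over the l1-ball of radius r.
import numpy as np

def lmo_l1(grad, radius):
    """Linear minimization oracle over the l1-ball: argmin_{||s||_1 <= r} <grad, s>."""
    i = np.argmax(np.abs(grad))
    s = np.zeros_like(grad)
    s[i] = -radius * np.sign(grad[i])
    return s

def stoch_grad(A, b, x, idx):
    """Mini-batch gradient of the least-squares loss on rows `idx`."""
    Ai = A[idx]
    return Ai.T @ (Ai @ x - b[idx]) / len(idx)

def spider_fw(A, b, radius=1.0, epochs=20, epoch_len=50, batch=32, seed=0):
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    for _ in range(epochs):
        v = A.T @ (A @ x - b) / n            # full gradient at the epoch anchor
        for t in range(epoch_len):
            s = lmo_l1(v, radius)            # FW direction from the LMO
            gamma = 2.0 / (t + 2)            # classical FW step size (assumed)
            x_new = x + gamma * (s - x)
            idx = rng.choice(n, size=batch, replace=False)
            # SPIDER recursion: v_{t+1} = v_t + grad_S(x_{t+1}) - grad_S(x_t),
            # i.e. correct the running estimate with a mini-batch gradient difference
            v = v + stoch_grad(A, b, x_new, idx) - stoch_grad(A, b, x, idx)
            x = x_new
    return x

Calling spider_fw(A, b) on a data matrix A and target vector b returns an approximate constrained minimizer; the recursive correction keeps the variance of v low between the periodic full-gradient refreshes, which is what yields the improved complexity the abstract refers to.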
Date issued
2019-06
Department
Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
Journal
Proceedings of Machine Learning Research
Publisher
International Machine Learning Society
Citation
Yurtsever, Alp et al. “Conditional gradient methods via stochastic path-integrated differential estimator.” Proceedings of Machine Learning Research 97, 36th International Conference on Machine Learning (ICML 2019), Long Beach, California, 9-15 June 2019, International Machine Learning Society: 7282-7291 © 2019 The Author(s)
Version: Final published version
ISSN
2640-3498