| dc.contributor.author | Freund, Robert Michael | |
| dc.contributor.author | Lu, Haihao | |
| dc.date.accessioned | 2018-07-11T13:24:07Z | |
| dc.date.available | 2018-07-11T13:24:07Z | |
| dc.date.issued | 2017-06 | |
| dc.date.submitted | 2017-03 | |
| dc.identifier.issn | 0025-5610 | |
| dc.identifier.issn | 1436-4646 | |
| dc.identifier.uri | http://hdl.handle.net/1721.1/116877 | |
| dc.description.abstract | Motivated by recent work of Renegar, we present new computational methods and associated computational guarantees for solving convex optimization problems using first-order methods. Our problem of interest is the general convex optimization problem $f^* = \min_{x \in Q} f(x)$, where we presume knowledge of a strict lower bound $f_{\mathrm{slb}} < f^*$. [Indeed, $f_{\mathrm{slb}}$ is naturally known when optimizing many loss functions in statistics and machine learning (least-squares, logistic loss, exponential loss, total variation loss, etc.) as well as in Renegar's transformed version of the standard conic optimization problem; in all these cases one has $f_{\mathrm{slb}} = 0 < f^*$.] We introduce a new functional measure called the growth constant $G$ for $f(\cdot)$, which measures how quickly the level sets of $f(\cdot)$ grow relative to the function value, and which plays a fundamental role in the complexity analysis. When $f(\cdot)$ is non-smooth, we present new computational guarantees for the Subgradient Descent Method and for smoothing methods that can improve existing computational guarantees in several ways, most notably when the initial iterate $x^0$ is far from the optimal solution set. When $f(\cdot)$ is smooth, we present a scheme for periodically restarting the Accelerated Gradient Method that can also improve existing computational guarantees when $x^0$ is far from the optimal solution set, and in the presence of added structure we present a scheme using parametrically increased smoothing that further improves the associated computational guarantees. | en_US |
| dc.publisher | Springer Berlin Heidelberg | en_US |
| dc.relation.isversionof | https://doi.org/10.1007/s10107-017-1164-1 | en_US |
| dc.rights | Creative Commons Attribution-Noncommercial-Share Alike | en_US |
| dc.rights.uri | http://creativecommons.org/licenses/by-nc-sa/4.0/ | en_US |
| dc.source | Springer Berlin Heidelberg | en_US |
| dc.title | New computational guarantees for solving convex optimization problems with first order methods, via a function growth condition measure | en_US |
| dc.type | Article | en_US |
| dc.identifier.citation | Freund, Robert M., and Haihao Lu. “New Computational Guarantees for Solving Convex Optimization Problems with First Order Methods, via a Function Growth Condition Measure.” Mathematical Programming, vol. 170, no. 2, Aug. 2018, pp. 445–77. | en_US |
| dc.contributor.department | Massachusetts Institute of Technology. Department of Mathematics | en_US |
| dc.contributor.department | Sloan School of Management | en_US |
| dc.contributor.mitauthor | Freund, Robert Michael | |
| dc.contributor.mitauthor | Lu, Haihao | |
| dc.relation.journal | Mathematical Programming | en_US |
| dc.eprint.version | Author's final manuscript | en_US |
| dc.type.uri | http://purl.org/eprint/type/JournalArticle | en_US |
| eprint.status | http://purl.org/eprint/status/PeerReviewed | en_US |
| dc.date.updated | 2018-07-04T04:43:08Z | |
| dc.language.rfc3066 | en | |
| dc.rights.holder | Springer-Verlag Berlin Heidelberg and Mathematical Optimization Society | |
| dspace.orderedauthors | Freund, Robert M.; Lu, Haihao | en_US |
| dspace.embargo.terms | N | en |
| dc.identifier.orcid | https://orcid.org/0000-0002-1733-5363 | |
| dc.identifier.orcid | https://orcid.org/0000-0002-5217-1894 | |
| mit.license | OPEN_ACCESS_POLICY | en_US |