dc.contributor.author	Freund, Robert Michael
dc.contributor.author	Lu, Haihao
dc.date.accessioned	2018-07-11T13:24:07Z
dc.date.available	2018-07-11T13:24:07Z
dc.date.issued	2017-06
dc.date.submitted	2017-03
dc.identifier.issn	0025-5610
dc.identifier.issn	1436-4646
dc.identifier.uri	http://hdl.handle.net/1721.1/116877
dc.description.abstract	Motivated by recent work of Renegar, we present new computational methods and associated computational guarantees for solving convex optimization problems using first-order methods. Our problem of interest is the general convex optimization problem $f^* = \min_{x \in Q} f(x)$, where we presume knowledge of a strict lower bound $f_{\mathrm{slb}} < f^*$. [Indeed, $f_{\mathrm{slb}}$ is naturally known when optimizing many loss functions in statistics and machine learning (least-squares, logistic loss, exponential loss, total variation loss, etc.) as well as in Renegar's transformed version of the standard conic optimization problem; in all these cases one has $f_{\mathrm{slb}} = 0 < f^*$.] We introduce a new functional measure called the growth constant $G$ for $f(\cdot)$, which measures how quickly the level sets of $f(\cdot)$ grow relative to the function value and which plays a fundamental role in the complexity analysis. When $f(\cdot)$ is non-smooth, we present new computational guarantees for the Subgradient Descent Method and for smoothing methods that can improve existing computational guarantees in several ways, most notably when the initial iterate $x^0$ is far from the optimal solution set. When $f(\cdot)$ is smooth, we present a scheme for periodically restarting the Accelerated Gradient Method that can also improve existing computational guarantees when $x^0$ is far from the optimal solution set, and in the presence of added structure we present a scheme using parametrically increased smoothing that further improves the associated computational guarantees.	en_US
dc.publisher	Springer Berlin Heidelberg	en_US
dc.relation.isversionof	https://doi.org/10.1007/s10107-017-1164-1	en_US
dc.rights	Creative Commons Attribution-Noncommercial-Share Alike	en_US
dc.rights.uri	http://creativecommons.org/licenses/by-nc-sa/4.0/	en_US
dc.source	Springer Berlin Heidelberg	en_US
dc.title	New computational guarantees for solving convex optimization problems with first order methods, via a function growth condition measure	en_US
dc.type	Article	en_US
dc.identifier.citation	Freund, Robert M., and Haihao Lu. “New Computational Guarantees for Solving Convex Optimization Problems with First Order Methods, via a Function Growth Condition Measure.” Mathematical Programming, vol. 170, no. 2, Aug. 2018, pp. 445–77.	en_US
dc.contributor.department	Massachusetts Institute of Technology. Department of Mathematics	en_US
dc.contributor.department	Sloan School of Management	en_US
dc.contributor.mitauthor	Freund, Robert Michael
dc.contributor.mitauthor	Lu, Haihao
dc.relation.journal	Mathematical Programming	en_US
dc.eprint.version	Author's final manuscript	en_US
dc.type.uri	http://purl.org/eprint/type/JournalArticle	en_US
eprint.status	http://purl.org/eprint/status/PeerReviewed	en_US
dc.date.updated	2018-07-04T04:43:08Z
dc.language.rfc3066	en
dc.rights.holder	Springer-Verlag Berlin Heidelberg and Mathematical Optimization Society
dspace.orderedauthors	Freund, Robert M.; Lu, Haihao	en_US
dspace.embargo.terms	N	en
dc.identifier.orcid	https://orcid.org/0000-0002-1733-5363
dc.identifier.orcid	https://orcid.org/0000-0002-5217-1894
mit.license	OPEN_ACCESS_POLICY	en_US
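
The abstract above centers on exploiting a known strict lower bound $f_{\mathrm{slb}} < f^*$. As a minimal illustration of how such a bound can drive a step-size rule, here is a sketch of a subgradient method with a Polyak-type step that substitutes $f_{\mathrm{slb}}$ for the unknown optimal value $f^*$. This is a generic sketch, not the paper's actual scheme: the function names and the least-squares test instance are assumptions for illustration, and the paper's guarantees are stated via its growth constant $G$, which is not modeled here.

import numpy as np

def subgradient_method_with_slb(f, subgrad, x0, f_slb, max_iters=100):
    """Subgradient method whose Polyak-type step uses a known strict
    lower bound f_slb < f* in place of the unknown optimal value f*.

    Generic illustration only (hypothetical names); not the paper's
    restarting or smoothing schemes.
    """
    x = np.asarray(x0, dtype=float)
    best_x, best_val = x.copy(), f(x)
    for _ in range(max_iters):
        g = subgrad(x)
        gnorm2 = float(np.dot(g, g))
        if gnorm2 == 0.0:               # zero (sub)gradient: x is optimal
            break
        # Polyak-type step: gap to the lower bound over squared norm.
        x = x - ((f(x) - f_slb) / gnorm2) * g
        val = f(x)
        if val < best_val:              # the method is not monotone
            best_x, best_val = x.copy(), val
    return best_x, best_val

# Toy instance: an inconsistent least-squares problem, where f_slb = 0
# is a valid strict lower bound (one of the cases the abstract cites).
A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
b = np.array([1.0, 2.0, 0.0])
f = lambda x: 0.5 * np.sum((A @ x - b) ** 2)
grad = lambda x: A.T @ (A @ x - b)      # gradient = unique subgradient
x_best, f_best = subgradient_method_with_slb(f, grad, np.zeros(2), f_slb=0.0)
# Here f* = 1.5; the classical guarantee for this naive step rule is
# only best_val <= 2*f* - f_slb = 3.0, though the early iterates
# already land near f* on this instance.
print(x_best, f_best)

The gap between this naive rule's worst-case bound and $f^*$ is the kind of slack the paper addresses: per the abstract, its guarantees in terms of $G$ improve on existing ones most notably when $x^0$ is far from the optimal solution set.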

