Show simple item record

dc.contributor.author: Wang, Mengdi
dc.contributor.author: Bertsekas, Dimitri P.
dc.date.accessioned: 2016-07-20T16:52:35Z
dc.date.available: 2016-07-20T16:52:35Z
dc.date.issued: 2016-03
dc.date.submitted: 2015-12
dc.identifier.issn: 1052-6234
dc.identifier.issn: 1095-7189
dc.identifier.uri: http://hdl.handle.net/1721.1/103776
dc.description.abstract: We consider convex optimization problems with structures that are suitable for sequential treatment or online sampling. In particular, we focus on problems where the objective function is an expected value, and the constraint set is the intersection of a large number of simpler sets. We propose an algorithmic framework for stochastic first-order methods using random projection/proximal updates and random constraint updates, which contain as special cases several known algorithms as well as many new algorithms. To analyze the convergence of these algorithms in a unified manner, we prove a general coupled convergence theorem. It states that the convergence is obtained from an interplay between two coupled processes: progress toward feasibility and progress toward optimality. Under suitable stepsize assumptions, we show that the optimality error decreases at a rate of $\mathcal{O}(1/\sqrt{k})$ and the feasibility error decreases at a rate of $\mathcal{O}(\log k/k)$. We also consider a number of typical sampling processes for generating stochastic first-order information and random constraints, which are common in data-intensive applications, online learning, and simulation optimization. By using the coupled convergence theorem as a modular architecture, we are able to analyze the convergence of stochastic algorithms that use arbitrary combinations of these sampling processes. [en_US]
dc.description.sponsorship: United States. Air Force (grant FA9550-10-1-0412) [en_US]
dc.language.iso: en_US
dc.publisher: Society for Industrial and Applied Mathematics (SIAM) [en_US]
dc.relation.isversionof: http://dx.doi.org/10.1137/130931278 [en_US]
dc.rights: Article is made available in accordance with the publisher's policy and may be subject to US copyright law. Please refer to the publisher's site for terms of use. [en_US]
dc.source: Society for Industrial and Applied Mathematics [en_US]
dc.title: Stochastic First-Order Methods with Random Constraint Projection [en_US]
dc.type: Article [en_US]
dc.identifier.citation: Wang, Mengdi, and Dimitri P. Bertsekas. “Stochastic First-Order Methods with Random Constraint Projection.” SIAM Journal on Optimization 26, no. 1 (January 2016): 681–717. [en_US]
dc.contributor.department: Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science [en_US]
dc.contributor.department: Massachusetts Institute of Technology. Laboratory for Information and Decision Systems [en_US]
dc.contributor.mitauthor: Bertsekas, Dimitri P. [en_US]
dc.relation.journal: SIAM Journal on Optimization [en_US]
dc.eprint.version: Final published version [en_US]
dc.type.uri: http://purl.org/eprint/type/JournalArticle [en_US]
eprint.status: http://purl.org/eprint/status/PeerReviewed [en_US]
dspace.orderedauthors: Wang, Mengdi; Bertsekas, Dimitri P. [en_US]
dspace.embargo.terms: N [en_US]
dc.identifier.orcid: https://orcid.org/0000-0001-6909-7208
mit.license: PUBLISHER_POLICY [en_US]
mit.metadata.status: Complete
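
The abstract above describes iterations that combine a stochastic (sampled) first-order step with projection onto a randomly selected member of the constraint intersection. The sketch below is an illustrative instance only, not the paper's exact algorithm: it assumes halfspace constraints, a caller-supplied unbiased sampled-gradient oracle, and a simple 1/sqrt(k) diminishing stepsize, and it omits the relaxed-projection and proximal variants and the specific sampling schemes analyzed in the paper. All function and variable names are hypothetical.

    import numpy as np

    def project_halfspace(x, a, b):
        # Euclidean projection of x onto the halfspace {y : a^T y <= b}.
        violation = a @ x - b
        if violation <= 0.0:
            return x
        return x - (violation / (a @ a)) * a

    def random_projection_sgd(grad_sample, halfspaces, x0, num_iters=10000, seed=0):
        # grad_sample(x, rng): unbiased sample of the gradient of the expected objective at x.
        # halfspaces: list of (a, b) pairs whose intersection is the feasible set.
        rng = np.random.default_rng(seed)
        x = np.asarray(x0, dtype=float)
        for k in range(1, num_iters + 1):
            alpha = 1.0 / np.sqrt(k)                           # illustrative diminishing stepsize
            x = x - alpha * grad_sample(x, rng)                # stochastic first-order step
            a, b = halfspaces[rng.integers(len(halfspaces))]   # sample one constraint at random
            x = project_halfspace(x, a, b)                     # random constraint projection
        return x

    # Toy usage: minimize E[||x - v||^2], v ~ N((1, 1), I), over the box [0, 0.5]^2
    # written as four halfspaces; the constrained solution is (0.5, 0.5).
    halfspaces = [(np.array([1.0, 0.0]), 0.5), (np.array([-1.0, 0.0]), 0.0),
                  (np.array([0.0, 1.0]), 0.5), (np.array([0.0, -1.0]), 0.0)]
    grad = lambda x, rng: 2.0 * (x - (1.0 + rng.standard_normal(2)))
    print(random_projection_sgd(grad, halfspaces, x0=np.zeros(2)))

Only one constraint is enforced per iteration, so the final iterate need not be exactly feasible; per the abstract, under suitable stepsize assumptions the optimality error of such methods decreases at rate $\mathcal{O}(1/\sqrt{k})$ and the feasibility error at rate $\mathcal{O}(\log k/k)$. This script illustrates only the mechanics of the update.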

