dc.contributor.author | Wang, Mengdi | |
dc.contributor.author | Bertsekas, Dimitri P. | |
dc.date.accessioned | 2016-07-20T16:52:35Z | |
dc.date.available | 2016-07-20T16:52:35Z | |
dc.date.issued | 2016-03 | |
dc.date.submitted | 2015-12 | |
dc.identifier.issn | 1052-6234 | |
dc.identifier.issn | 1095-7189 | |
dc.identifier.uri | http://hdl.handle.net/1721.1/103776 | |
dc.description.abstract | We consider convex optimization problems with structures that are suitable for sequential treatment or online sampling. In particular, we focus on problems where the objective function is an expected value, and the constraint set is the intersection of a large number of simpler sets. We propose an algorithmic framework for stochastic first-order methods using random projection/proximal updates and random constraint updates, which contains as special cases several known algorithms as well as many new ones. To analyze the convergence of these algorithms in a unified manner, we prove a general coupled convergence theorem. It states that convergence is obtained from an interplay between two coupled processes: progress toward feasibility and progress toward optimality. Under suitable stepsize assumptions, we show that the optimality error decreases at a rate of $\mathcal{O}(1/\sqrt{k})$ and the feasibility error decreases at a rate of $\mathcal{O}(\log k/k)$. We also consider a number of typical sampling processes for generating stochastic first-order information and random constraints, which are common in data-intensive applications, online learning, and simulation optimization. By using the coupled convergence theorem as a modular architecture, we are able to analyze the convergence of stochastic algorithms that use arbitrary combinations of these sampling processes. | en_US
dc.description.sponsorship | United States. Air Force (grant FA9550-10-1-0412) | en_US |
dc.language.iso | en_US | |
dc.publisher | Society for Industrial and Applied Mathematics (SIAM) | en_US |
dc.relation.isversionof | http://dx.doi.org/10.1137/130931278 | en_US |
dc.rights | Article is made available in accordance with the publisher's policy and may be subject to US copyright law. Please refer to the publisher's site for terms of use. | en_US |
dc.source | Society for Industrial and Applied Mathematics | en_US |
dc.title | Stochastic First-Order Methods with Random Constraint Projection | en_US |
dc.type | Article | en_US |
dc.identifier.citation | Wang, Mengdi, and Dimitri P. Bertsekas. “Stochastic First-Order Methods with Random Constraint Projection.” SIAM Journal on Optimization 26, no. 1 (January 2016): 681–717. | en_US |
dc.contributor.department | Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science | en_US |
dc.contributor.department | Massachusetts Institute of Technology. Laboratory for Information and Decision Systems | en_US |
dc.contributor.mitauthor | Bertsekas, Dimitri P. | en_US |
dc.relation.journal | SIAM Journal on Optimization | en_US |
dc.eprint.version | Final published version | en_US |
dc.type.uri | http://purl.org/eprint/type/JournalArticle | en_US |
eprint.status | http://purl.org/eprint/status/PeerReviewed | en_US |
dspace.orderedauthors | Wang, Mengdi; Bertsekas, Dimitri P. | en_US |
dspace.embargo.terms | N | en_US |
dc.identifier.orcid | https://orcid.org/0000-0001-6909-7208 | |
mit.license | PUBLISHER_POLICY | en_US |
mit.metadata.status | Complete | |
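As a rough illustration of the class of methods the abstract describes (not the paper's exact algorithm, stepsize rule, or sampling scheme), the sketch below pairs one stochastic gradient step with a projection onto a single randomly sampled component of the constraint intersection per iteration. The least-squares objective, the halfspace constraints, and all function names are hypothetical, chosen only because both the gradient samples and the projections have closed forms.

```python
import numpy as np

def random_projection_sgd(grad_sample, project_onto, num_sets, x0, steps=20000, a=1.0):
    """Hypothetical sketch: one stochastic (sub)gradient step followed by a
    projection onto one randomly chosen component of the constraint intersection."""
    x = np.array(x0, dtype=float)
    for k in range(1, steps + 1):
        alpha = a / np.sqrt(k)                    # a diminishing stepsize (one common choice)
        g = grad_sample(x)                        # sampled gradient of the expected objective
        i = np.random.randint(num_sets)           # uniformly sampled constraint index
        x = project_onto(i, x - alpha * g)        # project onto the i-th simple set only
    return x

# Illustrative instance: minimize E[(a_j^T x - b_j)^2] over the intersection
# of halfspaces {x : c_i^T x <= d_i} (all data below is synthetic).
rng = np.random.default_rng(0)
A, b = rng.normal(size=(500, 5)), rng.normal(size=500)
C, d = rng.normal(size=(20, 5)), np.abs(rng.normal(size=20))

def grad_sample(x):
    j = rng.integers(len(b))                      # sample one data row per iteration
    return 2.0 * (A[j] @ x - b[j]) * A[j]

def project_onto(i, y):
    viol = C[i] @ y - d[i]                        # halfspace projection in closed form
    return y if viol <= 0 else y - viol * C[i] / (C[i] @ C[i])

x_hat = random_projection_sgd(grad_sample, project_onto, len(d), np.zeros(5))
```

The point of the sketch is only the structure of the update: each iteration touches one sampled data point and one sampled constraint set, which is what makes such methods attractive when the expectation and the constraint intersection are both too large to handle in full at every step.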