Stochastic methods for large-scale linear problems, variational inequalities, and convex optimization
Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science.
Advisor: Dimitri P. Bertsekas.
This thesis considers stochastic methods for large-scale linear systems, variational inequalities, and convex optimization problems. I focus on special structures that lend themselves to sampling, such as when the linear/nonlinear mapping or the objective function is an expected value or a sum of a large number of terms, and/or the constraint is the intersection of a large number of simpler sets. For linear systems, I propose modifications to deterministic methods that allow the use of random samples while preserving convergence, which is particularly challenging when the unknown system is singular or nearly singular. For variational inequalities and optimization problems, I propose a class of methods that combine elements of incremental constraint projection, stochastic gradient/subgradient descent, and proximal algorithms. These methods can be applied with various sampling schemes suitable for applications involving distributed implementation, large data sets, or online learning. I use a unified framework to analyze the convergence and the rate of convergence of these methods. This framework is based on a pair of supermartingale bounds, which control the convergence to feasibility and the convergence to optimality, respectively, and are coupled at different time scales.
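To make the flavor of these methods concrete, the following is a minimal sketch (not the thesis's exact algorithm) of one iteration pattern the abstract describes: take a stochastic gradient step using a randomly sampled component of the objective, then project onto a single randomly sampled constraint set rather than the full intersection. The toy problem, the data, and the constant stepsize are all illustrative assumptions; the actual analysis typically uses diminishing stepsizes, and the constant stepsize works here only because the sampled least-squares terms are exactly consistent at the solution.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy feasible set: intersection of m halfspaces {x : a_i . x <= b_i},
# constructed so that x* = (1, ..., 1) is strictly feasible.
m, n = 20, 5
A = rng.normal(size=(m, n))
b = A @ np.ones(n) + 1.0

# Toy objective: f(x) = (1/2N) sum_i (c_i . x - d_i)^2, sampled one
# term at a time; it is minimized at x* = (1, ..., 1).
N = 50
C = rng.normal(size=(N, n))
d = C @ np.ones(n)

def halfspace_proj(x, a, beta):
    """Project x onto the halfspace {y : a . y <= beta} (closed form)."""
    viol = a @ x - beta
    if viol <= 0.0:
        return x
    return x - (viol / (a @ a)) * a

x = np.zeros(n)
alpha = 0.05  # small constant stepsize (illustrative assumption)
for k in range(5000):
    i = rng.integers(N)                  # sample one objective term
    g = C[i] * (C[i] @ x - d[i])         # stochastic gradient of that term
    j = rng.integers(m)                  # sample one constraint
    x = halfspace_proj(x - alpha * g, A[j], b[j])

# x approaches the feasible minimizer x* = (1, ..., 1).
```

Note that each iteration touches only one objective term and one constraint, which is what makes this pattern attractive for large sums of functions and intersections of many sets; the coupling between progress toward feasibility and progress toward optimality is exactly what the supermartingale framework in the abstract is designed to control.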
Thesis (Ph.D.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2013. Cataloged from PDF version of thesis. Includes bibliographical references (p. 201-207).