Show simple item record

dc.contributor.advisor: Freund, Robert M.
dc.contributor.author: Xiong, Zikai
dc.date.accessioned: 2025-07-29T17:20:01Z
dc.date.available: 2025-07-29T17:20:01Z
dc.date.issued: 2025-05
dc.date.submitted: 2025-05-23T18:10:18.815Z
dc.identifier.uri: https://hdl.handle.net/1721.1/162139
dc.description.abstract: In the last several years there has been a dramatic shift in the way many large-scale linear programs (LPs) are solved in practice, with the classic methods (the simplex method and interior-point methods) being replaced by the primal-dual hybrid gradient method (PDHG). While PDHG, with heuristic enhancements and GPU implementations, has been very successful in solving large-scale LP problems, its performance can vary substantially across instances, and an intuitive understanding of the drivers of its performance has been lacking. In this context the research in this thesis has three related goals: (i) the development of new theory to explain the performance of PDHG for large-scale LPs, (ii) the development of new practical methods for solving large-scale LP problems based on PDHG, and (iii) the generalization of this new theory and these new practical methods to the more general class of conic optimization problems.

The thesis is organized as follows. Chapter 1 is an introduction and a unified summary of the thesis research as a whole. Chapter 2 presents computational guarantees for PDHG for solving LP problems based on two natural instance-dependent geometric condition measures, namely the "limiting error ratio" and the "LP sharpness"; the connection between these condition measures and other LP condition numbers is also developed. Chapter 3 presents computational guarantees for more general conic optimization problems using the geometry of the primal-dual (sub)level sets. Based on this analysis we propose a central-path Hessian-based rescaling that enhances algorithmic performance by improving the (sub)level-set geometry, and we present computational results that show the potential of our methodology to improve the performance of PDHG in practice. Chapter 4 presents a closed-form expression for the iteration complexity of PDHG on LP instances with unique optima; the iteration bound has a reciprocal relationship with (i) stability under data perturbation, (ii) proximity to multiple optima, and (iii) LP sharpness. Chapter 5 considers the iteration complexity of PDHG on LP instances under a sub-Gaussian model of instance generation; in this model we show that PDHG is a polynomial-time algorithm with high probability, which partially shrinks the gap between theory and practice by showing that PDHG can solve "most" LP instances in polynomial time. Finally, Chapter 6 presents a practical PDHG-based large-scale conic optimization solver with GPU enhancements, together with computational experiments showing that the solver is more efficient than other first-order methods and commercial solvers on large-scale conic optimization problems.
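As background for the method the abstract describes, the basic PDHG iteration for a standard-form LP (min c^T x subject to Ax = b, x >= 0) alternates a projected primal gradient step with a dual ascent step on an extrapolated primal point. The following pure-Python sketch shows the vanilla iteration only, not the enhanced solver developed in the thesis; the step sizes `tau`, `sigma` and the toy instance are illustrative assumptions (PDHG converges when tau * sigma * ||A||^2 < 1).

```python
# Vanilla PDHG for the standard-form LP:  min c^T x  s.t.  Ax = b, x >= 0,
# viewed as the saddle-point problem  min_{x>=0} max_y  c^T x + y^T (b - Ax).
def pdhg_lp(A, b, c, tau, sigma, iters):
    m, n = len(A), len(c)
    x = [0.0] * n
    y = [0.0] * m
    for _ in range(iters):
        # Primal step: gradient step along -(c - A^T y), projected onto x >= 0.
        x_new = [max(0.0, x[j] - tau * (c[j] - sum(A[i][j] * y[i] for i in range(m))))
                 for j in range(n)]
        # Extrapolation of the primal iterate (the "hybrid" part of PDHG).
        x_bar = [2.0 * x_new[j] - x[j] for j in range(n)]
        # Dual step: gradient ascent on the equality residual b - A x_bar.
        y = [y[i] + sigma * (b[i] - sum(A[i][j] * x_bar[j] for j in range(n)))
             for i in range(m)]
        x = x_new
    return x, y

# Toy instance:  min x1 + 2*x2  s.t.  x1 + x2 = 1, x >= 0;  optimum is x* = (1, 0).
x, y = pdhg_lp([[1.0, 1.0]], [1.0], [1.0, 2.0], tau=0.4, sigma=0.4, iters=20000)
```

Each iteration costs only one multiplication by A and one by A^T, which is what makes the method amenable to GPUs and to the very large sparse instances discussed in the abstract.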
dc.publisher: Massachusetts Institute of Technology
dc.rights: Attribution-NonCommercial-NoDerivatives 4.0 International (CC BY-NC-ND 4.0)
dc.rights: Copyright retained by author(s)
dc.rights.uri: https://creativecommons.org/licenses/by-nc-nd/4.0/
dc.title: New Theory and New Practical Methods for Solving Large-Scale Linear and Conic Optimization
dc.type: Thesis
dc.description.degree: Ph.D.
dc.contributor.department: Massachusetts Institute of Technology. Operations Research Center
dc.contributor.department: Sloan School of Management
dc.identifier.orcid: https://orcid.org/0000-0003-3025-7846
mit.thesis.degree: Doctoral
thesis.degree.name: Doctor of Philosophy

