Flows, Submodularity, Sparsity, and Beyond: Continuous Optimization Insights for Discrete Problems
Author(s)
Axiotis, Kyriakos
Advisor
Mądry, Aleksander
Abstract
In this thesis we build on connections between discrete and continuous optimization. In the first part of the thesis, we propose faster second-order convex optimization algorithms for classical graph problems. Our main contribution is to show that the runtime of interior point methods is closely tied to spectral connectivity notions in the underlying graph, such as electrical conductance and effective resistance. We explore these connections along two orthogonal directions: making manual interventions to the graph to improve connectivity, or tracking connectivity to enable faster updates. These ideas lead to the first runtime improvement for the minimum cost flow problem in more than ten years, as well as faster algorithms for problems such as negative-weight shortest paths and minimum cost perfect matching.
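The effective resistance mentioned above can be illustrated with a minimal sketch: for a graph with Laplacian L, the effective resistance between nodes u and v is the quadratic form of the Laplacian pseudoinverse on the indicator difference vector. This is an assumption-laden toy computation (dense pseudoinverse), not the fast Laplacian solvers the thesis builds on.

```python
import numpy as np

def effective_resistance(L, u, v):
    """Effective resistance between nodes u and v of a graph
    with Laplacian L, via (e_u - e_v)^T L^+ (e_u - e_v).

    Illustration only: a dense pseudoinverse, not the nearly-linear-time
    Laplacian solvers used in fast interior point methods.
    """
    chi = np.zeros(L.shape[0])
    chi[u], chi[v] = 1.0, -1.0
    return chi @ np.linalg.pinv(L) @ chi

# Triangle with unit-weight edges: each pair is a 1-ohm edge in
# parallel with a 2-ohm two-edge path, giving resistance 2/3.
L = np.array([[ 2., -1., -1.],
              [-1.,  2., -1.],
              [-1., -1.,  2.]])
```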
In the second part of the thesis, we investigate efficient optimization algorithms for machine learning problems with a discrete element, such as sparse or low rank structure. We introduce a new technique, called adaptive regularization, which eliminates the sparsity performance degradation caused by ℓ₂ projections onto structured non-convex domains, like the set of sparse vectors or low rank matrices. This improves the sparsity guarantee of iterative hard thresholding (IHT), one of the best-known sparse optimization algorithms.
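For context, the baseline IHT algorithm that the abstract refers to alternates a gradient step with a hard-thresholding projection onto k-sparse vectors. The following is a minimal sketch of vanilla IHT for sparse least squares (the step size rule and iteration count here are illustrative assumptions, not the thesis's adaptive-regularization variant).

```python
import numpy as np

def iht(A, b, k, iters=200):
    """Vanilla iterative hard thresholding for
    min ||Ax - b||^2 subject to ||x||_0 <= k.

    Sketch only: uses a conservative fixed step size and keeps the
    k largest-magnitude entries after each gradient step.
    """
    n = A.shape[1]
    step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1 / ||A||_2^2, a safe step
    x = np.zeros(n)
    for _ in range(iters):
        x = x - step * A.T @ (A @ x - b)     # gradient step
        keep = np.argsort(np.abs(x))[-k:]    # k largest-magnitude entries
        mask = np.zeros(n, dtype=bool)
        mask[keep] = True
        x[~mask] = 0.0                       # hard-thresholding projection
    return x
```

The ℓ₂ projection here (zeroing all but the top-k entries) is exactly the non-convex projection whose performance degradation the adaptive regularization technique is designed to eliminate.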
Date issued
2022-09
Department
Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
Publisher
Massachusetts Institute of Technology