Advances in Nonconvex and Robust Optimization
Author(s)
Koukouvinos, Theodoros
Advisor
Bertsimas, Dimitris J.
Abstract
Nonconvex optimization presents significant challenges, as identifying the global optimum is often difficult. This thesis introduces novel algorithms for computing exact solutions to a broad class of nonconvex optimization problems. The thesis is structured into four parts. In Chapter 2, we propose a novel method for solving nonconvex optimization problems in which the nonconvex components are sums of linear times convex (SLC) functions. We introduce a new technique, the Reformulation-Perspectification Technique (RPT), which yields a convex relaxation of the original nonconvex problem by forming the perspective of each convex function and linearizing all product terms with newly introduced variables; to further tighten the relaxation, we pairwise multiply constraints. We then embed RPT within a branch-and-bound scheme to obtain the globally optimal solution of the nonconvex problem. In Chapter 3, we therefore analyze all possible pairwise multiplications of conic constraints, a very broad class of constraints, and delineate methods for deriving new valid linear and second-order cone inequalities for pairwise constraint multiplications involving the power cone and exponential cone, thereby strengthening the relaxation. In Chapter 4, we address nonconvex optimization problems involving polynomials. We derive valid SLC decompositions of polynomials in which the linear functions are inequalities defining the feasible region and the convex functions are quadratics, and we prove that such decompositions exist for polynomials of arbitrary degree. Among the many possible SLC decompositions, we identify the one that yields the tightest lower bound. Numerical experiments show that our method often outperforms state-of-the-art approaches for polynomial optimization. In Chapter 5, we propose a robust optimization framework that immunizes several central linear algebra problems against data uncertainty. Specifically, we formulate linear systems, matrix inversion, eigenvalue-eigenvector computation, and matrix factorization under uncertainty as robust optimization problems with appropriate descriptions of uncertainty. We show that, for both linear systems and matrix inversion, the robust approach yields more accurate solutions than the nominal one when the matrices are nearly singular.
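To illustrate the core relaxation step, the following sketch uses notation that is illustrative rather than taken from the thesis. Consider a single SLC term $\ell(x)\,f(x)$, with $\ell$ linear and nonnegative on the feasible set and $f$ convex. Introducing product variables $w$ to stand for $\ell(x)\,x$ and forming the perspective of $f$ gives

\[
\ell(x)\, f(x) \;=\; \ell(x)\, f\!\left(\frac{\ell(x)\,x}{\ell(x)}\right)
\;\longrightarrow\;
\ell(x)\, f\!\left(\frac{w}{\ell(x)}\right),
\qquad w \approx \ell(x)\,x,
\]

where the right-hand side is the perspective of $f$ evaluated at $(w, \ell(x))$ and hence jointly convex; the remaining bilinear coupling $w = \ell(x)\,x$ is relaxed by linearizing the product terms and by pairwise constraint multiplication, producing a convex relaxation whose optimal value lower-bounds the original problem. In the same spirit, one standard way to immunize a linear system $Ax = b$ against data uncertainty, as in Chapter 5 (the uncertainty set below is illustrative; the thesis uses its own descriptions of uncertainty), is the robust formulation

\[
\min_{x} \;\max_{\Delta A \in \mathcal{U}} \;\bigl\|(A + \Delta A)\,x - b\bigr\|_2,
\qquad
\mathcal{U} = \{\Delta A : \|\Delta A\|_F \le \rho\},
\]

which hedges the solution against worst-case perturbations of $A$ rather than solving the nominal system alone.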
Date issued
2025-09
Department
Massachusetts Institute of Technology. Operations Research Center; Sloan School of Management
Publisher
Massachusetts Institute of Technology