DSpace@MIT


Advances in Nonconvex and Robust Optimization

Author(s)
Koukouvinos, Theodoros
Download: Thesis PDF (1.232 MB)
Advisor
Bertsimas, Dimitris J.
Terms of use
In Copyright - Educational Use Permitted Copyright retained by author(s) https://rightsstatements.org/page/InC-EDU/1.0/
Abstract
Nonconvex optimization presents significant challenges, as identifying the global optimum is often difficult. This thesis introduces algorithms that find the exact solution of a broad class of nonconvex optimization problems. It is structured into four parts.

In Chapter 2, we propose a novel method for solving nonconvex optimization problems in which the nonconvex components are sums of linear times convex (SLC) functions. We introduce a new technique, the Reformulation-Perspectification Technique (RPT), which yields a convex relaxation of the original nonconvex problem by forming the perspective of each convex function and linearizing all product terms with newly introduced variables. We then embed RPT within branch and bound to obtain the global optimal solution of the nonconvex problem.

To further tighten the relaxation, we pairwise multiply constraints. In Chapter 3, we therefore analyze all possibilities of multiplying conic constraints, a very wide class of constraints. Further, we delineate methods for deriving new, valid linear and second-order cone inequalities from pairwise constraint multiplications involving the power cone and the exponential cone, thereby strengthening the approximation.

In Chapter 4, we address nonconvex optimization problems that involve polynomials. We derive valid SLC decompositions of polynomials in which the linear functions are inequalities of the feasible region and the convex functions are quadratics, and we prove that such SLC decompositions exist for polynomials of arbitrary degree. Among the many possible SLC decompositions, we select the one that yields the tightest lower bound. Finally, numerical experiments show that our method often outperforms state-of-the-art approaches for polynomial optimization.
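The product-linearization step at the heart of RPT can be illustrated, in minimal form, by the classical McCormick relaxation it generalizes: replace a bilinear term with a new variable constrained by valid inequalities, and solve the resulting convex (here, linear) program for a lower bound. The sketch below is an illustrative toy using this textbook device, not the thesis's algorithm, which handles general SLC structure.

```python
# Toy illustration of linearizing a product term with a new variable,
# via the classical McCormick envelope of w = x*y on the box [0,1]^2.
# (RPT generalizes this idea to sums of linear-times-convex functions.)
import numpy as np
from scipy.optimize import linprog

# Nonconvex problem: minimize x*y - x - y subject to 0 <= x, y <= 1.
# Introduce w in place of x*y and impose the McCormick inequalities,
# valid for every feasible (x, y):
#   w >= 0,  w >= x + y - 1,  w <= x,  w <= y.
# Variables ordered (x, y, w); LP objective is w - x - y.
c = np.array([-1.0, -1.0, 1.0])
A_ub = np.array([
    [ 0.0,  0.0, -1.0],   # -w <= 0          i.e.  w >= 0
    [ 1.0,  1.0, -1.0],   #  x + y - w <= 1  i.e.  w >= x + y - 1
    [ 0.0, -1.0,  1.0],   #  w - y <= 0      i.e.  w <= y
    [-1.0,  0.0,  1.0],   #  w - x <= 0      i.e.  w <= x
])
b_ub = np.array([0.0, 1.0, 0.0, 0.0])
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0.0, 1.0)] * 3)

# Brute-force the original nonconvex objective on a grid for comparison.
xs = np.linspace(0.0, 1.0, 201)
X, Y = np.meshgrid(xs, xs)
true_min = float((X * Y - X - Y).min())

print(f"McCormick lower bound:  {res.fun:.4f}")   # -1.0000
print(f"true nonconvex minimum: {true_min:.4f}")  # -1.0000
```

On this particular box-constrained bilinear problem the relaxation happens to be tight (both values equal -1); in general the LP value is only a lower bound, which is why a branch-and-bound scheme is layered on top.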
In Chapter 5, we propose a robust optimization framework that immunizes central linear algebra problems against data uncertainty. Specifically, we formulate linear systems, matrix inversion, eigenvalue-eigenvector computation, and matrix factorization under uncertainty as robust optimization problems, using appropriate descriptions of the uncertainty. We show that for both linear systems and matrix inversion, the robust approach yields more accurate solutions than the nominal one when the matrices are nearly singular.
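The benefit of a robust formulation for nearly singular systems can be sketched numerically. The thesis develops its own uncertainty descriptions; the toy below instead uses the classical worst-case-residual counterpart min_x ||Ax - b|| + rho||x||, approximated here by Tikhonov regularization, purely to show the qualitative effect. All sizes, seeds, and the regularization weight are illustrative assumptions.

```python
# Hedged sketch: nominal vs. robust(-style) solution of a nearly
# singular linear system observed with data uncertainty. This is NOT
# the thesis's formulation; it uses plain Tikhonov regularization as
# a stand-in for a robust counterpart.
import numpy as np

rng = np.random.default_rng(0)
n = 50

# Nearly singular matrix: singular values from 1 down to 1e-8.
U, _ = np.linalg.qr(rng.standard_normal((n, n)))
V, _ = np.linalg.qr(rng.standard_normal((n, n)))
s = np.logspace(0, -8, n)
A = U @ np.diag(s) @ V.T

# Ground truth lying in the well-conditioned directions, so it is
# recoverable in principle despite the tiny singular values.
x_true = V[:, :10] @ rng.standard_normal(10)
b = A @ x_true

# We only observe a perturbed matrix (data uncertainty).
A_obs = A + 1e-6 * rng.standard_normal((n, n))

# Nominal approach: solve the perturbed system directly.
x_nominal = np.linalg.solve(A_obs, b)

# Robust-style approach: regularized normal equations.
rho = 1e-6
x_robust = np.linalg.solve(A_obs.T @ A_obs + rho * np.eye(n), A_obs.T @ b)

nominal_err = np.linalg.norm(x_nominal - x_true)
robust_err = np.linalg.norm(x_robust - x_true)
print("nominal error:", nominal_err)  # large: noise amplified by near-singularity
print("robust error: ", robust_err)   # small: ill-conditioned directions damped
```

The nominal solve amplifies the matrix perturbation through the near-zero singular values, while the regularized solve damps those directions and recovers the well-conditioned components of the solution, mirroring the accuracy gap reported in the abstract.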
Date issued
2025-09
URI
https://hdl.handle.net/1721.1/164477
Department
Massachusetts Institute of Technology. Operations Research Center; Sloan School of Management
Publisher
Massachusetts Institute of Technology

Collections
  • Doctoral Theses

Content created by the MIT Libraries, CC BY-NC unless otherwise noted. Notify us about copyright concerns.