Lectures: 2 sessions / week, 1.5 hours / session
This course will focus on fundamental subjects in (deterministic) optimization, connected through the themes of convexity, geometric multipliers, and duality. The aim is to develop the core analytical and computational issues of continuous optimization, duality, and saddle point theory using a handful of unifying principles that can be easily visualized and readily understood.
The mathematical theory of convex sets and functions will be central, and will allow an intuitive, highly visual, geometrical approach to the subject. This theory will be developed in detail and in parallel with the optimization topics.
The first part of the course develops the analytical issues of convexity and duality. The second part is devoted to convex optimization algorithms, and their applications to a variety of large-scale optimization problems from resource allocation, machine learning, engineering design, and other areas.
Prerequisites: A course in linear algebra (such as 18.06) and a course in real analysis (such as 18.100A or 18.100B).
Basic Convexity Concepts (6 lectures): Review of linear algebra and real analysis. Convex sets and functions. Convex and affine hulls. Closure, relative interior, and continuity issues. Recession cones. Hyperplane separation. Conjugate functions.
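For reference, the two definitions at the center of this unit, in standard notation:

```latex
% A set C \subseteq \mathbb{R}^n is convex if it contains its line segments:
\alpha x + (1 - \alpha) y \in C
\quad \text{for all } x, y \in C,\ \alpha \in [0, 1].

% A function f : \mathbb{R}^n \to (-\infty, \infty] is convex if
f\bigl(\alpha x + (1 - \alpha) y\bigr) \le \alpha f(x) + (1 - \alpha) f(y)
\quad \text{for all } x, y,\ \alpha \in [0, 1].

% The conjugate of f, the subject of the unit's final topic:
f^\star(y) = \sup_{x \in \mathbb{R}^n} \bigl\{ x^\top y - f(x) \bigr\}.
```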
Convexity and Optimization (3 lectures): Global and local minima. Directions of recession and existence of optimal solutions. Elementary form of duality. Saddle points and minimax theory.
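The minimax theory of this unit centers on when the inequality below, which always holds for a function $\phi$ on $X \times Z$, is an equality:

```latex
\sup_{z \in Z} \inf_{x \in X} \phi(x, z)
\;\le\;
\inf_{x \in X} \sup_{z \in Z} \phi(x, z).

% Equality with attainment on both sides is equivalent to the existence
% of a saddle point (x^*, z^*):
\phi(x^*, z) \le \phi(x^*, z^*) \le \phi(x, z^*)
\quad \text{for all } x \in X,\ z \in Z.
```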
Geometric Duality Theory (3 lectures): Min common/max crossing framework. Nonlinear Farkas Lemma.
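In standard form, given a set $M \subseteq \mathbb{R}^{n+1}$, the framework compares the min common point of $M$ along the vertical axis with the max crossing point over nonvertical hyperplanes:

```latex
w^* = \inf_{(0, w) \in M} w,
\qquad
q^* = \sup_{\mu \in \mathbb{R}^n} q(\mu),
\quad \text{where } q(\mu) = \inf_{(u, w) \in M} \{ w + \mu^\top u \}.

% Weak duality q^* \le w^* always holds; the theory characterizes
% when q^* = w^* and when the supremum is attained.
```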
Lagrangian Duality (2 lectures): Linear and quadratic programming duality. Strong duality theorems.
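A minimal instance is the linear programming pair, here in standard inequality form:

```latex
\text{(P)} \quad \min_{x \ge 0} \; c^\top x \ \text{ s.t. } Ax \ge b,
\qquad
\text{(D)} \quad \max_{\mu \ge 0} \; b^\top \mu \ \text{ s.t. } A^\top \mu \le c.

% Weak duality: b^\top \mu \le c^\top x for every feasible pair (x, \mu).
% Strong duality: if either problem has a finite optimal value, the two
% optimal values are equal and attained.
```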
Fenchel Duality (1 lecture): Primal and dual Fenchel duality theorems. Monotropic programming.
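In one standard form, the primal and dual Fenchel pair reads:

```latex
\text{(P)} \quad \inf_{x \in \mathbb{R}^n} \; \{ f_1(x) + f_2(x) \},
\qquad
\text{(D)} \quad \sup_{\lambda \in \mathbb{R}^n} \; \{ -f_1^\star(\lambda) - f_2^\star(-\lambda) \},

% where f^\star(\lambda) = \sup_x \{ x^\top \lambda - f(x) \}. Weak duality
% is immediate from the definition of the conjugate; equality holds under
% a relative interior (constraint qualification) condition.
```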
Cone Programming, Semidefinite Programming (1 lecture): Conic duality. Semidefinite programming and applications.
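For a closed convex cone $K$ with dual cone $K^* = \{ y : y^\top x \ge 0 \text{ for all } x \in K \}$, one standard formulation of the conic primal/dual pair is:

```latex
\text{(P)} \quad \min_{x \in K} \; c^\top x \ \text{ s.t. } Ax = b,
\qquad
\text{(D)} \quad \max_{y} \; b^\top y \ \text{ s.t. } c - A^\top y \in K^*.

% Semidefinite programming is the special case where K is the (self-dual)
% cone of symmetric positive semidefinite matrices, with the trace
% inner product \langle X, Y \rangle = \mathrm{tr}(XY).
```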
Subgradients and Constrained Optimization Algorithms (5 lectures): Directional derivatives. Subgradients and subdifferentials. Subgradient and polyhedral approximation methods. Proximal algorithms.
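A minimal sketch of the basic projected subgradient iteration $x_{k+1} = P_X(x_k - \alpha_k g_k)$, applied to a made-up $\ell_1$-regularized least-squares instance (the problem data and stepsize rule are illustrative assumptions, not from the course):

```python
import numpy as np

def subgradient_method(f_and_subgrad, project, x0, steps=500):
    """Projected subgradient method with diminishing stepsize 1/(k+1).

    Tracks the best iterate seen, since the subgradient method is not
    a descent method: the objective can increase between iterations.
    """
    x, x_best, f_best = x0.copy(), x0.copy(), np.inf
    for k in range(steps):
        fx, g = f_and_subgrad(x)
        if fx < f_best:
            x_best, f_best = x.copy(), fx
        x = project(x - g / (k + 1))  # diminishing stepsize alpha_k = 1/(k+1)
    return x_best, f_best

# Hypothetical instance: min 0.5*||Ax - b||^2 + ||x||_1 over the box [-1, 1]^n.
rng = np.random.default_rng(0)
A, b = rng.standard_normal((20, 5)), rng.standard_normal(20)

def f_and_subgrad(x):
    r = A @ x - b
    # sign(x) is one valid subgradient of the l1 norm at x
    return 0.5 * r @ r + np.abs(x).sum(), A.T @ r + np.sign(x)

project = lambda x: np.clip(x, -1.0, 1.0)  # projection onto the box
x_best, f_best = subgradient_method(f_and_subgrad, project, np.zeros(5))
print(f_best)
```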
Gradient Projection, Incremental Methods, Optimal Algorithms (3 lectures): Incremental gradient, subgradient, and proximal methods. Complexity of algorithms for convex differentiable and nondifferentiable problems. Gradient methods with extrapolation, and Nesterov's method.
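A minimal sketch of the extrapolation idea in Nesterov's method for a smooth convex $f$ with $L$-Lipschitz gradient (the momentum rule $(k-1)/(k+2)$ is one common choice; the quadratic test problem is an illustrative assumption):

```python
import numpy as np

def nesterov(grad, L, x0, steps=200):
    """Nesterov's accelerated gradient method: take the gradient step at
    an extrapolated point y_k rather than at x_k. For smooth convex f
    this achieves the optimal O(1/k^2) rate, vs. O(1/k) for plain
    gradient descent."""
    x, x_prev = x0.copy(), x0.copy()
    for k in range(1, steps + 1):
        y = x + (k - 1) / (k + 2) * (x - x_prev)  # extrapolation
        x_prev, x = x, y - grad(y) / L            # gradient step at y
    return x

# Illustrative quadratic: f(x) = 0.5*x'Qx - b'x, with grad f(x) = Qx - b.
Q = np.diag([1.0, 10.0, 100.0])
b = np.ones(3)
x = nesterov(lambda v: Q @ v - b, L=100.0, x0=np.zeros(3))
print(x)  # approaches the minimizer Q^{-1} b = [1, 0.1, 0.01]
```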
Bertsekas, Dimitri. *Convex Optimization Theory*. Athena Scientific, 2009. ISBN: 9781886529311. Supplementary materials, including Chapter 6 (Convex Optimization Algorithms), are also available.
A summary of theoretical concepts and results from the textbook is provided as a PDF. (Courtesy of Athena Scientific. Used with permission.)
ACTIVITIES | PERCENTAGES
--- | ---
Homework | 25%
Midterm | 25%
Term paper | 50%