Pseudonormality and a Lagrange multiplier theory for constrained optimization
Author(s): Ozdaglar, Asuman E.
Massachusetts Institute of Technology. Dept. of Electrical Engineering and Computer Science.
Advisor: Dimitri P. Bertsekas.
Lagrange multipliers are central to analytical and computational studies in linear and nonlinear optimization and have applications in a wide variety of fields, including communication, networking, economics, and manufacturing. In the past, the main research in Lagrange multiplier theory has focused on developing general and easily verifiable conditions on the constraint set, called constraint qualifications, that guarantee the existence of Lagrange multipliers for the optimization problem of interest. In this thesis, we present a new development of Lagrange multiplier theory that differs significantly from the classical treatments. Our objective is to generalize, unify, and streamline the theory of constraint qualifications. As a starting point, we derive an enhanced set of necessary optimality conditions of the Fritz John type, which are stronger than the classical Karush-Kuhn-Tucker conditions. They are also more general in that they apply even when there is a possibly nonconvex abstract set constraint, in addition to smooth equality and inequality constraints. These optimality conditions motivate the introduction of a new condition, called pseudonormality, which emerges as central within the taxonomy of significant characteristics of a constraint set. In particular, pseudonormality unifies and extends the major constraint qualifications. In addition, pseudonormality provides the connecting link between constraint qualifications and exact penalty functions. Our analysis also yields identification of different types of Lagrange multipliers. Under some convexity assumptions, we show that there exists a special Lagrange multiplier vector, called informative, which carries significant sensitivity information regarding the constraints that directly affect the optimal cost change.

In the second part of the thesis, we extend the theory to nonsmooth problems under convexity assumptions.
We introduce another notion of multiplier, called geometric, that is not tied to a specific optimal solution and does not require differentiability of the cost and constraint functions. Using a line of development based on convex analysis, we develop Fritz John-type optimality conditions for problems that do not necessarily have optimal solutions. Through an extended notion of constraint pseudonormality, this development provides an alternative pathway to strong duality results of convex programming. We also introduce special geometric multipliers that carry sensitivity information and show their existence under very general conditions.
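For orientation, the classical Fritz John conditions that the thesis strengthens can be sketched as follows; this is the standard textbook form for a smooth problem without an abstract set constraint, not the enhanced version developed in the thesis. For the problem

```latex
\min_{x \in \mathbb{R}^n} \; f(x)
\quad \text{subject to} \quad
h_i(x) = 0, \; i = 1,\dots,m,
\qquad
g_j(x) \le 0, \; j = 1,\dots,r,
```

the Fritz John conditions assert that at a local minimum $x^*$ there exist scalars $\mu_0 \ge 0$, $\lambda_1,\dots,\lambda_m$, and $\mu_1,\dots,\mu_r \ge 0$, not all zero, such that

```latex
\mu_0 \nabla f(x^*)
+ \sum_{i=1}^{m} \lambda_i \nabla h_i(x^*)
+ \sum_{j=1}^{r} \mu_j \nabla g_j(x^*) = 0,
\qquad
\mu_j = 0 \ \text{whenever} \ g_j(x^*) < 0.
```

When a constraint qualification guarantees $\mu_0 > 0$, dividing through by $\mu_0$ yields Karush-Kuhn-Tucker multipliers; the thesis's enhanced conditions impose additional requirements on the multipliers, which is what makes pseudonormality a sharper organizing condition than the classical qualifications.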
Thesis (Ph. D.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2003. Includes bibliographical references (leaves 211-213).