
dc.contributor.advisor  Andrew Lo  en_US
dc.contributor.author  Nguyen, Tri-Dung, Ph. D. Massachusetts Institute of Technology  en_US
dc.contributor.other  Massachusetts Institute of Technology. Operations Research Center.  en_US
dc.date.accessioned  2010-03-24T20:38:51Z
dc.date.available  2010-03-24T20:38:51Z
dc.date.copyright  2009  en_US
dc.date.issued  2009  en_US
dc.identifier.uri  http://hdl.handle.net/1721.1/52800
dc.description  Thesis (Ph. D.)--Massachusetts Institute of Technology, Sloan School of Management, Operations Research Center, 2009.  en_US
dc.description  This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.  en_US
dc.description  Cataloged from student-submitted PDF version of thesis.  en_US
dc.description  Includes bibliographical references (p. 108-112).  en_US
dc.description.abstract  Classical methods of maximum likelihood and least squares rely a great deal on the correctness of the model assumptions. Since these assumptions are only approximations of reality, many robust statistical methods have been developed to produce estimators that are robust against deviations from the model assumptions. Unfortunately, these techniques have very high computational complexity that prevents their application to large-scale problems. We present computationally efficient methods for robust mean-covariance estimation and robust linear regression using special mathematical programming models and semi-definite programming (SDP). In the robust covariance estimation problem, we design an optimization model with a loss function on the weighted Mahalanobis distances and show that the problem is equivalent to a system of equations and can be solved using the Newton-Raphson method. The problem can also be transformed into an SDP problem from which we can flexibly incorporate prior beliefs into the estimators without much increase in the computational complexity. The robust regression problem is often formulated as the least trimmed squares (LTS) regression problem where we want to find the best subset of observations with the smallest sum of squared residuals. We show the LTS problem is equivalent to a concave minimization problem, which is very hard to solve. We resolve this difficulty by introducing the "maximum trimmed squares" problem that finds the worst subset of observations. This problem can be transformed into an SDP problem that can be solved efficiently while still guaranteeing that we can identify outliers.  en_US
dc.description.abstract  (cont.) In addition, we model the robust ranking problem as a mixed integer minimax problem where the ranking is in a discrete uncertainty set. We use mixed integer programming methods, specifically column generation and network flows, to solve the robust ranking problem. To illustrate the power of these robust methods, we apply them to the mean-variance portfolio optimization problem in order to incorporate estimation errors into the model.  en_US
dc.description.statementofresponsibility  by Tri-Dung Nguyen.  en_US
dc.format.extent  112 p.  en_US
dc.language.iso  eng  en_US
dc.publisher  Massachusetts Institute of Technology  en_US
dc.rights  M.I.T. theses are protected by copyright. They may be viewed from this source for any purpose, but reproduction or distribution in any format is prohibited without written permission. See provided URL for inquiries about permission.  en_US
dc.rights.uri  http://dspace.mit.edu/handle/1721.1/7582  en_US
dc.subject  Operations Research Center.  en_US
dc.title  Robust estimation, regression and ranking with applications in portfolio optimization  en_US
dc.type  Thesis  en_US
dc.description.degree  Ph.D.  en_US
dc.contributor.department  Massachusetts Institute of Technology. Operations Research Center
dc.contributor.department  Sloan School of Management
dc.identifier.oclc  549097897  en_US
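
The abstract above describes estimators built on weighted Mahalanobis distances and their use in mean-variance portfolio selection. The following Python fragment is a minimal, illustrative sketch only, not the estimator or SDP formulation developed in the thesis: it down-weights observations by their Mahalanobis distances (a standard Huber-type reweighting) to obtain a robust mean and covariance, then plugs the covariance into a closed-form minimum-variance portfolio. The function names, chi-square cutoff, and iteration limits are assumptions made for this example.

# Illustrative sketch only; not the thesis's algorithm.
import numpy as np
from scipy.stats import chi2


def robust_mean_cov(X, max_iter=50, tol=1e-8):
    """Down-weight points with large Mahalanobis distances and iterate."""
    p = X.shape[1]
    mu = X.mean(axis=0)
    S = np.cov(X, rowvar=False)
    cutoff = chi2.ppf(0.975, df=p)  # assumed outlyingness threshold
    for _ in range(max_iter):
        diff = X - mu
        d2 = np.einsum('ij,jk,ik->i', diff, np.linalg.inv(S), diff)
        w = np.where(d2 <= cutoff, 1.0, cutoff / d2)  # Huber-type weights
        mu_new = (w[:, None] * X).sum(axis=0) / w.sum()
        diff = X - mu_new
        S_new = (w[:, None] * diff).T @ diff / w.sum()
        converged = np.linalg.norm(mu_new - mu) < tol
        mu, S = mu_new, S_new
        if converged:
            break
    return mu, S


def min_variance_weights(S):
    """Closed-form minimum-variance weights subject to sum(w) = 1."""
    ones = np.ones(S.shape[0])
    x = np.linalg.solve(S, ones)
    return x / (ones @ x)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    R = rng.multivariate_normal(np.zeros(3), np.eye(3), size=500)
    R[:10] += 8.0  # inject a few gross outliers
    mu_hat, S_hat = robust_mean_cov(R)
    print(min_variance_weights(S_hat))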

