
dc.contributor.advisor  Dimitris J. Bertsimas.  en_US
dc.contributor.author  Fertis, Apostolos  en_US
dc.contributor.other  Massachusetts Institute of Technology. Dept. of Electrical Engineering and Computer Science.  en_US
dc.date.accessioned  2010-03-25T15:23:48Z
dc.date.available  2010-03-25T15:23:48Z
dc.date.copyright  2009  en_US
dc.date.issued  2009  en_US
dc.identifier.uri  http://hdl.handle.net/1721.1/53270
dc.description  Thesis (Ph. D.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2009.  en_US
dc.description  Cataloged from PDF version of thesis.  en_US
dc.description  Includes bibliographical references (p. 87-91).  en_US
dc.description.abstract  There have long been intuitive connections between robustness and regularization in statistical estimation, for example in lasso and support vector machines. In the first part of the thesis, we formalize these connections using robust optimization. Specifically: (a) We show that in classical regression, regularized estimators such as lasso can be derived by applying robust optimization to the classical least squares problem. We make explicit the connection between the size and structure of the uncertainty set used in the robust estimator and the coefficient and type of norm used in the regularization. We compare the out-of-sample performance of the nominal and the robust estimators on computer-generated and real data. (b) We prove that the support vector machines estimator is also a robust estimator of a nominal classification estimator (this fact was also observed independently and simultaneously by Xu, Caramanis, and Mannor [52]). We generalize the support vector machines estimator by considering several sizes and structures for the uncertainty sets and proving that the respective max-min optimization problems can be expressed as regularization problems. In the second part of the thesis, we turn our attention to constructing robust maximum likelihood estimators. Specifically: (a) We define robust estimators for the logistic regression model, taking into consideration uncertainty in the independent variables, in the response variable, and in both. We consider several structures for the uncertainty sets and prove that, in all cases, they lead to convex optimization problems. We provide efficient algorithms to compute the estimates in all cases.  en_US
dc.description.abstract  (cont.) We report on the out-of-sample performance of both the robust and the nominal estimators on computer-generated and real data sets, and conclude that the robust estimators achieve a higher success rate. (b) We develop a robust maximum likelihood estimator for the multivariate normal distribution by considering uncertainty sets for the data used to produce it. We develop an efficient first-order gradient descent method to compute the estimate and compare the efficiency of the robust estimate with that of the nominal one on computer-generated data.  en_US
dc.format.extent  91 p.  en_US
dc.language.iso  eng  en_US
dc.publisher  Massachusetts Institute of Technology  en_US
dc.rights  M.I.T. theses are protected by copyright. They may be viewed from this source for any purpose, but reproduction or distribution in any format is prohibited without written permission. See provided URL for inquiries about permission.  en_US
dc.rights.uri  http://dspace.mit.edu/handle/1721.1/7582  en_US
dc.subject  Electrical Engineering and Computer Science.  en_US
dc.title  A robust optimization approach to statistical estimation problems by Apostolos G. Fertis.  en_US
dc.type  Thesis  en_US
dc.description.degree  Ph.D.  en_US
dc.contributor.department  Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
dc.identifier.oclc  547116970  en_US
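
The robustness-regularization equivalence described in the first part of the abstract can be illustrated by a representative identity from the robust-regression/lasso literature. This is only a sketch of the kind of result the abstract refers to; the uncertainty set U and the column budgets c_j below are chosen here for illustration, and the thesis specifies its own sets and norms:

    \min_{\beta}\ \max_{\Delta \in \mathcal{U}} \bigl\| y - (X + \Delta)\beta \bigr\|_2
      \;=\; \min_{\beta}\ \| y - X\beta \|_2 \;+\; \sum_{j=1}^{d} c_j\, |\beta_j|,
    \qquad
    \mathcal{U} \;=\; \bigl\{ \Delta = [\delta_1, \dots, \delta_d] \;:\; \|\delta_j\|_2 \le c_j \bigr\}.

With all c_j equal to a single lambda, the right-hand side is an l1-regularized (lasso-type) least squares objective: the size of the uncertainty set fixes the regularization coefficient, and its column-wise structure fixes the choice of the l1 norm.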
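
For the robust logistic regression estimators described in the second part of the abstract, the following is a minimal sketch, not the thesis's algorithm. It assumes row-wise l2 uncertainty of radius rho in the independent variables only, under which the worst-case logistic loss for each sample has the closed form log(1 + exp(-y_i (w^T x_i + b) + rho * ||w||_2)), which is convex in (w, b); X, y, and rho are illustrative placeholders.

    import cvxpy as cp
    import numpy as np

    # Illustrative data: n samples, d features, labels in {-1, +1}.
    rng = np.random.default_rng(0)
    n, d = 200, 5
    X = rng.standard_normal((n, d))
    y = np.sign(X @ rng.standard_normal(d) + 0.1 * rng.standard_normal(n))

    rho = 0.5  # assumed radius of the row-wise l2 uncertainty set

    w = cp.Variable(d)
    b = cp.Variable()

    # Worst-case margin under ||delta_i||_2 <= rho is the nominal margin minus
    # rho * ||w||_2, so the robust negative log-likelihood is a sum of logistic
    # terms applied to a convex argument (a DCP-compliant convex problem).
    margins = cp.multiply(y, X @ w + b)
    robust_loss = cp.sum(cp.logistic(-margins + rho * cp.norm(w, 2)))

    cp.Problem(cp.Minimize(robust_loss)).solve()
    print("robust estimate:", w.value, b.value)

Setting rho = 0 recovers the nominal maximum likelihood logistic regression, so the same script can be used to compare the nominal and robust fits on held-out data.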

