dc.contributor.advisor | Dimitris Bertsimas. | en_US |
dc.contributor.author | Sturt, Bradley Eli. | en_US |
dc.contributor.other | Massachusetts Institute of Technology. Operations Research Center. | en_US |
dc.date.accessioned | 2020-09-15T21:50:35Z | |
dc.date.available | 2020-09-15T21:50:35Z | |
dc.date.copyright | 2020 | en_US |
dc.date.issued | 2020 | en_US |
dc.identifier.uri | https://hdl.handle.net/1721.1/127292 | |
dc.description | Thesis: Ph. D., Massachusetts Institute of Technology, Sloan School of Management, Operations Research Center, May, 2020 | en_US |
dc.description | Cataloged from the official PDF of thesis. | en_US |
dc.description | Includes bibliographical references (pages 241-249). | en_US |
dc.description.abstract | This thesis revisits a fundamental class of dynamic optimization problems introduced by Dantzig (1955). These decision problems remain widely studied in many application domains (e.g., inventory management, finance, energy planning) but require access to probability distributions that are rarely known in practice. First, we propose a new data-driven approach for addressing multi-stage stochastic linear optimization problems with unknown probability distributions. The approach consists of solving a robust optimization problem that is constructed from sample paths of the underlying stochastic process. As more sample paths are obtained, we prove that the optimal cost of the robust problem converges to that of the underlying stochastic problem. To the best of our knowledge, this is the first data-driven approach for multi-stage stochastic linear optimization problems that is asymptotically optimal when uncertainty is arbitrarily correlated across time. | en_US |
dc.description.abstract | Next, we develop approximation algorithms for the proposed data-driven approach by extending techniques from the field of robust optimization. In particular, we present a simple approximation algorithm, based on overlapping linear decision rules, which can be reformulated as a tractable linear optimization problem whose size scales linearly with the number of data points. For two-stage problems, we show that the approximation algorithm is also asymptotically optimal, meaning that its optimal cost converges to that of the underlying stochastic problem as the number of data points tends to infinity. Finally, we extend the proposed data-driven approach to address multi-stage stochastic linear optimization problems with side information. The approach combines predictive machine learning methods (such as K-nearest neighbors, kernel regression, and random forests) with the proposed robust optimization framework. | en_US |
dc.description.abstract | We prove that this machine learning-based approach is asymptotically optimal, and demonstrate the value of the proposed methodology in numerical experiments in the context of inventory management, scheduling, and finance. | en_US |
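To give a concrete sense of the sample-path-based robust formulation described in the abstract, the following is a minimal illustrative sketch, not drawn from the thesis itself: a hypothetical single-item newsvendor in which each observed demand sample is inflated to an interval of assumed radius eps, and the average worst-case cost over those intervals is minimized. All costs, data, and parameter values are assumptions; because the second-stage cost is convex in demand, the worst case over each interval is attained at an endpoint, so endpoints are enumerated directly.

    # Illustrative sketch only (not the thesis code): data-driven robust newsvendor.
    import cvxpy as cp
    import numpy as np

    rng = np.random.default_rng(0)
    demands = rng.normal(100, 20, size=30)   # observed demand samples (synthetic, assumed)
    eps = 5.0                                # robustness radius around each sample (assumed)
    c, b, h = 1.0, 4.0, 0.5                  # order, backorder, holding costs (assumed)

    x = cp.Variable(nonneg=True)             # first-stage order quantity
    worst = []                               # worst-case second-stage cost for each sample's interval
    for d in demands:
        costs = [b * cp.pos(dd - x) + h * cp.pos(x - dd) for dd in (d - eps, d + eps)]
        worst.append(cp.maximum(*costs))     # worst case over the interval (endpoints suffice by convexity)

    # Average the worst-case costs across samples, as in a sample-based robust objective.
    objective = c * x + cp.sum(cp.hstack(worst)) / len(demands)
    prob = cp.Problem(cp.Minimize(objective))
    prob.solve()
    print(f"robust order quantity: {x.value:.1f}, estimated cost: {prob.value:.2f}")

Setting eps to zero reduces this sketch to the ordinary sample-average problem, while larger eps hedges against demands near, but not equal to, the observed samples.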
dc.description.statementofresponsibility | by Bradley Eli Sturt. | en_US |
dc.format.extent | 249 pages | en_US |
dc.language.iso | eng | en_US |
dc.publisher | Massachusetts Institute of Technology | en_US |
dc.rights | MIT theses may be protected by copyright. Please reuse MIT thesis content according to the MIT Libraries Permissions Policy, which is available through the URL provided. | en_US |
dc.rights.uri | http://dspace.mit.edu/handle/1721.1/7582 | en_US |
dc.subject | Operations Research Center. | en_US |
dc.title | Dynamic optimization in the age of big data | en_US |
dc.type | Thesis | en_US |
dc.description.degree | Ph. D. | en_US |
dc.contributor.department | Massachusetts Institute of Technology. Operations Research Center | en_US |
dc.contributor.department | Sloan School of Management | |
dc.identifier.oclc | 1191900773 | en_US |
dc.description.collection | Ph.D. Massachusetts Institute of Technology, Sloan School of Management, Operations Research Center | en_US |
dspace.imported | 2020-09-15T21:50:34Z | en_US |
mit.thesis.degree | Doctoral | en_US |
mit.thesis.department | Sloan | en_US |
mit.thesis.department | OperRes | en_US |