dc.contributor.advisor | Kalyan Veeramachaneni. | en_US |
dc.contributor.author | Anderson, Alec W. (Alec Wayne) | en_US |
dc.contributor.other | Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science. | en_US |
dc.date.accessioned | 2018-12-11T20:38:06Z | |
dc.date.available | 2018-12-11T20:38:06Z | |
dc.date.copyright | 2017 | en_US |
dc.date.issued | 2017 | en_US |
dc.identifier.uri | http://hdl.handle.net/1721.1/119509 | |
dc.description | Thesis: M. Eng., Massachusetts Institute of Technology, Department of Electrical Engineering and Computer Science, 2017. | en_US |
dc.description | This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections. | en_US |
dc.description | Cataloged from student-submitted PDF version of thesis. | en_US |
dc.description | Includes bibliographical references (pages 105-108). | en_US |
dc.description.abstract | Within the automated machine learning movement, hyperparameter optimization has emerged as a particular focus. Researchers have introduced various search algorithms and open-source systems to automatically explore the hyperparameter space of machine learning methods. While these approaches have been effective, they also display significant shortcomings that limit their applicability to realistic data science pipelines and datasets. In this thesis, we propose an alternative theoretical and implementation approach by incorporating sampling techniques and building an end-to-end automation system, Deep Mining. We explore the application of the Bag of Little Bootstraps to the scoring statistics of pipelines, describe substantial asymptotic complexity improvements from its use, and empirically demonstrate its suitability for machine learning applications. The Deep Mining system combines a standardized approach to pipeline composition, a parallelized system for pipeline computation, and clear abstractions for incorporating realistic datasets and methods to provide hyperparameter optimization at scale. | en_US |
dc.description.statementofresponsibility | by Alec W. Anderson. | en_US |
dc.format.extent | 108 pages | en_US |
dc.language.iso | eng | en_US |
dc.publisher | Massachusetts Institute of Technology | en_US |
dc.rights | MIT theses are protected by copyright. They may be viewed, downloaded, or printed from this source but further reproduction or distribution in any format is prohibited without written permission. | en_US |
dc.rights.uri | http://dspace.mit.edu/handle/1721.1/7582 | en_US |
dc.subject | Electrical Engineering and Computer Science. | en_US |
dc.title | Deep Mining: scaling Bayesian auto-tuning of data science pipelines | en_US |
dc.title.alternative | Scaling Bayesian auto-tuning of data science pipelines | en_US |
dc.type | Thesis | en_US |
dc.description.degree | M. Eng. | en_US |
dc.contributor.department | Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science | |
dc.identifier.oclc | 1066344216 | en_US |