
dc.contributor.author: Kashimura, Takuya [en_US]
dc.contributor.other: Massachusetts Institute of Technology. Engineering Systems Division [en_US]
dc.contributor.other: System Design and Management Program [en_US]
dc.date.accessioned: 2022-08-31T16:29:22Z
dc.date.available: 2022-08-31T16:29:22Z
dc.date.copyright: 2020 [en_US]
dc.date.issued: 2020 [en_US]
dc.identifier.uri: https://hdl.handle.net/1721.1/145230
dc.description: Thesis: S.M. in Engineering and Management, Massachusetts Institute of Technology, Engineering Systems Division, System Design and Management Program, 2020 [en_US]
dc.description: Cataloged from PDF version of thesis. [en_US]
dc.description: Includes bibliographical references (pages 69-72). [en_US]
dc.description.abstract: In this thesis, we study uncertainty in machine learning and deep learning from a mathematical point of view. Uncertainty arises in many real-world situations, and Bayesian modelling is the standard way the machine learning community handles it; traditional deep learning models, however, provide no uncertainty estimates for their outputs. Recently, at the intersection of Bayesian modelling and deep learning, a framework called Bayesian deep learning (BDL) has been proposed and studied, which enables us to estimate the uncertainty of deep learning models. As an example, we review the results of Yarin Gal, in which the well-known dropout method can be interpreted as Bayesian modelling. We also examine an overfitting problem of this framework caused by a property of the KL divergence, and review a modified algorithm using the α-divergence, which generalizes the KL divergence. We further study confidence bands for assessing the uncertainty of a kernel ridge regression estimator. We formulate the computation of a confidence band as a convex optimization problem, which allows the use of existing algorithms such as the primal-dual interior point method. The proposed method produces a confidence band that is more accurate and faster to compute than one obtained from a bootstrap algorithm. We demonstrate its effectiveness both on a function approximation task and on an actual dataset. [en_US]
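The dropout-as-Bayesian-modelling result the abstract attributes to Yarin Gal is usually applied as "MC dropout": keep dropout active at test time and average many stochastic forward passes. A minimal NumPy sketch of that idea follows; the network, weights, and layer sizes are invented for illustration and are not taken from the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy one-hidden-layer network with fixed random weights (illustrative only).
W1 = rng.normal(size=(1, 50))
b1 = rng.normal(size=50)
W2 = rng.normal(size=(50, 1))

def stochastic_forward(x, p=0.5):
    """One forward pass with dropout kept ON at test time (MC dropout)."""
    h = np.maximum(x @ W1 + b1, 0.0)   # ReLU hidden layer
    mask = rng.random(h.shape) > p     # Bernoulli dropout mask
    h = h * mask / (1.0 - p)           # inverted-dropout scaling
    return h @ W2

x = np.array([[0.3]])
T = 200                                 # number of stochastic passes
samples = np.stack([stochastic_forward(x) for _ in range(T)])

pred_mean = samples.mean(axis=0)        # predictive mean
pred_std = samples.std(axis=0)          # predictive uncertainty
```

The spread of the `T` sampled outputs (`pred_std`) is the model's uncertainty estimate, which a standard deterministic forward pass cannot provide.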
dc.description.statementofresponsibility: by Takuya Kashimura [en_US]
dc.format.extent: 72 pages [en_US]
dc.language.iso: eng [en_US]
dc.publisher: Massachusetts Institute of Technology [en_US]
dc.rights: MIT theses may be protected by copyright. Please reuse MIT thesis content according to the MIT Libraries Permissions Policy, which is available through the URL provided. [en_US]
dc.rights.uri: http://dspace.mit.edu/handle/1721.1/7582 [en_US]
dc.subject: Engineering Systems Division [en_US]
dc.subject: System Design and Management Program [en_US]
dc.title: Mathematical analysis of uncertainty in machine learning and deep learning [en_US]
dc.type: Thesis [en_US]
dc.description.degree: S.M. in Engineering and Management [en_US]
dc.contributor.department: Massachusetts Institute of Technology. Engineering Systems Division [en_US]
dc.contributor.department: System Design and Management Program [en_US]
dc.identifier.oclc: 1341996474 [en_US]
dc.description.collection: S.M. in Engineering and Management, Massachusetts Institute of Technology, Engineering Systems Division, System Design and Management Program [en_US]
dspace.imported: 2022-08-31T16:29:22Z [en_US]
mit.thesis.degree: Master [en_US]
mit.thesis.department: Sloan [en_US]