
dc.contributor.advisor      Barzilay, Regina
dc.contributor.advisor      Jaakkola, Tommi
dc.contributor.author       Fisch, Adam
dc.date.accessioned         2023-11-02T20:16:18Z
dc.date.available           2023-11-02T20:16:18Z
dc.date.issued              2023-09
dc.date.submitted           2023-09-21T14:26:21.483Z
dc.identifier.uri           https://hdl.handle.net/1721.1/152788
dc.description.abstract     Deep learning has seen exciting progress over the last decade. As large foundation models continue to evolve and be deployed into real-life applications, an important question to ask is how we can make these expensive, inscrutable models more efficient and reliable. In this thesis, we present a number of fundamental techniques for building and deploying effective deep learning systems that are broadly based on conformal prediction, a model-agnostic and distribution-free uncertainty estimation framework. We develop both theory and practice for leveraging uncertainty estimation to build adaptive models that are cheaper to run, have desirable performance guarantees, and are general enough to work well in many real-world scenarios. Empirically, we primarily focus on natural language processing (NLP) applications, together with substantial extensions to tasks in computer vision, drug discovery, and medicine.
dc.publisher                Massachusetts Institute of Technology
dc.rights                   In Copyright - Educational Use Permitted
dc.rights                   Copyright retained by author(s)
dc.rights.uri               https://rightsstatements.org/page/InC-EDU/1.0/
dc.title                    Conformal Methods for Efficient and Reliable Deep Learning
dc.type                     Thesis
dc.description.degree       Ph.D.
dc.contributor.department   Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
mit.thesis.degree           Doctoral
thesis.degree.name          Doctor of Philosophy
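
For readers unfamiliar with the framework named in the abstract, the following is a minimal sketch of standard split conformal prediction for classification, not the specific methods developed in the thesis. The function name, score choice, and parameters are illustrative assumptions; only the general technique (calibrating a nonconformity-score threshold on held-out data to form prediction sets with 1 - alpha coverage) is taken as given.

```python
import numpy as np

def split_conformal_classifier(cal_scores, cal_labels, test_scores, alpha=0.1):
    """
    Minimal split conformal prediction for classification (illustrative sketch).

    cal_scores:  (n_cal, n_classes) softmax scores on a held-out calibration set
    cal_labels:  (n_cal,) integer labels for the calibration set
    test_scores: (n_test, n_classes) softmax scores for new inputs
    alpha:       target miscoverage rate (e.g. 0.1 for 90% coverage)

    Returns a boolean (n_test, n_classes) matrix; row i is the prediction set
    for test point i, which contains the true label with probability at least
    1 - alpha under exchangeability of calibration and test data.
    """
    n = len(cal_labels)
    # Nonconformity score: 1 minus the model's score for the true class.
    nonconformity = 1.0 - cal_scores[np.arange(n), cal_labels]
    # Finite-sample-corrected quantile of the calibration scores.
    q_level = np.ceil((n + 1) * (1 - alpha)) / n
    q_hat = np.quantile(nonconformity, min(q_level, 1.0), method="higher")
    # Include every class whose nonconformity falls below the threshold.
    return (1.0 - test_scores) <= q_hat
```

The coverage guarantee here is model-agnostic and distribution-free in the sense described in the abstract: it holds for any underlying scoring model, requiring only exchangeability rather than distributional assumptions.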

