Scalable Representation Learning: On Data-scarcity, Uncertainty and Symmetry
Author(s)
Loh, Charlotte Chang Le
Advisor
Soljačić, Marin
Abstract
Deep learning has experienced remarkable success in recent years, leading to significant advancements in fields such as vision, natural language generation, and complex game play, as well as in solving difficult scientific problems such as predicting protein folding. Despite these successes, traditional deep learning faces fundamental challenges that limit its scalability and effectiveness. These challenges include the need for extensive labeled datasets, a lack of trustworthiness due to model overconfidence, and difficulty generalizing to new, unseen data. In this thesis, our primary goal is to tackle these issues by introducing novel tools and methods that augment traditional deep learning. We explore several strategies for addressing these bottlenecks, including incorporating known symmetries and inductive biases of the problem, utilizing Bayesian and ensemble methods, and leveraging the abundance of unlabeled data in a representation learning framework. We discuss and demonstrate practical applications of these tools in diverse domains including vision, photonics, materials science, and neuroscience.
Date issued
2024-05
Department
Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
Publisher
Massachusetts Institute of Technology