dc.contributor.advisor | Jaakkola, Tommi S. | |
dc.contributor.author | Schechter, Amit | |
dc.date.accessioned | 2025-03-12T16:55:24Z | |
dc.date.available | 2025-03-12T16:55:24Z | |
dc.date.issued | 2024-09 | |
dc.date.submitted | 2025-03-04T18:46:01.561Z | |
dc.identifier.uri | https://hdl.handle.net/1721.1/158491 | |
dc.description.abstract | We propose two methods for improving subgroup robustness and out-of-distribution generalization of machine learning models. First, we introduce a formulation of Group DRO (distributionally robust optimization) with soft group assignment. This formulation can be applied to data with noisy or uncertain group labels, or when only a small subset of the training data has group labels. We propose a modified loss function, explain how to apply it to data with noisy group labels as well as data with missing or few group labels, and present experiments demonstrating its effectiveness. In the second part, we propose an invariant decision tree objective that aims to improve the robustness of tree-based models and address a common failure mode of existing methods for out-of-domain generalization. We demonstrate the benefits of this method both theoretically and empirically. Both approaches are designed to enhance machine learning models' performance under distribution shift. | |
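The abstract's first contribution, a Group DRO loss with soft group assignment, can be illustrated with a minimal sketch. This is not the thesis implementation; it only assumes that each training example carries a probability vector over groups (rather than a hard label), forms probability-weighted group losses, and reweights groups toward the worst-performing one via an exponentiated-gradient step. The function name, arguments, and step size eta are illustrative assumptions.

    import torch

    def soft_group_dro_loss(per_sample_loss, group_probs, q, eta=0.01):
        """per_sample_loss: (N,) per-example losses.
        group_probs: (N, G) soft group membership probabilities per example.
        q: (G,) adversarial group weights, updated in place.
        Returns a scalar loss emphasizing the (softly defined) worst group."""
        # Soft group losses: probability-weighted average loss for each group.
        col_mass = group_probs.sum(dim=0, keepdim=True).clamp_min(1e-12)
        weights = group_probs / col_mass                      # normalize per group
        group_losses = (weights * per_sample_loss.unsqueeze(1)).sum(dim=0)  # (G,)
        # Exponentiated-gradient update: upweight groups with higher loss.
        with torch.no_grad():
            q *= torch.exp(eta * group_losses)
            q /= q.sum()
        return (q * group_losses).sum()

In a training loop, q would be initialized uniformly (torch.ones(G) / G) and carried across minibatches; with one-hot group_probs this sketch reduces to a standard Group DRO objective.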
dc.publisher | Massachusetts Institute of Technology | |
dc.rights | In Copyright - Educational Use Permitted | |
dc.rights | Copyright retained by author(s) | |
dc.rights.uri | https://rightsstatements.org/page/InC-EDU/1.0/ | |
dc.title | Methods for Enhancing Robustness and Generalization in Machine Learning | |
dc.type | Thesis | |
dc.description.degree | S.M. | |
dc.contributor.department | Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science | |
mit.thesis.degree | Master | |
thesis.degree.name | Master of Science in Electrical Engineering and Computer Science | |