Show simple item record

dc.contributor.advisor: Wornell, Gregory W.
dc.contributor.author: Lee, Joshua Ka-Wing
dc.date.accessioned: 2022-02-07T15:20:12Z
dc.date.available: 2022-02-07T15:20:12Z
dc.date.issued: 2021-09
dc.date.submitted: 2021-09-21T19:30:53.208Z
dc.identifier.uri: https://hdl.handle.net/1721.1/140035
dc.description.abstract: In standard supervised learning, we assume that we are trying to learn some target variable 𝑌 from some data 𝑋. However, many learning problems can be framed as supervised learning with an auxiliary objective, often associated with an auxiliary variable 𝐷 which defines this objective. Applying the principles of Hirschfeld-Gebelein-Rényi (HGR) maximal correlation analysis reveals new insights as to how to formulate these learning problems with auxiliary objectives. We examine the use of the HGR maximal correlation in feature selection for multi-source transfer learning in the few-shot setting. We then apply it to the problem of feature suppression by enforcing marginal and conditional independence criteria with respect to a sensitive attribute, and demonstrate the effectiveness of our methods on problems of fairness, privacy, and transfer learning. Finally, we explore the use of HGR maximal correlation in extracting features for outlier detection.
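For context on the abstract's central quantity: for jointly discrete variables, the HGR maximal correlation has a well-known closed form as the second-largest singular value of the normalized joint-distribution matrix 𝐵 with entries P(x,y)/√(P(x)P(y)) (the top singular value is always 1, attained by constant functions). A minimal illustrative sketch, not code from the thesis:

```python
import numpy as np

def hgr_maximal_correlation(pxy):
    """Compute the HGR maximal correlation of two discrete variables
    from their joint distribution matrix pxy (pxy[i, j] = P(X=i, Y=j)).

    It equals the second-largest singular value of
    B[i, j] = P(x, y) / sqrt(P(x) * P(y)); the largest singular value
    is 1, contributed by the constant functions of X and Y.
    """
    px = pxy.sum(axis=1)          # marginal of X
    py = pxy.sum(axis=0)          # marginal of Y
    B = pxy / np.sqrt(np.outer(px, py))
    s = np.linalg.svd(B, compute_uv=False)  # descending order
    return s[1]

# Perfectly dependent variables: maximal correlation is 1.
pxy_dep = np.array([[0.5, 0.0],
                    [0.0, 0.5]])
# Independent variables: maximal correlation is 0.
pxy_ind = np.array([[0.25, 0.25],
                    [0.25, 0.25]])
```

This SVD view underlies the feature-selection and feature-suppression uses the abstract describes: the leading singular functions are the maximally correlated features, and driving the correlation to zero with respect to a sensitive attribute enforces the independence criteria mentioned above.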
dc.publisher: Massachusetts Institute of Technology
dc.rights: In Copyright - Educational Use Permitted
dc.rights: Copyright MIT
dc.rights.uri: http://rightsstatements.org/page/InC-EDU/1.0/
dc.title: Maximal Correlation Feature Selection and Suppression With Applications
dc.type: Thesis
dc.description.degree: Sc.D.
dc.contributor.department: Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
mit.thesis.degree: Doctoral
thesis.degree.name: Doctor of Science

