Statistical Learning with Discrete Structures: Statistical and Computational Perspectives

Author(s)
Behdin, Kayhan
Download
Thesis PDF (3.417 MB)
Advisor
Mazumder, Rahul
Terms of use
In Copyright - Educational Use Permitted Copyright retained by author(s) https://rightsstatements.org/page/InC-EDU/1.0/
Abstract
In various statistical tasks it is of interest to learn estimators with discrete structures (e.g., sparsity, low rank, shared model parameters); such estimators are appealing for their interpretability and compactness. However, learning with discrete structures can be computationally challenging. In this thesis, we explore statistical and computational aspects of statistical estimators (some classical and some new) that can be formulated as discrete optimization problems. In Chapters 2 and 3, we study two well-known problems in high-dimensional statistics: sparse Principal Component Analysis (PCA) and Gaussian Graphical Models. These are notoriously hard optimization problems; we explore computationally friendlier estimators based on Mixed Integer Programming (MIP) under suitable statistical assumptions and study their statistical and computational properties. In Chapter 4, we study the multi-task learning problem with sparse linear estimators. Motivated by applications in the biomedical sciences, we propose a new modeling framework that jointly learns sparse linear estimators for different tasks by sharing support information. Our theoretical results show that this joint estimation framework can lead to better statistical properties than fitting each task's model independently. We also develop scalable approximate solvers for our MIP-based formulation. In Chapter 5, we study sparse Nonnegative Matrix Factorization (NMF) using Cutler and Breiman's archetypal regularization. We explore the utility of our methods in applications from the biomedical sciences, computer vision, and computational finance.
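
To make the MIP viewpoint concrete, the following is a minimal sketch, not taken from the thesis, of a cardinality-constrained (best-subset) least-squares estimator written as a mixed-integer program with a big-M constraint. The use of cvxpy, the big-M value, the sparsity budget, and all variable names are illustrative assumptions; solving the problem requires a mixed-integer-capable solver (e.g., SCIP or Gurobi) to be installed.

    # Illustrative sketch: sparse least squares as a mixed-integer program (big-M formulation).
    # All names and constants here are assumptions for illustration, not the thesis's code.
    import cvxpy as cp
    import numpy as np

    rng = np.random.default_rng(0)
    n, p, k = 50, 10, 3                      # samples, features, sparsity budget (assumed)
    beta_true = np.zeros(p)
    beta_true[:k] = [2.0, -1.5, 1.0]
    X = rng.standard_normal((n, p))
    y = X @ beta_true + 0.1 * rng.standard_normal(n)

    M = 10.0                                  # big-M bound on |beta_j| (an assumption)
    beta = cp.Variable(p)
    z = cp.Variable(p, boolean=True)          # z_j = 1 if feature j is selected

    constraints = [cp.abs(beta) <= M * z,     # forces beta_j = 0 whenever z_j = 0
                   cp.sum(z) <= k]            # at most k nonzero coefficients
    problem = cp.Problem(cp.Minimize(cp.sum_squares(X @ beta - y)), constraints)
    problem.solve()                           # needs a mixed-integer QP solver installed

    print("selected support:", np.flatnonzero(z.value > 0.5))

The binary variables z make the support selection explicit, which is what turns the statistically natural sparse estimator into a combinatorial (MIP) problem; the thesis's estimators build on this kind of formulation under additional structural and statistical assumptions.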
Date issued
2024-05
URI
https://hdl.handle.net/1721.1/155499
Department
Massachusetts Institute of Technology. Operations Research Center; Sloan School of Management
Publisher
Massachusetts Institute of Technology

Collections
  • Doctoral Theses
