DSpace@MIT
  • DSpace@MIT Home
  • MIT Libraries
  • MIT Theses
  • Doctoral Theses
  • View Item

Smoothed Online Learning: Theory and Applications

Author(s)
Block, Adam B.
Download: Thesis PDF (2.271 MB)
Advisor
Rakhlin, Alexander
Terms of use
Attribution-NonCommercial-NoDerivatives 4.0 International (CC BY-NC-ND 4.0) Copyright retained by author(s) https://creativecommons.org/licenses/by-nc-nd/4.0/
Abstract
Many of the algorithms and theoretical results surrounding modern machine learning are predicated on the assumption that data are independent and identically distributed. Motivated by the numerous applications that do not satisfy this assumption, many researchers have been interested in relaxations of this condition, with online learning imposing the weakest such assumptions. In this setting, the learner observes data points one at a time and makes a prediction on each, before incorporating the point into a training set, with the goal of predicting new data points as well as possible. Due to the lack of assumptions on the data, this setting is both computationally and statistically challenging. In this thesis, we investigate the statistical rates and efficient algorithms achievable when the data are constrained in a natural way motivated by the smoothed analysis of algorithms. The first part covers the statistical rates achievable by an arbitrary algorithm without regard to efficiency, covering both the fully adversarial setting and the constrained setting in which improved rates are possible. The second part of the thesis focuses on efficient algorithms for this constrained setting, as well as special cases where bounds can be improved under additional structure. Finally, in the third part we investigate applications of these techniques to sequential decision making, robotics, and differential privacy. We introduce a number of novel techniques, including a Gaussian anti-concentration inequality and a new norm comparison for dependent data.
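The online learning protocol summarized above (observe a point, predict, suffer a loss, then fold the point into the training history) can be sketched as follows. This is a hypothetical illustration using online gradient descent on the squared loss over linear predictors; the function name, step size, and toy data stream are assumptions for the example and are not drawn from the thesis itself.

```python
import numpy as np

def online_gradient_descent(stream, dim, lr=0.1):
    """Sketch of the online learning protocol: at each round the learner
    predicts on the incoming point, observes the loss, then updates."""
    w = np.zeros(dim)                # current linear predictor
    total_loss = 0.0
    for x, y in stream:              # data arrive one at a time
        y_hat = w @ x                # predict before the label is revealed
        loss = (y_hat - y) ** 2      # squared loss on this round
        total_loss += loss
        grad = 2.0 * (y_hat - y) * x # gradient of the squared loss in w
        w -= lr * grad               # incorporate the new point
    return w, total_loss

# Usage: a toy noiseless stream generated from a fixed linear target.
rng = np.random.default_rng(0)
true_w = np.array([1.0, -2.0])
stream = [(x, float(true_w @ x)) for x in rng.normal(size=(200, 2))]
w, cum_loss = online_gradient_descent(stream, dim=2)
```

In the fully adversarial setting the stream need not come from any fixed distribution; the smoothed setting studied in the thesis instead constrains each point's conditional distribution, which is what makes improved rates and efficient algorithms possible.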
Date issued
2024-05
URI
https://hdl.handle.net/1721.1/155382
Department
Massachusetts Institute of Technology. Department of Mathematics
Publisher
Massachusetts Institute of Technology

Collections
  • Doctoral Theses
