Essays on Algorithmic Learning and Uncertainty Quantification
Author(s)
Vijaykumar, Suhas
Advisor
Chernozhukov, Victor
Mikusheva, Anna
Abstract
The thesis consists of three essays. The first, titled “Localization, Convexity, and Star Aggregation,” develops new analytical tools based upon the offset Rademacher complexity for studying stochastic optimization in non-convex domains, including statistical prediction and model aggregation problems. Using these tools, I show that a simple procedure called the star algorithm can recover near-optimal convergence rates for non-parametric logistic regression in non-convex models.
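To make the aggregation step concrete, below is a minimal sketch of a two-step star algorithm for a finite dictionary of predictors under squared loss. This is an illustrative simplification, not the thesis's procedure: the essay treats non-parametric logistic regression, and the dictionary, loss, and mixing grid here are assumptions for the example. Step one picks the empirical risk minimizer f_hat; step two minimizes empirical risk again over the "star hull" of segments joining f_hat to every dictionary element.

```python
import numpy as np

def erm(preds, y):
    """Index of the empirical risk minimizer (squared loss) over a finite dictionary."""
    risks = np.mean((preds - y) ** 2, axis=1)
    return int(np.argmin(risks))

def star_aggregate(preds, y, grid=np.linspace(0.0, 1.0, 101)):
    """Two-step star aggregation sketch over a finite dictionary.

    Step 1: select the ERM f_hat over the dictionary.
    Step 2: run ERM again over the star hull of f_hat, i.e. all mixtures
    a * f + (1 - a) * f_hat joining f_hat to each dictionary element f.
    """
    f_hat = preds[erm(preds, y)]
    best_risk, best_pred = np.inf, f_hat
    for f in preds:
        for a in grid:
            g = a * f + (1.0 - a) * f_hat
            r = np.mean((g - y) ** 2)
            if r < best_risk:
                best_risk, best_pred = r, g
    return best_pred

# Hypothetical data: the target lies strictly between two dictionary elements,
# so no single predictor is optimal but a star-hull mixture is.
rng = np.random.default_rng(0)
n, m = 200, 5
preds = rng.normal(size=(m, n))
y = 0.5 * preds[0] + 0.5 * preds[1] + 0.1 * rng.normal(size=n)

agg = star_aggregate(preds, y)
erm_risk = np.mean((preds[erm(preds, y)] - y) ** 2)
agg_risk = np.mean((agg - y) ** 2)
# Since a = 0 recovers f_hat itself, the aggregate's empirical risk never exceeds ERM's.
```

The point of the star hull is that it restores enough convexity around f_hat for localization arguments (such as offset Rademacher bounds) to apply, even when the original dictionary is non-convex.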
The second essay, titled “Kernel Ridge Regression Inference,” introduces a new technique for deriving sharp, non-asymptotic, uniform Gaussian approximations for partial sums in a reproducing kernel Hilbert space, which is then applied to construct uniform confidence bands for the widely used kernel ridge regression algorithm.
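For readers unfamiliar with the estimator under study, here is a minimal kernel ridge regression sketch: given kernel matrix K, the fitted coefficients solve (K + λnI)α = y, and predictions at new points are kernel evaluations against the training sample weighted by α. The RBF kernel, bandwidth, and regularization level below are illustrative assumptions; the essay's contribution is the inference theory on top of this estimator, not the fitting step itself.

```python
import numpy as np

def rbf_kernel(X, Z, gamma=10.0):
    """Gaussian (RBF) kernel matrix between rows of X and rows of Z."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def krr_fit(X, y, lam=1e-2, gamma=10.0):
    """Solve the kernel ridge system (K + lam * n * I) alpha = y."""
    n = len(y)
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + lam * n * np.eye(n), y)

def krr_predict(X_train, alpha, X_new, gamma=10.0):
    """Predict f_hat(x) = sum_i alpha_i k(x, x_i) at the new points."""
    return rbf_kernel(X_new, X_train, gamma) @ alpha

# Hypothetical one-dimensional regression problem.
rng = np.random.default_rng(1)
n = 100
X = rng.uniform(size=(n, 1))
y = np.sin(2 * np.pi * X[:, 0]) + 0.1 * rng.normal(size=n)

alpha = krr_fit(X, y)
grid = np.linspace(0.0, 1.0, 50)[:, None]
f_hat = krr_predict(X, alpha, grid)
```

A uniform confidence band would then quantify the maximal deviation of f_hat from the true regression function over the whole grid at once, which is where the Gaussian approximation for RKHS partial sums enters.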
The third and final essay, titled “Frank-Wolfe Meets Metric Entropy,” uses ideas from asymptotic geometry to derive new dimension-dependent and domain-specific lower bounds for conditional gradient algorithms, a class of optimization procedures including the popular Frank-Wolfe algorithm and many of its variants. Such algorithms have found extensive use in machine learning and high-dimensional statistics, motivating a more thorough analysis of their limitations in high-dimensional problems.
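To fix ideas, the conditional gradient (Frank-Wolfe) template is sketched below for minimization over the probability simplex, where the linear subproblem min over s of ⟨∇f(x), s⟩ is solved by a single vertex. The quadratic objective and the classic 2/(t+2) step size are standard illustrative choices, not taken from the thesis, whose contribution concerns lower bounds for this class of methods.

```python
import numpy as np

def frank_wolfe_simplex(grad, x0, n_iter=200):
    """Conditional gradient method over the probability simplex.

    Each iteration solves the linear minimization oracle exactly: the
    minimizer of a linear function over the simplex is the vertex e_i
    with i = argmin of the gradient. The iterate then moves toward that
    vertex with the classic step size 2 / (t + 2), staying feasible
    without any projection step.
    """
    x = x0.copy()
    for t in range(n_iter):
        g = grad(x)
        s = np.zeros_like(x)
        s[int(np.argmin(g))] = 1.0
        step = 2.0 / (t + 2.0)
        x = (1.0 - step) * x + step * s
    return x

# Hypothetical instance: least squares over the simplex, with the
# optimum placed at the uniform point so convergence is easy to check.
rng = np.random.default_rng(2)
A = rng.normal(size=(20, 10))
b = A @ (np.ones(10) / 10)
obj = lambda z: np.sum((A @ z - b) ** 2)
grad = lambda z: 2.0 * A.T @ (A @ z - b)

x = frank_wolfe_simplex(grad, np.eye(10)[0])
```

Because each step only calls a linear oracle, the method is projection-free, which explains its popularity on structured domains; the essay's lower bounds characterize how slow this template must be on certain high-dimensional domains.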
Date issued
2023-06
Department
Massachusetts Institute of Technology. Department of Economics
Publisher
Massachusetts Institute of Technology