Show simple item record

dc.contributor.advisor: Rigollet, Philippe
dc.contributor.author: Chewi, Sinho
dc.date.accessioned: 2023-07-31T19:32:02Z
dc.date.available: 2023-07-31T19:32:02Z
dc.date.issued: 2023-06
dc.date.submitted: 2023-05-24T14:46:43.264Z
dc.identifier.uri: https://hdl.handle.net/1721.1/151333
dc.description.abstract: The primary contribution of this thesis is to advance the theory of complexity for sampling from a continuous probability density over R^d. Some highlights include: a new analysis of the proximal sampler, taking inspiration from the proximal point algorithm in optimization; an improved and sharp analysis of the Metropolis-adjusted Langevin algorithm, yielding new state-of-the-art guarantees for high-accuracy log-concave sampling; the first lower bounds for the complexity of log-concave sampling; an analysis of mirror Langevin Monte Carlo for constrained sampling; and the development of a theory of approximate first-order stationarity in non-log-concave sampling. We further illustrate the main tools in this work—diffusions and Wasserstein gradient flows—through applications to functional inequalities, the entropic barrier, Wasserstein barycenters, variational inference, and diffusion models.
dc.publisher: Massachusetts Institute of Technology
dc.rights: Attribution-ShareAlike 4.0 International (CC BY-SA 4.0)
dc.rights: Copyright retained by author(s)
dc.rights.uri: https://creativecommons.org/licenses/by-sa/4.0/
dc.title: An optimization perspective on log-concave sampling and beyond
dc.type: Thesis
dc.description.degree: Ph.D.
dc.contributor.department: Massachusetts Institute of Technology. Department of Mathematics
dc.identifier.orcid: 0000-0003-2701-0703
mit.thesis.degree: Doctoral
thesis.degree.name: Doctor of Philosophy
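
For readers unfamiliar with the algorithms named in the abstract, the sketch below illustrates the Metropolis-adjusted Langevin algorithm (MALA) for sampling from a density proportional to exp(-V) on R^d. This is a minimal illustrative implementation under standard assumptions (smooth potential with an available gradient); the function names (mala_sample, grad_V, V), the step size, and the Gaussian example are illustrative choices and are not taken from the thesis, whose contribution is a sharp complexity analysis of such samplers rather than an implementation.

import numpy as np

def mala_sample(grad_V, V, x0, step, n_iters, rng=None):
    """Metropolis-adjusted Langevin algorithm targeting a density proportional to exp(-V(x)).

    grad_V : gradient of the potential V
    V      : potential (negative log-density, up to an additive constant)
    x0     : initial point in R^d
    step   : step size of the Langevin proposal
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    samples = []

    def log_q(a, b):
        # Log-density (up to a constant) of the Gaussian proposal q(a | b),
        # i.e. a ~ N(b - step * grad_V(b), 2 * step * I).
        diff = a - (b - step * grad_V(b))
        return -np.dot(diff, diff) / (4.0 * step)

    for _ in range(n_iters):
        # Langevin proposal: one gradient step plus Gaussian noise.
        noise = rng.standard_normal(x.shape)
        y = x - step * grad_V(x) + np.sqrt(2.0 * step) * noise
        # Metropolis-Hastings accept/reject corrects the discretization bias.
        log_alpha = (V(x) - V(y)) + (log_q(x, y) - log_q(y, x))
        if np.log(rng.uniform()) < log_alpha:
            x = y
        samples.append(x.copy())
    return np.array(samples)

# Example (illustrative): sample from a standard Gaussian on R^2, where V(x) = ||x||^2 / 2.
V = lambda x: 0.5 * np.dot(x, x)
grad_V = lambda x: x
draws = mala_sample(grad_V, V, x0=np.zeros(2), step=0.1, n_iters=5000)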

