Show simple item record

dc.contributor.author: Becker, McCoy R.
dc.contributor.author: Lew, Alexander K.
dc.contributor.author: Wang, Xiaoyan
dc.contributor.author: Ghavami, Matin
dc.contributor.author: Huot, Mathieu
dc.contributor.author: Rinard, Martin C.
dc.contributor.author: Mansinghka, Vikash K.
dc.date.accessioned: 2024-07-09T15:51:27Z
dc.date.available: 2024-07-09T15:51:27Z
dc.date.issued: 2024-06-20
dc.identifier.issn: 2475-1421
dc.identifier.uri: https://hdl.handle.net/1721.1/155517
dc.description.abstract: Compared to the wide array of advanced Monte Carlo methods supported by modern probabilistic programming languages (PPLs), PPL support for variational inference (VI) is underdeveloped: users are typically limited to a small selection of predefined variational objectives and gradient estimators, which are implemented monolithically (and without explicit correctness arguments) in PPL backends. In this paper, we propose a modular approach to supporting VI in PPLs, based on compositional program transformation. First, we present a probabilistic programming language for defining models, variational families, and compositional strategies for propagating gradients. Second, we present a differentiable programming language for defining variational objectives. Models and variational families from the first language are automatically compiled into new differentiable functions that can be called from the second language, for estimating densities and expectations. Finally, we present an automatic differentiation algorithm that differentiates these variational objectives, yielding provably unbiased gradient estimators for use during optimization. We also extend our source language with features not previously supported for VI in PPLs, including approximate marginalization and normalization. This makes it possible to concisely express many models, variational families, objectives, and gradient estimators from the machine learning literature, including importance-weighted autoencoders (IWAE), hierarchical variational inference (HVI), and reweighted wake-sleep (RWS). We implement our approach in an extension to the Gen probabilistic programming system (genjax.vi, implemented in JAX), and evaluate our automation on several deep generative modeling tasks, showing minimal performance overhead vs. hand-coded implementations and performance competitive to well-established open-source PPLs. (en_US)
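The abstract describes compiling models and variational families into differentiable functions and deriving unbiased gradient estimators for a variational objective. A minimal, self-contained sketch of that core idea (plain Python with illustrative names, not the genjax.vi API): an unbiased reparameterization-gradient estimator for the ELBO of a toy Gaussian model, optimized by stochastic gradient ascent.

```python
import random

# Model: p(x) = Normal(3, 1). Variational family: q(x; mu) = Normal(mu, 1).
# ELBO(mu) = E_q[log p(x) - log q(x; mu)].
# Reparameterize x = mu + eps with eps ~ Normal(0, 1); then
# log q(x; mu) = -0.5 * eps**2 + const is independent of mu, so
# d/dmu ELBO = E[-(x - 3)], and -(mu + eps - 3) is an unbiased estimate.

def elbo_grad_estimate(mu, eps_batch):
    """Average the per-sample reparameterization gradient over a batch."""
    return sum(-(mu + eps - 3.0) for eps in eps_batch) / len(eps_batch)

random.seed(0)
mu, lr = 0.0, 0.1
for _ in range(500):
    eps_batch = [random.gauss(0.0, 1.0) for _ in range(100)]
    mu += lr * elbo_grad_estimate(mu, eps_batch)  # gradient ascent on the ELBO

# mu should now sit near the true posterior mean, 3.0
```

In the paper's system, the gradient above would instead be produced by automatic differentiation of a user-written objective; the hand-derived estimator here only illustrates the unbiasedness property the abstract claims.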
dc.publisher: Association for Computing Machinery (en_US)
dc.relation.isversionof: 10.1145/3656463 (en_US)
dc.rights: Creative Commons Attribution (en_US)
dc.rights.uri: https://creativecommons.org/licenses/by/4.0/ (en_US)
dc.source: Association for Computing Machinery (en_US)
dc.title: Probabilistic Programming with Programmable Variational Inference (en_US)
dc.type: Article (en_US)
dc.identifier.citation: Becker, McCoy R., Lew, Alexander K., Wang, Xiaoyan, Ghavami, Matin, Huot, Mathieu et al. 2024. "Probabilistic Programming with Programmable Variational Inference." Proceedings of the ACM on Programming Languages, 8 (PLDI).
dc.contributor.department: Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
dc.contributor.department: Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory
dc.relation.journal: Proceedings of the ACM on Programming Languages (en_US)
dc.identifier.mitlicense: PUBLISHER_CC
dc.eprint.version: Final published version (en_US)
dc.type.uri: http://purl.org/eprint/type/JournalArticle (en_US)
eprint.status: http://purl.org/eprint/status/PeerReviewed (en_US)
dc.date.updated: 2024-07-01T07:59:39Z
dc.language.rfc3066: en
dc.rights.holder: The author(s)
dspace.date.submission: 2024-07-01T07:59:40Z
mit.journal.volume: 8 (en_US)
mit.journal.issue: PLDI (en_US)
mit.license: PUBLISHER_CC
mit.metadata.status: Authority Work and Publication Information Needed (en_US)

