Bayesian coreset construction via greedy iterative geodesic ascent
Author(s)
Campbell, Trevor David; Broderick, Tamara A
Accepted version (1.555 MB)
Open Access Policy
Creative Commons Attribution-Noncommercial-Share Alike
Terms of use
Abstract
Coherent uncertainty quantification is a key strength of Bayesian methods. But modern algorithms for approximate Bayesian posterior inference often sacrifice accurate posterior uncertainty estimation in the pursuit of scalability. This work shows that previous Bayesian coreset construction algorithms, which build a small, weighted subset of the data that approximates the full dataset, are no exception. We demonstrate that these algorithms scale the coreset log-likelihood suboptimally, resulting in underestimated posterior uncertainty. To address this shortcoming, we develop greedy iterative geodesic ascent (GIGA), a novel algorithm for Bayesian coreset construction that scales the coreset log-likelihood optimally. GIGA provides geometric decay in posterior approximation error as a function of coreset size, and maintains the fast running time of its predecessors. The paper concludes with validation of GIGA on both synthetic and real datasets, demonstrating that it reduces posterior approximation error by orders of magnitude compared with previous coreset constructions.
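The abstract's two key ideas (greedy selection along geodesics on the unit sphere, followed by an optimal rescaling of the coreset log-likelihood) can be illustrated with a rough sketch. This is not the paper's exact algorithm: the function name, the finite-dimensional log-likelihood vectors, and the grid-based line search (the paper derives a closed-form step) are all simplifying assumptions made here for illustration.

```python
import numpy as np

def giga_sketch(vecs, K):
    """Illustrative sketch of greedy geodesic ascent for coresets.

    vecs: (N, D) array; each row stands in for one data point's
          (finite-dimensional projection of the) log-likelihood.
    K:    maximum coreset size.
    Returns nonnegative weights w so that sum_n w[n] * vecs[n]
    approximates L = vecs.sum(axis=0).
    """
    L = vecs.sum(axis=0)
    Lhat = L / np.linalg.norm(L)
    norms = np.linalg.norm(vecs, axis=1)
    V = vecs / norms[:, None]          # unit-sphere directions
    w = np.zeros(len(vecs))            # weights on the unit vectors
    y = np.zeros_like(Lhat)            # current approximation direction
    for _ in range(K):
        # Direction of steepest ascent toward Lhat, orthogonal to y.
        d = Lhat - (Lhat @ y) * y
        dn = np.linalg.norm(d)
        if dn < 1e-12:
            break
        d /= dn
        # Greedy choice: the point best aligned with that direction.
        n = int(np.argmax(V @ d))
        # Crude grid line search along the geodesic from y toward V[n]
        # (the paper uses a closed-form step instead).
        best_g, best_val = 0.0, Lhat @ y
        for g in np.linspace(0.0, 1.0, 101)[1:]:
            z = (1 - g) * y + g * V[n]
            z /= np.linalg.norm(z)
            val = Lhat @ z
            if val > best_val:
                best_g, best_val = g, val
        w *= (1 - best_g)
        w[n] += best_g
        y = (1 - best_g) * y + best_g * V[n]
        y /= np.linalg.norm(y)
    # Optimal rescaling of the coreset log-likelihood -- the step the
    # abstract identifies as missing from earlier constructions.
    u = (w[:, None] * V).sum(axis=0)
    alpha = (L @ u) / (u @ u)
    return alpha * w / norms           # weights on the original vecs
```

Because the final scale `alpha` minimizes the residual norm in closed form, even a one-point coreset of identical data recovers the full-data log-likelihood exactly; the greedy loop only has to get the direction right.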
Date issued
2018-07
Department
Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory; Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
Journal
Proceedings of the 35th International Conference on Machine Learning, PMLR
Publisher
MIT Press
Citation
Campbell, Trevor and Tamara Broderick. “Bayesian coreset construction via greedy iterative geodesic ascent.” Proceedings of the 35th International Conference on Machine Learning, PMLR, 80 (July 2018): 698-706 © 2018 The Author(s)
Version: Author's final manuscript
ISSN
1533-7928
1532-4435