Bayesian coreset construction via greedy iterative geodesic ascent
Author(s): Campbell, Trevor David; Broderick, Tamara A
Coherent uncertainty quantification is a key strength of Bayesian methods. But modern algorithms for approximate Bayesian posterior inference often sacrifice accurate posterior uncertainty estimation in the pursuit of scalability. This work shows that previous Bayesian coreset construction algorithms, which build a small, weighted subset of the data that approximates the full dataset, are no exception. We demonstrate that these algorithms scale the coreset log-likelihood suboptimally, resulting in underestimated posterior uncertainty. To address this shortcoming, we develop greedy iterative geodesic ascent (GIGA), a novel algorithm for Bayesian coreset construction that scales the coreset log-likelihood optimally. GIGA provides geometric decay in posterior approximation error as a function of coreset size, and maintains the fast running time of its predecessors. The paper concludes with validation of GIGA on both synthetic and real datasets, demonstrating that it reduces posterior approximation error by orders of magnitude compared with previous coreset constructions.
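The abstract's two ingredients — a greedy step along geodesics toward the best-aligned data point, and an optimal rescaling of the coreset log-likelihood — can be sketched as follows. This is a simplified, finite-dimensional illustration only, not the authors' reference implementation: per-datapoint log-likelihoods are assumed to be given as finite vectors (e.g., evaluations at a handful of parameter samples), a grid line search stands in for the paper's closed-form step size, and `giga_sketch` with its parameters is a hypothetical name introduced here.

```python
import numpy as np

def giga_sketch(L, K, grid=200):
    """Simplified greedy-geodesic coreset sketch (assumption-laden illustration).

    L : (N, D) array; row n is a finite-dimensional stand-in for datapoint n's
        log-likelihood function (an assumption of this sketch).
    K : number of greedy iterations (bound on coreset size).
    Returns nonnegative weights w with at most K nonzeros.
    """
    norms = np.linalg.norm(L, axis=1)
    Lhat = L / norms[:, None]                  # unit-norm per-point vectors
    ell = L.sum(axis=0)
    ellhat = ell / np.linalg.norm(ell)         # unit-norm full log-likelihood

    N, D = L.shape
    v = np.zeros(N)                            # weights over the unit vectors Lhat
    y = np.zeros(D)                            # current iterate; invariant y = v @ Lhat
    gammas = np.linspace(0.0, 1.0, grid)
    for _ in range(K):
        ynorm = np.linalg.norm(y)
        if ynorm < 1e-12:
            # first step: jump straight to the best-aligned point
            n, g, yn, vn = int(np.argmax(Lhat @ ellhat)), 1.0, y, v
        else:
            yn, vn = y / ynorm, v / ynorm
            # greedy choice: geodesic direction most aligned with the residual
            resid = ellhat - (ellhat @ yn) * yn
            dirs = Lhat - (Lhat @ yn)[:, None] * yn
            dn = np.linalg.norm(dirs, axis=1)
            scores = np.where(dn > 1e-12,
                              (dirs @ resid) / np.maximum(dn, 1e-12), -np.inf)
            n = int(np.argmax(scores))
            # grid line search for the step size (the paper derives this in
            # closed form; a grid is used here for simplicity)
            cand = (1 - gammas)[:, None] * yn + gammas[:, None] * Lhat[n]
            cn = np.linalg.norm(cand, axis=1)
            align = np.where(cn > 1e-12,
                             (cand @ ellhat) / np.maximum(cn, 1e-12), -np.inf)
            g = gammas[int(np.argmax(align))]
        y = (1 - g) * yn + g * Lhat[n]
        v = (1 - g) * vn
        v[n] += g
    # optimal scaling of the coreset log-likelihood -- the correction the
    # abstract identifies as missing from earlier constructions
    w = v / norms
    Lw = w @ L
    denom = Lw @ Lw
    alpha = (Lw @ ell) / denom if denom > 0 else 0.0
    return alpha * w
```

Because the direction `y` is selected on the unit sphere and only rescaled at the end, the final `alpha` is the least-squares-optimal scale of the coreset log-likelihood against the full one; earlier constructions, per the abstract, fix this scale suboptimally and hence underestimate posterior uncertainty.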
Department: Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory; Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
Proceedings of the 35th International Conference on Machine Learning, PMLR
Campbell, Trevor and Tamara Broderick. “Bayesian coreset construction via greedy iterative geodesic ascent.” Proceedings of the 35th International Conference on Machine Learning, PMLR, 80 (July 2018): 698-706 © 2018 The Author(s)
Author's final manuscript