Scalable Gaussian process inference with finite-data mean and variance guarantees
Author(s)
Huggins, Jonathan H.; Broderick, Tamara A.; Campbell, Trevor David
Terms of use
Open Access Policy; Creative Commons Attribution-Noncommercial-Share Alike
Abstract
Gaussian processes (GPs) offer a flexible class of priors for nonparametric Bayesian regression, but popular GP posterior inference methods are typically prohibitively slow or lack desirable finite-data guarantees on quality. We develop a scalable approach to approximate GP regression, with finite-data guarantees on the accuracy of our pointwise posterior mean and variance estimates. Our main contribution is a novel objective for approximate inference in the nonparametric setting: the preconditioned Fisher (pF) divergence. We show that unlike the Kullback-Leibler divergence (used in variational inference), the pF divergence bounds the 2-Wasserstein distance, which in turn provides tight bounds on the pointwise error of mean and variance estimates. We demonstrate that, for sparse GP likelihood approximations, we can minimize the pF divergence efficiently. Our experiments show that optimizing the pF divergence has the same computational requirements as variational sparse GPs while providing comparable empirical performance, in addition to our novel finite-data quality guarantees.
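For context on the pointwise quantities the paper's guarantees concern, the sketch below computes the exact GP posterior mean and variance for regression with a squared-exponential kernel. This is standard GP regression background, not the paper's pF-divergence method; the cubic-cost Cholesky factorization it contains is exactly the bottleneck that sparse approximations (including those analyzed in the paper) are designed to avoid, and the returned mean and variance are the quantities whose approximation error the paper bounds. All function names and parameter values here are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel: variance * exp(-||a - b||^2 / (2 * lengthscale^2))."""
    sq_dists = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
    return variance * np.exp(-0.5 * sq_dists / lengthscale**2)

def gp_posterior(X, y, X_star, noise_var=0.1):
    """Exact GP posterior mean and pointwise variance at test inputs X_star.

    mean = K_*x (K_xx + noise I)^{-1} y
    var  = diag(K_**) - diag(K_*x (K_xx + noise I)^{-1} K_x*)
    """
    K = rbf_kernel(X, X) + noise_var * np.eye(len(X))
    K_star = rbf_kernel(X_star, X)
    L = np.linalg.cholesky(K)  # O(n^3) cost: the scalability bottleneck sparse GPs address
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mean = K_star @ alpha
    v = np.linalg.solve(L, K_star.T)
    var = rbf_kernel(X_star, X_star).diagonal() - np.sum(v**2, axis=0)
    return mean, var

# Toy usage on synthetic 1-D data
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(50, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(50)
X_star = np.linspace(-3, 3, 100)[:, None]
mu, var = gp_posterior(X, y, X_star)
```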
Date issued
2019-04
Department
Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science; Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory
Journal
Proceedings of the 22nd International Conference on Artificial Intelligence and Statistics (AISTATS)
Publisher
PMLR
Citation
Huggins, Jonathan H. et al. “Scalable Gaussian process inference with finite-data mean and variance guarantees.” Proceedings of the 22nd International Conference on Artificial Intelligence and Statistics (AISTATS), 89 (April 2019): 76-86 © 2019 The Author(s)
Version: Author's final manuscript
ISSN
1938-7228