Scalable Gaussian process inference with finite-data mean and variance guarantees
Author(s): Huggins, Jonathan H.; Broderick, Tamara A.; Campbell, Trevor David
Abstract: Gaussian processes (GPs) offer a flexible class of priors for nonparametric Bayesian regression, but popular GP posterior inference methods are typically prohibitively slow or lack desirable finite-data guarantees on quality. We develop a scalable approach to approximate GP regression, with finite-data guarantees on the accuracy of our pointwise posterior mean and variance estimates. Our main contribution is a novel objective for approximate inference in the nonparametric setting: the preconditioned Fisher (pF) divergence. We show that unlike the Kullback-Leibler divergence (used in variational inference), the pF divergence bounds the 2-Wasserstein distance, which in turn provides tight bounds on the pointwise error of mean and variance estimates. We demonstrate that, for sparse GP likelihood approximations, we can minimize the pF divergence efficiently. Our experiments show that optimizing the pF divergence has the same computational requirements as variational sparse GPs while providing comparable empirical performance, in addition to our novel finite-data quality guarantees.
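The abstract's chain of guarantees can be sketched with standard definitions. The LaTeX display below gives the ordinary (unpreconditioned) Fisher divergence and the 2-Wasserstein distance, followed by the schematic implication the abstract describes; the paper's exact preconditioned Fisher divergence, its preconditioner, and the constants C_1, C_2 are placeholders here, not reproduced from the paper.

% Hedged sketch of the quantities the abstract names. The precise pF
% divergence and the constants C_1, C_2 below are placeholders; see the
% paper for exact definitions.
%
% Standard (unpreconditioned) Fisher divergence between densities p and q:
\[
  D_{\mathrm{F}}(p \,\|\, q)
    = \mathbb{E}_{\theta \sim p}\!\left[\, \big\| \nabla_\theta \log p(\theta)
      - \nabla_\theta \log q(\theta) \big\|_2^2 \,\right]
\]
% 2-Wasserstein distance, where \Gamma(p, q) is the set of couplings
% with marginals p and q:
\[
  W_2(p, q)
    = \Big( \inf_{\gamma \in \Gamma(p, q)} \int \| x - y \|_2^2
      \, \mathrm{d}\gamma(x, y) \Big)^{1/2}
\]
% Schematic chain of guarantees: a small pF divergence controls W_2,
% and W_2 controls the pointwise posterior mean and standard-deviation
% errors at any input x (constants C_1, C_2 are problem-dependent):
\[
  W_2(p, q) \le C_1 \, d_{\mathrm{pF}}(p, q), \qquad
  \big| \mu_p(x) - \mu_q(x) \big| \le C_2 \, W_2(p, q), \qquad
  \big| \sigma_p(x) - \sigma_q(x) \big| \le C_2 \, W_2(p, q)
\]

For one-dimensional Gaussian marginals, the last two bounds hold with C_2 = 1, since W_2 between two Gaussians decomposes into mean and standard-deviation differences; this is why a W_2 bound translates directly into pointwise mean and variance guarantees, whereas a KL bound does not.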
Department: Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science; Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory
Journal: Proceedings of the 22nd International Conference on Artificial Intelligence and Statistics (AISTATS)
Citation: Huggins, Jonathan H. et al. “Scalable Gaussian process inference with finite-data mean and variance guarantees.” Proceedings of the 22nd International Conference on Artificial Intelligence and Statistics (AISTATS), 89 (April 2019): 76-86. © 2019 The Author(s)
Version: Author's final manuscript