Notice

This is not the latest version of this item. The latest version can be found at: https://dspace.mit.edu/handle/1721.1/129343.2

Show simple item record

dc.contributor.author: Hoang, Trong Nghia
dc.contributor.author: Jaillet, Patrick
dc.date.accessioned: 2021-01-08T15:11:59Z
dc.date.available: 2021-01-08T15:11:59Z
dc.date.submitted: 2019-03
dc.identifier.isbn: 9781728119861
dc.identifier.uri: https://hdl.handle.net/1721.1/129343
dc.description.abstract: This paper presents a novel variational inference framework for deriving a family of Bayesian sparse Gaussian process regression (SGPR) models whose approximations are variationally optimal with respect to the full-rank GPR model enriched with various corresponding correlation structures of the observation noises. Our variational Bayesian SGPR (VBSGPR) models jointly treat the distributions of both the inducing variables and the hyperparameters as variational parameters, which makes the variational lower bound decomposable and thereby amenable to stochastic optimization. This stochastic optimization iteratively follows the stochastic gradient of the variational lower bound to improve the estimates of the optimal variational distributions of the inducing variables and hyperparameters (and hence the predictive distribution) of our VBSGPR models, and it is guaranteed to converge to them asymptotically. We show that the stochastic gradient is an unbiased estimator of the exact gradient and can be computed in constant time per iteration, hence achieving scalability to big data. We empirically evaluate the performance of our proposed framework on two real-world, massive datasets.
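The abstract's central computational claim is that a decomposable lower bound admits an unbiased minibatch gradient whose cost per iteration is independent of the dataset size. A minimal toy sketch of that idea (a hypothetical stand-in objective, not the paper's actual VBSGPR bound or code):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in objective that, like the bound described in the abstract,
# decomposes as a sum over data points: L(theta) = sum_i l_i(theta),
# with l_i(theta) = -(y_i - theta * x_i)^2 / 2.  (Illustrative only.)
N, B = 1000, 32                                 # dataset size, minibatch size
x = rng.normal(size=N)
y = 2.0 * x + rng.normal(scale=0.1, size=N)     # true slope is 2.0

def grad_terms(theta, idx):
    """Per-point gradients d l_i / d theta for the indices in idx."""
    return (y[idx] - theta * x[idx]) * x[idx]

def stoch_grad(theta):
    """Unbiased minibatch estimate of the exact gradient.

    Sampling B points uniformly and rescaling by N/B makes the
    expectation equal the full-data gradient, so each iteration
    costs O(B) rather than O(N).
    """
    batch = rng.choice(N, size=B, replace=False)
    return (N / B) * grad_terms(theta, batch).sum()

# Robbins-Monro stochastic gradient ascent: step sizes decaying like 1/t
# give asymptotic convergence despite the minibatch gradient noise.
theta = 0.0
for t in range(1, 2001):
    theta += (0.5 / (N * t)) * stoch_grad(theta)

print(theta)  # drifts toward the true slope 2.0
```

The same two ingredients, an unbiased per-iteration gradient and a decaying step-size schedule, underlie the convergence guarantee claimed for the VBSGPR models, where theta would instead collect the variational parameters of the inducing-variable and hyperparameter distributions.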
dc.language.iso: en
dc.publisher: Institute of Electrical and Electronics Engineers (IEEE)
dc.relation.isversionof: 10.1109/IJCNN.2019.8852481
dc.rights: Creative Commons Attribution-Noncommercial-Share Alike
dc.rights.uri: http://creativecommons.org/licenses/by-nc-sa/4.0/
dc.source: arXiv
dc.title: Stochastic Variational Inference for Bayesian Sparse Gaussian Process Regression
dc.type: Article
dc.identifier.citation: Yu, Haibin, Trong Nghia Hoang, Kian Hsiang Low, and Patrick Jaillet. "Stochastic Variational Inference for Bayesian Sparse Gaussian Process Regression." Proceedings of the International Joint Conference on Neural Networks (IJCNN'19), 1 (July 2019). © 2019 The Author(s)
dc.contributor.department: Massachusetts Institute of Technology. Laboratory for Information and Decision Systems
dc.contributor.department: Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
dc.relation.journal: Proceedings of the International Joint Conference on Neural Networks (IJCNN'19)
dc.eprint.version: Original manuscript
dc.type.uri: http://purl.org/eprint/type/ConferencePaper
eprint.status: http://purl.org/eprint/status/NonPeerReviewed
dc.date.updated: 2020-12-21T18:16:00Z
dspace.orderedauthors: Yu, H; Hoang, TN; Low, BKH; Jaillet, P
dspace.date.submission: 2020-12-21T18:16:03Z
mit.journal.volume: 2019-July
mit.license: OPEN_ACCESS_POLICY

