Streaming, distributed variational inference for Bayesian nonparametrics
Author(s): Campbell, Trevor David; Straub, Julian; Fisher, John W.; How, Jonathan P.
This paper presents a methodology for creating streaming, distributed inference algorithms for Bayesian nonparametric (BNP) models. In the proposed framework, processing nodes receive a sequence of data minibatches, compute a variational posterior for each, and make asynchronous streaming updates to a central model. In contrast to previous algorithms, the proposed framework is truly streaming, distributed, asynchronous, learning-rate-free, and truncation-free. The key challenge in developing the framework, arising from the fact that BNP models do not impose an inherent ordering on their components, is finding the correspondence between minibatch and central BNP posterior components before performing each update. To address this, the paper develops a combinatorial optimization problem over component correspondences and provides an efficient solution technique. The paper concludes with an application of the methodology to the Dirichlet process (DP) mixture model, with experimental results demonstrating its practical scalability and performance.
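To illustrate the component-correspondence step described above, here is a minimal sketch of matching minibatch posterior components to central-model components. The function name, the use of component means, and the squared-distance cost are all illustrative assumptions for a toy Gaussian-mean setting; the paper's actual objective is a variational one, and its solution technique is more efficient than the brute-force search shown here.

```python
# Hypothetical sketch: match minibatch components to central components
# by minimizing the total squared distance between component means.
# The cost function and names are illustrative, not the paper's method.
from itertools import permutations

def match_components(central_means, minibatch_means):
    """Return a tuple `perm` where minibatch component j is matched to
    central component perm[j], minimizing total squared distance.
    Brute force over permutations; feasible only for a small number
    of components."""
    idx = range(len(central_means))
    def cost(perm):
        return sum((central_means[perm[j]] - m) ** 2
                   for j, m in enumerate(minibatch_means))
    return min(permutations(idx), key=cost)

# Toy example: three 1-D component means in each model.
central = [0.0, 5.0, 10.0]
minibatch = [10.2, -0.1, 4.9]
perm = match_components(central, minibatch)
print(perm)  # -> (2, 0, 1): minibatch component 0 matches central component 2
```

Because BNP components have no inherent order, an update applied without this matching step could merge unrelated components; the paper casts the matching as a combinatorial optimization and solves it efficiently rather than by enumeration as above.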
Department: Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory; Massachusetts Institute of Technology. Department of Aeronautics and Astronautics; Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
Advances in Neural Information Processing Systems (NIPS 2015)
Neural Information Processing Systems Foundation
Campbell, Trevor et al. "Streaming, Distributed Variational Inference for Bayesian Nonparametrics." Advances in Neural Information Processing Systems (NIPS 2015).
Final published version