A Stein variational Newton method
Author(s)
Marzouk, Youssef M; Spantini, Alessio
Abstract
Stein variational gradient descent (SVGD) was recently proposed as a general-purpose nonparametric variational inference algorithm [Liu & Wang, NIPS 2016]: it minimizes the Kullback-Leibler divergence between the target distribution and its approximation by implementing a form of functional gradient descent on a reproducing kernel Hilbert space. In this paper, we accelerate and generalize the SVGD algorithm by including second-order information, thereby approximating a Newton-like iteration in function space. We also show how second-order information can lead to more effective choices of kernel. We observe significant computational gains over the original SVGD algorithm in multiple test cases.
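To make the baseline concrete, here is a minimal sketch of one SVGD update (the first-order method of Liu & Wang that this paper accelerates), assuming an RBF kernel and a 1-D standard-normal target; the kernel bandwidth, step size, and particle count are illustrative choices, not values from the paper, and the Newton-like second-order correction introduced here is not shown.

```python
import numpy as np

def rbf_kernel(x, h=1.0):
    """RBF kernel matrix K[j, i] = k(x_j, x_i) and its gradient w.r.t. x_j."""
    diff = x[:, None] - x[None, :]           # diff[j, i] = x_j - x_i
    K = np.exp(-diff**2 / (2 * h**2))
    grad_K = -diff / h**2 * K                # d k(x_j, x_i) / d x_j
    return K, grad_K

def svgd_step(x, grad_log_p, step=0.1, h=1.0):
    """One SVGD iteration: x_i <- x_i + step * phi(x_i), where
    phi(x_i) = (1/n) sum_j [k(x_j, x_i) grad log p(x_j) + d/dx_j k(x_j, x_i)].
    The first term drives particles toward high density; the second repels
    them from each other, preventing collapse to the mode."""
    n = x.shape[0]
    K, grad_K = rbf_kernel(x, h)
    phi = (K @ grad_log_p(x) + grad_K.sum(axis=0)) / n
    return x + step * phi

# Target: standard normal, so grad log p(x) = -x.
rng = np.random.default_rng(0)
x = rng.normal(loc=5.0, scale=1.0, size=50)  # particles start far from target
for _ in range(500):
    x = svgd_step(x, lambda z: -z)
print(float(x.mean()))                        # should drift close to 0
```

The Newton variant replaces this fixed-step functional gradient with a step preconditioned by (approximate) second-order information, which is what yields the reported speedups.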
Date issued
2018-12
Department
Massachusetts Institute of Technology. Department of Aeronautics and Astronautics
Journal
32nd Conference on Neural Information Processing Systems (NeurIPS 2018)
Publisher
Curran Associates, Inc.
Citation
Detommaso, Gianluca et al. “A Stein variational Newton method.” Paper presented at the 32nd Conference on Neural Information Processing Systems (NeurIPS 2018), Montreal, Canada, 3-8 December 2018, Curran Associates, Inc. © 2018 The Author(s)
Version: Final published version