

Poster

Batch and match: black-box variational inference with a score-based divergence

Diana Cai · Chirag Modi · Loucas Pillaud-Vivien · Charles Margossian · Robert Gower · David Blei · Lawrence Saul


Abstract:

Black-box variational inference (BBVI) has enabled flexible and automated inference, and advances in automatic differentiation have led to wide adoption of these methods in software packages. However, classical approaches to BBVI, which optimize a stochastic evidence lower bound (ELBO), often converge slowly due to high-variance gradient estimates. In this work, we introduce a novel score-based divergence and propose a score-based variational inference objective. Within this framework, we show that the variational updates can be computed in closed form for Gaussian variational families. In addition, we prove that when the target distribution is Gaussian, the variational parameter updates converge exponentially to the target mean and covariance in the limit of an infinite batch size. Finally, we evaluate the performance of our approach on Gaussian and non-Gaussian target distributions, with several applications to posterior inference. We find that our method typically converges in fewer gradient evaluations than leading implementations of BBVI based on ELBO maximization.
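To make the general idea concrete, the sketch below is a minimal, hypothetical illustration of score-based variational inference with a full-covariance Gaussian family. It is not the paper's batch-and-match algorithm or its closed-form updates: it simply runs stochastic gradient descent on a Monte Carlo estimate of an unweighted Fisher-type score divergence, E_{z~q}[||∇log p(z) − ∇log q(z)||²], using black-box access to ∇log p via automatic differentiation. The target density, parameterization, batch size, and step size are all illustrative assumptions.

```python
# Hypothetical sketch of score-based VI for a Gaussian family (NOT the paper's
# closed-form "batch and match" updates): minimize a Monte Carlo estimate of
#   D(q; p) ≈ E_{z~q}[ || ∇log p(z) − ∇log q(z) ||^2 ]
# by plain stochastic gradient descent, with ∇log p obtained by autodiff.
import jax
import jax.numpy as jnp

def log_p(z):
    # Illustrative target: an unnormalized correlated 2-D Gaussian.
    cov = jnp.array([[2.0, 0.8], [0.8, 1.0]])
    return -0.5 * z @ jnp.linalg.solve(cov, z)

score_p = jax.grad(log_p)  # black-box score of the target

def log_q(z, params):
    # Gaussian q with mean mu and covariance Sigma = L L^T (L lower triangular);
    # the log-normalizer is omitted since it does not affect the score in z.
    mu, L_raw = params
    L = jnp.tril(L_raw)
    diff = z - mu
    return -0.5 * diff @ jnp.linalg.solve(L @ L.T, diff)

score_q = jax.grad(log_q, argnums=0)

def divergence_estimate(params, key, batch_size=64):
    # Reparameterized samples z = mu + L eps, eps ~ N(0, I).
    mu, L_raw = params
    L = jnp.tril(L_raw)
    eps = jax.random.normal(key, (batch_size, mu.shape[0]))
    z = mu + eps @ L.T
    diffs = jax.vmap(score_p)(z) - jax.vmap(lambda zi: score_q(zi, params))(z)
    return jnp.mean(jnp.sum(diffs ** 2, axis=-1))

params = (jnp.zeros(2), jnp.eye(2))
step_size = 1e-2
key = jax.random.PRNGKey(0)
grad_fn = jax.jit(jax.grad(divergence_estimate))
for t in range(500):
    key, subkey = jax.random.split(key)
    grads = grad_fn(params, subkey)
    params = jax.tree_util.tree_map(lambda p, g: p - step_size * g, params, grads)
```

The contrast with this generic gradient loop is the point of the abstract: for Gaussian variational families, the proposed framework yields closed-form updates rather than noisy gradient steps, which is what underlies the reported reduction in gradient evaluations relative to ELBO-based BBVI.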
