Open Access
$\alpha $-variational inference with statistical guarantees
Yun Yang, Debdeep Pati, Anirban Bhattacharya
Ann. Statist. 48(2): 886-905 (April 2020). DOI: 10.1214/19-AOS1827


We provide statistical guarantees for a family of variational approximations to Bayesian posterior distributions, called $\alpha $-VB, which has close connections with variational approximations of tempered posteriors in the literature. The standard variational approximation is a special case of $\alpha $-VB with $\alpha =1$. For $\alpha \in (0,1]$, we develop a novel class of variational inequalities linking the Bayes risk under the variational approximation to the objective function of the variational optimization problem, implying that maximizing the evidence lower bound in variational inference has the effect of minimizing the Bayes risk within the variational density family. Operating in a frequentist setup, the variational inequalities imply that point estimates constructed from the $\alpha $-VB procedure converge at an optimal rate to the true parameter in a wide range of problems. We illustrate the general theory with a number of examples, including the mean-field variational approximation to low- and high-dimensional Bayesian linear regression with spike and slab priors, Gaussian mixture models and latent Dirichlet allocation.
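As an illustrative sketch (consistent with the abstract's description of tempering, not a verbatim statement from the paper), the $\alpha $-VB procedure can be written as minimizing, over a variational family $\Gamma $, the Kullback-Leibler divergence to the $\alpha $-tempered posterior $\pi _{n,\alpha }(\theta )\propto e^{\alpha \ell _n(\theta )}\pi (\theta )$, where $\ell _n$ denotes the log-likelihood and $\pi $ the prior:

\[
\widehat{q}_{\alpha} \;=\; \operatorname*{argmin}_{q \in \Gamma}
\Big\{ -\alpha\, \mathbb{E}_{q}\big[\ell_n(\theta)\big] \;+\; D_{\mathrm{KL}}\big(q \,\big\|\, \pi\big) \Big\},
\]

which equals $D_{\mathrm{KL}}(q\,\|\,\pi _{n,\alpha })$ up to an additive constant not depending on $q$. Setting $\alpha =1$ recovers the standard variational objective, i.e., the negative of the usual evidence lower bound.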



Received: 1 January 2018; Revised: 1 January 2019; Published: April 2020
First available in Project Euclid: 26 May 2020

zbMATH: 07241573
MathSciNet: MR4102680
Digital Object Identifier: 10.1214/19-AOS1827

Primary: 62G07, 62G20
Secondary: 60K35

Keywords: Bayes risk, evidence lower bound, latent variable models, Rényi divergence, variational inference

Rights: Copyright © 2020 Institute of Mathematical Statistics
