## Electronic Journal of Statistics

### Normalizing constants of log-concave densities

#### Abstract

We derive explicit bounds for the computation of normalizing constants $Z$ of log-concave densities $\pi = \mathrm{e}^{-U}/Z$ w.r.t. the Lebesgue measure on $\mathbb{R}^{d}$. Our approach relies on a Gaussian annealing scheme combined with recent and precise bounds on the Unadjusted Langevin Algorithm [15]. Polynomial bounds in the dimension $d$ are obtained, with an exponent that depends on the assumptions made on $U$. The algorithm also provides a theoretically grounded choice of the annealing sequence of variances. A numerical experiment supports our findings. We also establish results of independent interest on the mean squared error of empirical averages of locally Lipschitz functions.
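The annealing strategy summarized above can be sketched as follows; this is a minimal illustration only, not the authors' algorithm, and the schedule, step sizes, and chain counts below are ad hoc choices. Intermediate densities $\pi_i \propto \exp(-U(x) - s_i\|x\|^2/2)$ interpolate between a near-Gaussian start ($s_0$ large, normalizing constant known by a Laplace-type approximation, assuming $\arg\min U \approx 0$) and $\pi$ itself ($s = 0$); each ratio $Z_{i+1}/Z_i$ is estimated by an importance-weighted average over samples from $\pi_i$ produced by the Unadjusted Langevin Algorithm (ULA).

```python
import numpy as np

def log_Z_estimate(U, grad_U, L, d, n_chains=1024, n_levels=30,
                   inv0=100.0, steps_per_level=300, rng=None):
    """Annealed estimate of log Z for pi = exp(-U)/Z (illustrative sketch).

    Intermediate targets pi_i are proportional to exp(-U(x) - s_i*||x||^2/2),
    with s_i decreasing geometrically from inv0 down to 0; each level is
    sampled with ULA. Assumes U is L-smooth with its minimizer near 0.
    """
    rng = np.random.default_rng() if rng is None else rng
    # added Gaussian precision per level: geometric decay, then a jump to 0
    s = np.concatenate([np.geomspace(inv0, 1e-2, n_levels), [0.0]])
    # initialize chains from the near-Gaussian first target N(0, I/inv0)
    x = rng.standard_normal((n_chains, d)) / np.sqrt(inv0)
    # log Z_0 ~ -U(0) + (d/2) log(2*pi/inv0)  (Laplace-type approximation)
    log_Z = -U(np.zeros(d)) + 0.5 * d * np.log(2.0 * np.pi / inv0)
    for i in range(len(s) - 1):
        h = 0.01 / (L + s[i])            # conservative ULA step size
        for _ in range(steps_per_level):  # warm-started ULA chains at level i
            drift = grad_U(x) + s[i] * x
            x = x - h * drift + np.sqrt(2.0 * h) * rng.standard_normal(x.shape)
        # importance weight moving from pi_i to pi_{i+1}
        c = s[i] - s[i + 1]
        w = np.exp(0.5 * c * np.sum(x * x, axis=1))
        log_Z += np.log(w.mean())
    return log_Z
```

For $U(x) = \|x\|^2/2$ in $d = 2$ the true value is $\log Z = \log 2\pi$, which this sketch should recover approximately; in practice the number of levels, variances, and step sizes should be chosen according to the bounds derived in the paper rather than the fixed constants used here.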

#### Article information

**Source:** Electron. J. Statist., Volume 12, Number 1 (2018), 851–889.

**Dates:** First available in Project Euclid: 5 March 2018

https://projecteuclid.org/euclid.ejs/1520240451

**Digital Object Identifier:** doi:10.1214/18-EJS1411

**Mathematical Reviews number (MathSciNet):** MR3770890

**Zentralblatt MATH identifier:** 06864479

#### Citation

Brosse, Nicolas; Durmus, Alain; Moulines, Éric. Normalizing constants of log-concave densities. Electron. J. Statist. 12 (2018), no. 1, 851–889. doi:10.1214/18-EJS1411. https://projecteuclid.org/euclid.ejs/1520240451

#### References

• [1] C. Andrieu, J. Ridgway, and N. Whiteley. Sampling normalizing constants in high dimensions using inhomogeneous diffusions. ArXiv e-prints, Dec. 2016.
• [2] D. Ardia, N. Baştürk, L. Hoogerheide, and H. K. Van Dijk. A comparative study of Monte Carlo methods for efficient evaluation of marginal likelihood. Computational Statistics & Data Analysis, 56(11):3398–3414, 2012.
• [3] R. Balian. From microphysics to macrophysics: methods and applications of statistical physics, volume 1. Springer Science & Business Media, 2007.
• [4] G. Behrens, N. Friel, and M. Hurn. Tuning tempered transitions. Statistics and Computing, 22(1):65–78, 2012. URL http://dx.doi.org/10.1007/s11222-010-9206-z.
• [5] A. Beskos, D. O. Crisan, A. Jasra, and N. Whiteley. Error bounds and normalising constants for sequential Monte Carlo samplers in high dimensions. Adv. in Appl. Probab., 46(1):279–306, 2014. URL http://dx.doi.org/10.1239/aap/1396360114.
• [6] S. Boucheron, G. Lugosi, and P. Massart. Concentration inequalities: a nonasymptotic theory of independence. Oxford University Press, 2013.
• [7] S. Brazitikos, A. Giannopoulos, P. Valettas, and B.-H. Vritsiou. Geometry of isotropic convex bodies, volume 196. American Mathematical Society, Providence, 2014.
• [8] N. Brosse, A. Durmus, and E. Moulines. Supplement to “Normalizing constants of log-concave densities”. DOI: 10.1214/18-EJS1411SUPP, 2018.
• [9] M. Chen, Q. Shao, and J. Ibrahim. Monte Carlo methods in Bayesian computation. Springer, New York, 2000.
• [10] B. Cousins and S. Vempala. Bypassing KLS: Gaussian cooling and an $O^*(n^3)$ volume algorithm. In Proceedings of the Forty-Seventh Annual ACM Symposium on Theory of Computing, pages 539–548. ACM, 2015.
• [11] A. S. Dalalyan. Theoretical guarantees for approximate sampling from smooth and log-concave densities. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 2016.
• [12] P. Del Moral. Feynman-Kac formulae. Probability and its Applications (New York). Springer-Verlag, New York, 2004. ISBN 0-387-20268-4. URL https://doi.org/10.1007/978-1-4684-9393-1. Genealogical and interacting particle systems with applications.
• [13] P. Del Moral, A. Jasra, K. Law, and Y. Zhou. Multilevel sequential Monte Carlo samplers for normalizing constants. ArXiv e-prints, Mar. 2016.
• [14] A. Durmus and E. Moulines. Non-asymptotic convergence analysis for the unadjusted Langevin algorithm. ArXiv e-prints, July 2015.
• [15] A. Durmus and E. Moulines. High-dimensional Bayesian inference via the unadjusted Langevin algorithm. ArXiv e-prints, May 2016.
• [16] R. Dutta, J. K. Ghosh, et al. Bayes model selection with path sampling: factor models and other examples. Statistical Science, 28(1):95–115, 2013.
• [17] M. Dyer and A. Frieze. Computing the volume of convex bodies: a case where randomness provably helps. Probabilistic Combinatorics and Its Applications, 44:123–170, 1991.
• [18] D. L. Ermak. A computer simulation of charged particles in solution. I. Technique and equilibrium properties. The Journal of Chemical Physics, 62(10):4189–4196, 1975.
• [19] L. C. Evans and R. F. Gariepy. Measure theory and fine properties of functions. CRC Press, 2015.
• [20] N. Friel and J. Wyse. Estimating the evidence – a review. Statistica Neerlandica, 66(3):288–308, 2012.
• [21] N. Friel, M. Hurn, and J. Wyse. Improving power posterior estimation of statistical evidence. Statistics and Computing, 24(5):709–723, 2014. ISSN 1573-1375. URL http://dx.doi.org/10.1007/s11222-013-9397-1.
• [22] A. Gelman and X.-L. Meng. Simulating normalizing constants: from importance sampling to bridge sampling to path sampling. Statistical Science, pages 163–185, 1998.
• [23] M. Huber. Approximation algorithms for the normalizing constant of Gibbs distributions. Ann. Appl. Probab., 25(2):974–985, 2015. URL http://dx.doi.org/10.1214/14-AAP1015.
• [24] C. Jarzynski. Equilibrium free-energy differences from nonequilibrium measurements: a master-equation approach. Physical Review E, 56(5):5018, 1997.
• [25] A. Jasra, C. C. Holmes, and D. A. Stephens. Markov chain Monte Carlo methods and the label switching problem in Bayesian mixture modeling. Statist. Sci., 20(1):50–67, 2005. URL https://doi.org/10.1214/088342305000000016.
• [26] A. Jasra, K. Kamatani, P. P. Osei, and Y. Zhou. Multilevel particle filters: normalizing constant estimation. Statistics and Computing, pages 1–14, 2016. ISSN 1573-1375. URL http://dx.doi.org/10.1007/s11222-016-9715-5.
• [27] M. R. Jerrum, L. G. Valiant, and V. V. Vazirani. Random generation of combinatorial structures from a uniform distribution. Theoretical Computer Science, 43:169–188, 1986. ISSN 0304-3975. URL http://www.sciencedirect.com/science/article/pii/030439758690174X.
• [28] A. Joulin and Y. Ollivier. Curvature, concentration and error estimates for Markov chain Monte Carlo. Ann. Probab., 38(6):2418–2442, 2010. URL http://dx.doi.org/10.1214/10-AOP541.
• [29] K. H. Knuth, M. Habeck, N. K. Malakar, A. M. Mubeen, and B. Placek. Bayesian evidence and model selection. Digital Signal Processing, 47:50–67, 2015. ISSN 1051-2004. URL http://www.sciencedirect.com/science/article/pii/S1051200415001980. Special Issue in Honour of William J. (Bill) Fitzgerald.
• [30] T. Lelièvre, G. Stoltz, and M. Rousset. Free energy computations: a mathematical perspective. World Scientific, 2010.
• [31] J.-M. Marin and C. P. Robert. Importance sampling methods for Bayesian discrimination between embedded models. arXiv preprint arXiv:0910.2325, 2009.
• [32] S. P. Meyn and R. L. Tweedie. Stability of Markovian processes III: Foster-Lyapunov criteria for continuous-time processes. Advances in Applied Probability, pages 518–548, 1993.
• [33] P. Del Moral, A. Doucet, and A. Jasra. Sequential Monte Carlo samplers. Journal of the Royal Statistical Society. Series B (Statistical Methodology), 68(3):411–436, 2006. ISSN 13697412, 14679868. URL http://www.jstor.org/stable/3879283.
• [34] R. M. Neal. Annealed importance sampling. Statistics and Computing, 11(2):125–139, 2001. ISSN 1573-1375. URL http://dx.doi.org/10.1023/A:1008923215028.
• [35] Y. Nesterov. Introductory lectures on convex optimization: a basic course, volume 87. Springer Science & Business Media, 2013.
• [36] W. Niemiro and P. Pokarowski. Fixed precision MCMC estimation by median of products of averages. J. Appl. Probab., 46(2):309–329, 2009. URL http://dx.doi.org/10.1239/jap/1245676089.
• [37] C. J. Oates, T. Papamarkou, and M. Girolami. The controlled thermodynamic integral for Bayesian model evidence evaluation. Journal of the American Statistical Association, 111(514):634–645, 2016. URL http://dx.doi.org/10.1080/01621459.2015.1021006.
• [38] G. Parisi. Correlation functions and computer simulations. Nuclear Physics B, 180:378–384, 1981.
• [39] M. Pereyra. Maximum-a-posteriori estimation with Bayesian confidence regions. arXiv preprint arXiv:1602.08590, 2016.
• [40] F. Proschan and J. Sethuraman. Stochastic comparisons of order statistics from heterogeneous populations, with applications in reliability. Journal of Multivariate Analysis, 6(4):608–616, 1976. ISSN 0047-259X. URL http://www.sciencedirect.com/science/article/pii/0047259X76900087.
• [41] R Core Team. R: a language and environment for statistical computing. R Foundation for Statistical Computing, Vienna, Austria, 2018. URL https://www.R-project.org/.
• [42] S. Richardson and P. J. Green. On Bayesian analysis of mixtures with an unknown number of components (with discussion). Journal of the Royal Statistical Society: Series B (Statistical Methodology), 59(4):731–792, 1997. ISSN 1467-9868. URL http://dx.doi.org/10.1111/1467-9868.00095.
• [43] C. Robert. The Bayesian choice: from decision-theoretic foundations to computational implementation. Springer Science & Business Media, 2007.
• [44] G. O. Roberts and R. L. Tweedie. Exponential convergence of Langevin distributions and their discrete approximations. Bernoulli, 2(4):341–363, 1996. ISSN 1350-7265. URL http://dx.doi.org/10.2307/3318418.
• [45] J. P. Valleau and D. N. Card. Monte Carlo estimation of the free energy by multistage sampling. The Journal of Chemical Physics, 57(12):5457–5462, 1972. URL http://dx.doi.org/10.1063/1.1678245.
• [46] C. Villani. Optimal transport: old and new, volume 338. Springer Science & Business Media, 2008.
• [47] E. J. Williams and E. Williams. Regression analysis, volume 14. Wiley, New York, 1959.
• [48] J. Wyse. Estimating the statistical evidence – a review, 2011. URL https://sites.google.com/site/jsnwyse/code.
• [49] Y. Zhou, A. M. Johansen, and J. A. Aston. Towards automatic model comparison: an adaptive sequential Monte Carlo approach. Journal of Computational and Graphical Statistics, (just-accepted), 2015.