Electronic Journal of Statistics
Electron. J. Statist., Volume 9, Number 1 (2015), 1-79.
Brittleness of Bayesian inference under finite information in a continuous world
We derive, in the classical framework of Bayesian sensitivity analysis, optimal lower and upper bounds on posterior values obtained from Bayesian models that exactly capture an arbitrarily large number of finite-dimensional marginals of the data-generating distribution and/or that are as close as desired to the data-generating distribution in the Prokhorov or total variation metrics; these bounds show that such models may still make the largest possible prediction error after conditioning on an arbitrarily large number of sample data measured at finite precision. These results are obtained through the development of a reduction calculus for optimization problems over measures on spaces of measures. We use this calculus to investigate the mechanisms that generate brittleness/robustness and, in particular, we observe that learning and robustness are antagonistic properties. It is now well understood that the numerical resolution of PDEs requires the satisfaction of specific stability conditions. Is there a missing stability condition for using Bayesian inference in a continuous world under finite information?
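The brittleness phenomenon described above can be glimpsed in a deliberately simple toy computation (this is an illustration in the spirit of the abstract, not the paper's actual construction): two priors that are within total-variation distance eps of each other can, after conditioning on data that are nearly impossible under the bulk of the prior, produce posterior expectations that differ by almost the full range of the quantity of interest. All numbers and names below are illustrative choices.

```python
# Toy sketch of posterior brittleness (illustrative only, not the
# paper's construction): two priors at TV distance eps yield posterior
# means differing by nearly 1 on the parameter space {0, 1}.
eps = 1e-3      # total-variation distance between the two priors
delta = 1e-6    # likelihood of the observed data under theta = 0

# Quantity of interest: theta itself, taking values in {0, 1}.
prior_a = {0: 1.0, 1: 0.0}          # all mass on theta = 0
prior_b = {0: 1.0 - eps, 1: eps}    # eps of mass moved to theta = 1

likelihood = {0: delta, 1: 1.0}     # data nearly impossible when theta = 0

def posterior_mean(prior):
    """Posterior expectation of theta under Bayes' rule."""
    weights = {t: prior[t] * likelihood[t] for t in prior}
    z = sum(weights.values())       # normalizing constant
    return sum(t * w for t, w in weights.items()) / z

print(posterior_mean(prior_a))  # 0.0
print(posterior_mean(prior_b))  # close to 1: conditioning amplifies eps
```

Conditioning acts as a near-singular operation here: the eps-sized prior perturbation, invisible at the level of the priors, dominates the posterior once the likelihood concentrates away from the bulk of the prior mass.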
First available in Project Euclid: 2 February 2015
Primary: 62F15 (Bayesian inference), 62G35 (Robustness)
Secondary: 62A01 (Foundations and philosophical topics), 62E20 (Asymptotic distribution theory), 62F12 (Asymptotic properties of estimators), 62G20 (Asymptotic properties)
Owhadi, Houman; Scovel, Clint; Sullivan, Tim. Brittleness of Bayesian inference under finite information in a continuous world. Electron. J. Statist. 9 (2015), no. 1, 1--79. doi:10.1214/15-EJS989. https://projecteuclid.org/euclid.ejs/1422885673