Open Access
Posterior concentration rates for mixtures of normals in random design regression
Zacharie Naulet, Judith Rousseau
Electron. J. Statist. 11(2): 4065-4102 (2017). DOI: 10.1214/17-EJS1344

Abstract

Previous work on location and location-scale mixtures of normals has established different upper bounds on posterior rates of contraction, both in density estimation and in nonlinear regression. In both settings, the observations were assumed to be not too spread out, either by requiring that the true density has light tails or that the regression function has compact support. It has been conjectured that, when the data are diffuse, location-scale mixtures may benefit from allowing a spatially varying order of approximation. Here we examine this conjecture in the mean regression model with normal errors and random design. Although we cannot invalidate the conjecture, for lack of a lower bound, we obtain slower upper bounds for location-scale mixtures, even under heavy-tail assumptions on the design distribution. However, the proofs suggest introducing hybrid location-scale mixtures, for which faster upper bounds are derived. Finally, we show that all tail assumptions on the design distribution can be dropped at the price of making the prior distribution covariate-dependent.
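For orientation, a minimal sketch of the setting described above; the exact parametrization and normalization of the mixture are assumptions of this sketch, not taken from the abstract:

    Y_i = f(X_i) + \varepsilon_i, \qquad \varepsilon_i \overset{\mathrm{iid}}{\sim} \mathcal{N}(0, \sigma_0^2), \qquad X_i \overset{\mathrm{iid}}{\sim} Q,

    f(x) = \int \phi_\sigma(x - \mu)\, M(\mathrm{d}\mu, \mathrm{d}\sigma),

where Q is the design distribution, \phi_\sigma denotes a Gaussian kernel with bandwidth \sigma, and M is a mixing measure endowed with a prior. A location mixture corresponds to a single common bandwidth, whereas a location-scale mixture lets the bandwidth vary with the location, which is what permits a spatially varying order of approximation.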

Citation

Zacharie Naulet, Judith Rousseau. "Posterior concentration rates for mixtures of normals in random design regression." Electron. J. Statist. 11(2): 4065-4102, 2017. https://doi.org/10.1214/17-EJS1344

Information

Received: 1 August 2016; Published: 2017
First available in Project Euclid: 24 October 2017

zbMATH: 1380.62210
MathSciNet: MR3715822
Digital Object Identifier: 10.1214/17-EJS1344

Subjects:
Primary: 62G20
Secondary: 62G08

Keywords: adaptive estimation, Bayesian nonparametric estimation, heavy tails, Hölder class, mixture prior, nonparametric regression, rate of contraction
