The Annals of Statistics

A Risk Bound in Sobolev Class Regression

Grigori K. Golubev and Michael Nussbaum


Abstract

For nonparametric regression estimation, when the unknown function belongs to a Sobolev smoothness class, sharp risk bounds for the integrated mean squared error have recently been found which improve on results establishing only optimal rates of convergence. The key to these bounds has been the fact that, under normality of the errors, the minimax linear estimator is asymptotically minimax within the class of all estimators. We extend this result to the nonnormal case, where the noise distribution is unknown. The corresponding asymptotic lower risk bound is established, based on an analogy with a location model in the independent identically distributed case. Attainment of the bound and its relation to adaptive optimal smoothing are discussed.
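
For orientation, a minimal sketch of the type of sharp bound in question, in the spirit of Pinsker's theorem for Gaussian noise (the model and normalization below are illustrative assumptions, not quoted from the paper): for observations $y_i = f(i/n) + \xi_i$, $i = 1, \dots, n$, with $\operatorname{Var}\xi_i = \sigma^2$ and $f$ in a periodic Sobolev class $W_2^m(L) = \{ f : \int_0^1 (f^{(m)})^2 \le L^2 \}$, the sharp asymptotic minimax bound takes the form

$$\inf_{\hat f_n} \sup_{f \in W_2^m(L)} E_f \int_0^1 \bigl(\hat f_n(t) - f(t)\bigr)^2 \, dt \;=\; \gamma(m)\, L^{2/(2m+1)} \Bigl(\tfrac{\sigma^2}{n}\Bigr)^{2m/(2m+1)} (1 + o(1)), \qquad n \to \infty,$$

where $\gamma(m)$ is the Pinsker constant, depending only on the smoothness $m$, and the bound is attained by a minimax linear estimator. The paper's contribution, as described in the abstract, is a lower bound showing that this sharp asymptotic form remains valid when the error distribution is unknown and possibly nonnormal, derived via an analogy with an i.i.d. location model.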

Article information

Source
Ann. Statist., Volume 18, Number 2 (1990), 758-778.

Dates
First available in Project Euclid: 12 April 2007

Permanent link to this document
https://projecteuclid.org/euclid.aos/1176347624

Digital Object Identifier
doi:10.1214/aos/1176347624

Mathematical Reviews number (MathSciNet)
MR1056335

Zentralblatt MATH identifier
0713.62047


Subjects
Primary: 62G20: Asymptotic properties
Secondary: 62G05: Estimation; 62C20: Minimax procedures

Keywords
Nonparametric regression; asymptotic minimax $L_2$ risk; smoothness ellipsoid; location model; shrinking Hellinger neighborhoods; adaptive bandwidth choice; experimental design; robust smoothing

Citation

Golubev, Grigori K.; Nussbaum, Michael. A Risk Bound in Sobolev Class Regression. Ann. Statist. 18 (1990), no. 2, 758--778. doi:10.1214/aos/1176347624. https://projecteuclid.org/euclid.aos/1176347624

