Open Access
Optimal Global Rates of Convergence for Nonparametric Regression
Charles J. Stone
Ann. Statist. 10(4): 1040-1053 (December, 1982). DOI: 10.1214/aos/1176345969

Abstract

Consider a $p$-times differentiable unknown regression function $\theta$ of a $d$-dimensional measurement variable. Let $T(\theta)$ denote a derivative of $\theta$ of order $m$ and set $r = (p - m)/(2p + d)$. Let $\hat{T}_n$ denote an estimator of $T(\theta)$ based on a training sample of size $n$, and let $\| \hat{T}_n - T(\theta)\|_q$ be the usual $L^q$ norm of the restriction of $\hat{T}_n - T(\theta)$ to a fixed compact set. Under appropriate regularity conditions, it is shown that the optimal rate of convergence for $\| \hat{T}_n - T(\theta)\|_q$ is $n^{-r}$ if $0 < q < \infty$; while $(n^{-1} \log n)^r$ is the optimal rate if $q = \infty$.
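The rate exponent $r = (p - m)/(2p + d)$ in the abstract can be evaluated directly for concrete cases. A minimal sketch (the helper name is ours, not from the paper):

```python
def rate_exponent(p, m, d):
    """Stone's optimal-rate exponent r = (p - m) / (2p + d) for estimating
    an order-m derivative of a p-times differentiable regression function
    of a d-dimensional measurement variable. (Illustrative helper only.)"""
    assert p > m >= 0 and d >= 1
    return (p - m) / (2 * p + d)

# Classic case: estimating the function itself (m = 0) when p = 2, d = 1.
print(rate_exponent(p=2, m=0, d=1))  # 0.4, i.e. the familiar n^(-2/5) rate

# The exponent shrinks as the dimension d grows (the curse of dimensionality):
print(rate_exponent(p=2, m=0, d=10))  # 2/14 ≈ 0.143
```

For instance, estimating the first derivative ($m = 1$) of a twice-differentiable function ($p = 2$) in one dimension gives $r = 1/5$, a slower rate than for the function itself.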

Citation


Charles J. Stone. "Optimal Global Rates of Convergence for Nonparametric Regression." Ann. Statist. 10 (4) 1040 - 1053, December, 1982. https://doi.org/10.1214/aos/1176345969

Information

Published: December, 1982
First available in Project Euclid: 12 April 2007

zbMATH: 0511.62048
MathSciNet: MR673642
Digital Object Identifier: 10.1214/aos/1176345969

Subjects:
Primary: 62G20
Secondary: 62G05

Keywords: Nonparametric regression, Optimal rate of convergence

Rights: Copyright © 1982 Institute of Mathematical Statistics
