The Annals of Statistics

A Lower Bound on the Error in Nonparametric Regression Type Problems

Yannis G. Yatracos


Abstract

Let $(X_1, Y_1), \cdots, (X_n, Y_n)$ be a sample, denote the conditional density of $Y_i \mid X_i = x_i$ by $f(y \mid x_i, \theta(x_i))$, and let $\theta$ be an element of a metric space $(\Theta, d)$. A lower bound is provided for the $d$-error in estimating $\theta$. The order of the bound depends on the local behavior of the Kullback information of the conditional density. As an application, we consider the case where $\Theta$ is the space of $q$-smooth functions on $\lbrack 0, 1 \rbrack^d$ metrized with the $L_r$ distance, $1 \leq r < \infty$.
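The abstract's key quantity is the local behavior of the Kullback information of the conditional density. As a concrete illustration (not taken from the paper, but a standard special case), consider a Gaussian regression model $Y \mid X = x \sim N(\theta(x), \sigma^2)$: the Kullback–Leibler divergence between two such conditional densities is quadratic in the difference of the regression functions, which is the kind of local behavior that drives bounds of this type. A minimal sketch:

```python
import math

def kl_gaussian(theta1, theta2, sigma=1.0):
    """KL divergence KL(N(theta1, sigma^2) || N(theta2, sigma^2)).

    For equal variances this reduces to (theta1 - theta2)^2 / (2 sigma^2),
    i.e. it behaves quadratically in the perturbation theta1 - theta2.
    """
    return (theta1 - theta2) ** 2 / (2.0 * sigma ** 2)

# Quadratic local behavior: KL(h) / h^2 is constant (= 1/(2 sigma^2)) as h -> 0.
for h in (1.0, 0.1, 0.01):
    ratio = kl_gaussian(0.0, h) / h ** 2
    print(f"h = {h:5.2f}  KL/h^2 = {ratio:.6f}")
```

Here `theta1`, `theta2`, and `sigma` are illustrative parameters of the hypothetical Gaussian model, not notation from the paper; in this setting the exponent 2 in the local expansion is what determines the rate in the resulting lower bound.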

Article information

Source
Ann. Statist., Volume 16, Number 3 (1988), 1180-1187.

Dates
First available in Project Euclid: 12 April 2007

Permanent link to this document
https://projecteuclid.org/euclid.aos/1176350954

Digital Object Identifier
doi:10.1214/aos/1176350954

Mathematical Reviews number (MathSciNet)
MR959195

Zentralblatt MATH identifier
0651.62028


Subjects
Primary: 62G20: Asymptotic properties

Keywords
Nonparametric regression; lower bound on minimax risk; lower bound of loss in probability; optimal rates of convergence; Kullback information

Citation

Yatracos, Yannis G. A Lower Bound on the Error in Nonparametric Regression Type Problems. Ann. Statist. 16 (1988), no. 3, 1180--1187. doi:10.1214/aos/1176350954. https://projecteuclid.org/euclid.aos/1176350954
