## The Annals of Statistics

### Approximation of Least Squares Regression on Nested Subspaces

Dennis D. Cox

#### Abstract

For a regression model $y_i = \theta(x_i) + \varepsilon_i$, the unknown function $\theta$ is estimated by least squares on a subspace $\Lambda_m = \operatorname{span}\{\psi_1, \psi_2, \cdots, \psi_m\}$, where the basis functions $\psi_i$ are predetermined and $m$ is varied. Assuming that the design is suitably approximated by an asymptotic design measure, a general method is presented for approximating the bias and variance in a scale of Hilbertian norms natural to the problem. The general theory is illustrated with two examples: truncated Fourier series regression and polynomial regression. For these examples, we give rates of convergence of derivative estimates in (weighted) $L_2$ norms and establish consistency in supremum norm.
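The estimator in the abstract can be sketched numerically: fit $\theta$ by least squares on the span of the first $m$ trigonometric basis functions and vary $m$. The design, true function, noise level, and helper names below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Equispaced design on [0, 1) and a smooth periodic "true" theta (illustrative choice).
n = 200
x = np.arange(n) / n
theta = lambda t: np.exp(np.sin(2 * np.pi * t))
y = theta(x) + 0.1 * rng.standard_normal(n)

def fourier_basis(t, m):
    """First m functions of the trigonometric basis: 1, cos, sin, cos, sin, ..."""
    cols = [np.ones_like(t)]
    k = 1
    while len(cols) < m:
        cols.append(np.cos(2 * np.pi * k * t))
        if len(cols) < m:
            cols.append(np.sin(2 * np.pi * k * t))
        k += 1
    return np.column_stack(cols)

def fit(m):
    """Least squares estimate of theta on Lambda_m = span{psi_1, ..., psi_m}."""
    Psi = fourier_basis(x, m)
    coef, *_ = np.linalg.lstsq(Psi, y, rcond=None)
    return Psi @ coef

# Varying m traces the bias-variance tradeoff the paper analyzes:
# bias shrinks as m grows, while the variance of the estimate grows with m.
for m in (3, 9, 21):
    mse = np.mean((fit(m) - theta(x)) ** 2)
    print(f"m = {m:2d}, empirical MSE = {mse:.4f}")
```

For this smooth periodic target, enlarging the subspace from $m=3$ to $m=9$ sharply reduces the squared bias; beyond that, additional basis functions mainly add variance.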

#### Article information

Source
Ann. Statist., Volume 16, Number 2 (1988), 713-732.

Dates
First available in Project Euclid: 12 April 2007

https://projecteuclid.org/euclid.aos/1176350830

Digital Object Identifier
doi:10.1214/aos/1176350830

Mathematical Reviews number (MathSciNet)
MR947572

Zentralblatt MATH identifier
0669.62047
