The Annals of Statistics

Approximation of Least Squares Regression on Nested Subspaces

Dennis D. Cox

Abstract

For a regression model $y_i = \theta(x_i) + \varepsilon_i$, the unknown function $\theta$ is estimated by least squares on a subspace $\Lambda_m = \operatorname{span}\{\psi_1, \psi_2, \cdots, \psi_m\}$, where the basis functions $\psi_i$ are predetermined and $m$ is varied. Assuming that the design is suitably approximated by an asymptotic design measure, a general method is presented for approximating the bias and variance in a scale of Hilbertian norms natural to the problem. The general theory is illustrated with two examples: truncated Fourier series regression and polynomial regression. For these examples, we give rates of convergence of derivative estimates in (weighted) $L_2$ norms and establish consistency in supremum norm.
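To make the setup concrete, here is a minimal sketch in Python/NumPy of the estimator the abstract describes: least squares of $y$ on the nested subspaces $\Lambda_m$ as $m$ varies. The specifics are assumptions for illustration only, not taken from the paper: the cosine basis on $[0,1]$, the equispaced design, the target function theta, the noise level, and the values of m.

```python
import numpy as np

def fourier_basis(x, m):
    """Design matrix whose columns span Lambda_m = span{psi_1, ..., psi_m}:
    the constant function plus cosines on [0, 1] (an assumed basis choice)."""
    cols = [np.ones_like(x)]
    for j in range(1, m):
        cols.append(np.sqrt(2.0) * np.cos(np.pi * j * x))
    return np.column_stack(cols)

def theta(t):
    # Illustrative "unknown" regression function (hypothetical choice).
    return np.sin(2.0 * np.pi * t) + t ** 2

rng = np.random.default_rng(0)
n = 200
x = (np.arange(n) + 0.5) / n                 # equispaced design, approximating a uniform design measure
y = theta(x) + 0.3 * rng.standard_normal(n)  # y_i = theta(x_i) + eps_i

for m in (2, 5, 10, 20):                     # vary the subspace dimension m
    Psi = fourier_basis(x, m)
    coef, *_ = np.linalg.lstsq(Psi, y, rcond=None)          # least squares on Lambda_m
    fit = Psi @ coef
    proj, *_ = np.linalg.lstsq(Psi, theta(x), rcond=None)   # projection of theta onto Lambda_m
    bias2 = np.mean((Psi @ proj - theta(x)) ** 2)           # squared bias at the design points
    mse = np.mean((fit - theta(x)) ** 2)
    print(f"m={m:2d}  bias^2={bias2:.4f}  MSE={mse:.4f}")
```

Running this shows the tradeoff the paper analyzes: as $m$ grows the squared bias shrinks while the variance contribution to the fit grows; the paper's contribution is to approximate both terms in a scale of Hilbertian norms under an asymptotic design measure.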

Article information

Source
Ann. Statist., Volume 16, Number 2 (1988), 713-732.

Dates
First available in Project Euclid: 12 April 2007

Permanent link to this document
https://projecteuclid.org/euclid.aos/1176350830

Digital Object Identifier
doi:10.1214/aos/1176350830

Mathematical Reviews number (MathSciNet)
MR947572

Zentralblatt MATH identifier
0669.62047

Subjects
Primary: 62J05: Linear regression
Secondary: 62F12: Asymptotic properties of estimators; 41A10: Approximation by polynomials {For approximation by trigonometric polynomials, see 42A10}

Keywords
Regression; nonparametric regression; bias approximation; polynomial regression; model selection; rates of convergence; orthogonal polynomials

Citation

Cox, Dennis D. Approximation of Least Squares Regression on Nested Subspaces. Ann. Statist. 16 (1988), no. 2, 713--732. doi:10.1214/aos/1176350830. https://projecteuclid.org/euclid.aos/1176350830

