Bernoulli

Optimal design for curve estimation by local linear smoothing

Ming-Yen Cheng, Peter Hall, and D. Michael Titterington

Abstract

The integral of the mean squared error of an estimator of a regression function is used as a criterion for defining an optimal design measure in the context of local linear regression, when the bandwidth is chosen in a locally optimal manner. An algorithm is proposed that constructs a sequence of piecewise uniform designs with the help of current estimates of the integral of the mean squared error. These estimates do not require direct estimation of the second derivative of the regression function. Asymptotic properties of the algorithm are established and numerical results illustrate the gains that can be made, relative to a uniform design, by using the optimal design or sub-optimal, piecewise uniform designs. The behaviour of the algorithm in practice is also illustrated.
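To make the setting concrete, the following minimal Python sketch shows a local linear smoother and one naive redesign step for a piecewise uniform design that allocates more observations to bins where a crude residual-based proxy for the local mean squared error is large. This is an illustrative assumption-laden sketch, not the authors' algorithm: the paper's IMSE estimates avoid second-derivative estimation, whereas the error proxy, bandwidth choice, bin layout and function names below are all invented for illustration.

    # Minimal sketch (assumptions, not the paper's algorithm): a local linear
    # smoother with a Gaussian kernel, and a naive piecewise uniform redesign
    # driven by a residual-based error proxy.
    import numpy as np

    rng = np.random.default_rng(0)

    def local_linear(x0, x, y, h):
        """Local linear fit at x0: weighted least squares with kernel weights."""
        w = np.exp(-0.5 * ((x - x0) / h) ** 2)
        sw = np.sqrt(w)
        X = np.column_stack([np.ones_like(x), x - x0])
        beta, *_ = np.linalg.lstsq(sw[:, None] * X, sw * y, rcond=None)
        return beta[0]                      # intercept = fitted curve value at x0

    def sample_piecewise_uniform(n, edges, weights):
        """Draw n design points from a piecewise uniform density on the bins."""
        bins = rng.choice(len(weights), size=n, p=weights)
        return rng.uniform(edges[bins], edges[bins + 1])

    # Illustrative true curve, noise level and bin layout.
    m = lambda x: np.sin(4 * np.pi * x)
    sigma = 0.3
    edges = np.linspace(0, 1, 6)             # five equal-width bins on [0, 1]
    weights = np.full(5, 0.2)                # start from the uniform design

    for stage in range(3):
        x = sample_piecewise_uniform(400, edges, weights)
        y = m(x) + sigma * rng.standard_normal(x.size)
        h = 0.05                             # fixed bandwidth, for simplicity only
        fitted = np.array([local_linear(xi, x, y, h) for xi in x])
        resid2 = (y - fitted) ** 2
        # Crude per-bin error proxy: mean squared residual in each bin.
        err = np.array([resid2[(x >= a) & (x < b)].mean()
                        for a, b in zip(edges[:-1], edges[1:])])
        weights = err / err.sum()            # shift design mass toward large error
        weights = 0.8 * weights + 0.2 / weights.size   # keep every bin represented
        print(f"stage {stage}: bin weights = {np.round(weights, 3)}")

Running the sketch prints how the bin weights drift away from the uniform allocation over successive stages; the paper's algorithm instead uses asymptotically justified estimates of the integrated mean squared error to drive this reallocation.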

Article information

Source
Bernoulli, Volume 4, Number 1 (1998), 3-14.

Dates
First available in Project Euclid: 6 April 2007

Permanent link to this document
https://projecteuclid.org/euclid.bj/1175865487

Mathematical Reviews number (MathSciNet)
MR1611863

Zentralblatt MATH identifier
0894.62085

Keywords
bandwidth choice; local linear regression; mean squared error; nonlinear regression; optimal design; sequential design

Citation

Cheng, Ming-Yen; Hall, Peter; Titterington, D. Michael. Optimal design for curve estimation by local linear smoothing. Bernoulli 4 (1998), no. 1, 3-14. https://projecteuclid.org/euclid.bj/1175865487

References

  • [1] Chaloner, K. and Verdinelli, I. (1995) Bayesian experimental design: a review. Preprint.
  • [2] Cheng, B. and Titterington, D.M. (1994) Neural networks: a review from a statistical perspective (with discussion). Statist. Sci., 9, 2-54.
  • [3] Cheng, M.-Y., Hall, P. and Titterington, D.M. (1995) Optimal design for curve estimation by local linear smoothing. Research Report No. SRR046 95, Centre for Mathematics and its Applications, Australian National University.
  • [4] Cohn, D.A. (1994) Neural network exploration using optimal experimental design. MIT AI Laboratory Memo No. 1491.
  • [5] Fan, J. (1993) Local linear regression smoothers and their minimax efficiencies. Ann. Statist., 21, 196-216.
  • [6] Fedorov, V.V. (1972) Theory of Optimal Experiments. New York: Academic Press.
  • [7] Ford, I., Kitsos, C.P. and Titterington, D.M. (1989) Recent advances in nonlinear experimental designs. Technometrics, 31, 49-60.
  • [8] Härdle, W. (1990) Applied Nonparametric Regression. Cambridge: Cambridge University Press.
  • [9] Hart, J.D. (1991) Contribution to discussion of Chu and Marron (1991). Statist. Sci., 6, 425-527.
  • [10] Hastie, T. and Loader, C. (1993) Local regression: automatic kernel carpentry. Statist. Sci., 8, 120-143.
  • [11] Kiefer, J. (1959) Optimal experimental designs (with discussion). J. Roy. Statist. Soc. Ser. B, 21, 272-319.
  • [12] MacKay, D.J.C. (1992) Information-based objective functions for active data selection. Neural Comput., 4, 590-604.
  • [13] Marron, J.S. and Wand, M. (1992) Exact mean integrated squared error. Ann. Statist., 20, 712-736.
  • [14] Pukelsheim, F. (1993) Optimal Design of Experiments. New York: Wiley.
  • [15] Ruppert, D. and Wand, M.P. (1994) Multivariate locally weighted least squares regression. Ann. Statist., 22, 1346-1370.
  • [16] Seifert, B. and Gasser, T. (1996) Finite sample analysis of local polynomials: analysis and solutions. J. Amer. Statist. Assoc., 91, 267-275.
  • [17] Silvey, S.D. (1980) Optimal Design. London: Chapman & Hall.
  • [18] Wand, M.P. and Jones, M.C. (1995) Kernel Smoothing. London: Chapman & Hall.