The Annals of Mathematical Statistics

Optimum Designs in Regression Problems, II

J. Kiefer


Abstract

Extending the results of Kiefer and Wolfowitz [10], [11], methods are obtained for characterizing and computing optimum regression designs in various settings, and examples are given where $D$-optimum designs are computed. In Section 1 we introduce the main definitions and notation which will be used in the paper, and discuss briefly the roles of invariance, randomization, number of points at which observations are taken, and nonlinearity of the model, in our results. In Section 2 we prove the main theoretical results. We are concerned with the estimation of $s$ out of the $k$ parameters, extending an approach developed in [10] and [11] in the case $s = k$. There is no direct way of ascertaining whether or not a given design $\xi^{\ast}$ is $D$-optimum for (minimizes the generalized variance of the best linear estimators of) the $s$ chosen parameters, and Theorems 1 and 2 provide algorithms for determining whether or not a given $\xi^\ast$ is $D$-optimum. If all $k$ parameters are estimable under $\xi^\ast$, we can use (2.7) to decide whether $\xi^\ast$ is $D$-optimum, while if not all $k$ parameters are estimable we must use the somewhat more complicated condition (2.17) (of which part (a) or (b) is necessary for optimality, while (a), (c), or (d) is sufficient). An addition to Theorem 2 near the end of Section 3 provides assistance in using (2.17) (b). Theorem 3 of Section 2 characterizes the set of information matrices of the $D$-optimum designs. In Section 3 we give a geometric interpretation of the results of Section 2, and compare the present approach with that of [10]. In the case $s = k$, the present approach reduces to that of Section 5 of [10] and of [11]. When $1 < s < k$, we obtain an algorithm which differs from that of Section 4 of [10] and which appears to be computationally easier to use.
When $s = 1$, the results of the present paper are shown to reduce to those of Section 2 of [10]; in particular, we obtain the game-theoretic results without using the game-theoretic machinery of [10]. In Section 4 we determine $D$-optimum designs for the problems of quadratic regression on a $q$-cube and polynomial regression on a real interval with $1 < s < k$. Part II of the paper is devoted entirely to the determination of $D$-optimum designs for various problems in the setting of simplex designs considered by Scheffé [12]. Various unsolved problems are mentioned throughout the paper. Further examples will be published elsewhere.
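To illustrate the kind of optimality check the abstract describes, the following sketch (not taken from the paper) applies the Kiefer–Wolfowitz equivalence condition for the full-parameter case $s = k$: a design $\xi$ is $D$-optimum iff the standardized variance $d(x, \xi) = f(x)' M(\xi)^{-1} f(x)$ is at most $k$ everywhere on the design region, with equality on the support of $\xi$. The candidate design below (equal mass at $-1, 0, 1$ for quadratic regression on $[-1, 1]$) is a classical example; the function names are illustrative, not from the paper.

```python
# Sketch: equivalence-theorem check of D-optimality for the s = k case.
# Design region: [-1, 1]; model: quadratic regression, f(x) = (1, x, x^2), k = 3.
import numpy as np

def info_matrix(points, weights, f):
    """Information matrix M(xi) = sum_i w_i f(x_i) f(x_i)'."""
    return sum(w * np.outer(f(x), f(x)) for x, w in zip(points, weights))

def variance_function(x, M_inv, f):
    """Standardized variance d(x, xi) = f(x)' M(xi)^{-1} f(x)."""
    v = f(x)
    return v @ M_inv @ v

f = lambda x: np.array([1.0, x, x * x])
support, weights = [-1.0, 0.0, 1.0], [1 / 3, 1 / 3, 1 / 3]  # candidate design

M = info_matrix(support, weights, f)
M_inv = np.linalg.inv(M)

# d(x, xi) should never exceed k = 3, with equality at the support points.
grid = np.linspace(-1, 1, 2001)
d = np.array([variance_function(x, M_inv, f) for x in grid])
print(round(float(d.max()), 6))  # 3.0 -> the design is D-optimum
```

Here $d(x, \xi) = 3 - \tfrac{9}{2}x^2 + \tfrac{9}{2}x^4$, which attains its maximum of $3$ exactly at $-1, 0, 1$, so the equal-weight three-point design passes the check.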

Article information

Source
Ann. Math. Statist., Volume 32, Number 1 (1961), 298-325.

Dates
First available in Project Euclid: 27 April 2007

Permanent link to this document
https://projecteuclid.org/euclid.aoms/1177705160

Digital Object Identifier
doi:10.1214/aoms/1177705160

Mathematical Reviews number (MathSciNet)
MR123408

Zentralblatt MATH identifier
0099.13502

JSTOR
links.jstor.org

Citation

Kiefer, J. Optimum Designs in Regression Problems, II. Ann. Math. Statist. 32 (1961), no. 1, 298--325. doi:10.1214/aoms/1177705160. https://projecteuclid.org/euclid.aoms/1177705160


See also

  • Part I: J. Kiefer, J. Wolfowitz. Optimum Designs in Regression Problems. Ann. Math. Statist., Volume 30, Number 2 (1959), 271--294.