Annals of Statistics
- Ann. Statist.
- Volume 45, Number 4 (2017), 1638-1663.
Robust discrimination designs over Hellinger neighbourhoods
We study the construction of experimental designs to aid in discriminating between two, possibly nonlinear, regression models. Since each of these models may be only approximately specified, we propose robust “maximin” designs. The rough idea is as follows. We impose neighbourhood structures on each regression response to describe the uncertainty in the specification of the true underlying models. We then determine the least favourable members of these neighbourhoods, in terms of Kullback–Leibler divergence. Optimal designs are those maximizing this minimum divergence. Sequential, adaptive approaches to this maximization are studied, and asymptotic optimality is established.
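The maximin idea described in the abstract can be illustrated schematically: for each candidate design, compute the divergence between the two rival models at their least favourable (divergence-minimizing) neighbourhood members, then pick the design maximizing that worst case. Everything below is a hypothetical toy sketch, not the paper's construction: the two mean responses, the finite set of contamination functions standing in for the Hellinger neighbourhoods, and the candidate designs are all placeholder assumptions.

```python
import numpy as np

def kl_gaussian(m1, m2, sigma2=1.0):
    """KL divergence between product-normal laws with means m1, m2
    (same variance sigma2), summed over the design points."""
    return np.sum((m1 - m2) ** 2) / (2.0 * sigma2)

def min_divergence(design_pts, perturbations, f1, f2):
    """Least favourable divergence over a finite set of response
    perturbations (a crude stand-in for the neighbourhoods)."""
    worst = np.inf
    for d1 in perturbations:
        for d2 in perturbations:
            m1 = f1(design_pts) + d1(design_pts)
            m2 = f2(design_pts) + d2(design_pts)
            worst = min(worst, kl_gaussian(m1, m2))
    return worst

# Two rival mean responses (placeholders): linear vs. nonlinear.
f1 = lambda x: 1.0 + 2.0 * x
f2 = lambda x: np.exp(0.8 * x)

# Small contamination functions standing in for neighbourhood members.
perturbations = [lambda x: 0.0 * x,
                 lambda x: 0.1 * np.sin(3 * x),
                 lambda x: -0.1 * np.sin(3 * x)]

# Candidate equispaced designs on [0, 1]; keep the maximin one.
candidates = [np.linspace(0.0, 1.0, n) for n in (3, 5, 9)]
best = max(candidates,
           key=lambda pts: min_divergence(pts, perturbations, f1, f2))
```

In the paper the inner minimization runs over full Hellinger neighbourhoods and the outer maximization over all designs, solved sequentially and adaptively; this sketch only conveys the max–min structure.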
Received: April 2016
Revised: June 2016
First available in Project Euclid: 28 June 2017
Hu, Rui; Wiens, Douglas P. Robust discrimination designs over Hellinger neighbourhoods. Ann. Statist. 45 (2017), no. 4, 1638--1663. doi:10.1214/16-AOS1503. https://projecteuclid.org/euclid.aos/1498636869
- Supplement to “Robust discrimination designs over Hellinger neighbourhoods”: gives the rather lengthy proof of Theorem 2.1, which depends on a number of preliminary lemmas, and shows that the conditions of this theorem apply to normal and log-normal densities.