Abstract
Consider the problem of estimating $\mu$ based on the observation of $Y_0, Y_1, \ldots, Y_n$, where it is assumed only that $Y_0, Y_1, \ldots, Y_\kappa \overset{\text{iid}}{\sim} N(\mu, \sigma^2)$ for some unknown $\kappa$. Unlike the traditional change-point problem, the focus here is not on estimating $\kappa$, which is now a nuisance parameter. When it is known that $\kappa = k$, the sample mean $\bar{Y}_k = \sum_{i=0}^{k} Y_i/(k + 1)$ provides, in addition to wonderful efficiency properties, safety in the sense that it is minimax under squared error loss. Unfortunately, this safety breaks down when $\kappa$ is unknown; indeed, if $k > \kappa$, the risk of $\bar{Y}_k$ is unbounded. To address this problem, a generalized minimax criterion is considered whereby each estimator is evaluated by its maximum risk under $Y_0, Y_1, \ldots, Y_\kappa \overset{\text{iid}}{\sim} N(\mu, \sigma^2)$ for each possible value of $\kappa$. An essentially complete class under this criterion is obtained. Generalizations to other situations, such as variance estimation, are illustrated.
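The unbounded-risk claim is easy to see numerically: if $\bar{Y}_k$ averages past the change-point, its bias grows with whatever happens to the post-change observations. The sketch below is a minimal Monte Carlo illustration, not taken from the paper; the Gaussian mean shift `shift`, the function name `mse_of_sample_mean`, and all parameter values are illustrative assumptions (the abstract places no distributional assumption at all on observations past $\kappa$, so a mean shift is just one way the risk can blow up).

```python
import numpy as np

rng = np.random.default_rng(0)  # fixed seed, for reproducibility only

def mse_of_sample_mean(k, kappa, mu=0.0, sigma=1.0, shift=5.0, reps=20_000):
    """Monte Carlo estimate of the squared-error risk of
    Y_bar_k = sum_{i=0}^k Y_i / (k + 1), when only Y_0, ..., Y_kappa are
    iid N(mu, sigma^2); here post-change observations get mean mu + shift."""
    n = max(k, kappa) + 1
    means = np.full(n, mu)
    means[kappa + 1:] = mu + shift               # observations past the change-point
    Y = rng.normal(loc=means, scale=sigma, size=(reps, n))
    Y_bar_k = Y[:, :k + 1].mean(axis=1)          # sample mean of the first k+1 values
    return np.mean((Y_bar_k - mu) ** 2)

# Safe when k <= kappa: risk is about sigma^2 / (k + 1).
print(mse_of_sample_mean(k=10, kappa=10))              # ~ 1/11 ~ 0.09
# Unsafe when k > kappa: risk inflates with the post-change shift.
print(mse_of_sample_mean(k=20, kappa=10, shift=5.0))   # ~ (10 * 5 / 21)^2 ~ 5.7
print(mse_of_sample_mean(k=20, kappa=10, shift=50.0))  # ~ 567; grows without bound
```

Under these assumptions, with $k \le \kappa$ the estimated risk stays near $\sigma^2/(k+1)$, while with $k > \kappa$ it grows like the squared bias $\big((k - \kappa)\,\text{shift}/(k+1)\big)^2$, unbounded in the shift.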
Citation
Dean P. Foster and Edward I. George. "Estimation up to a Change-Point." Ann. Statist. 21(2): 625–644, June 1993. https://doi.org/10.1214/aos/1176349141