Abstract
The performance of a sequence of estimators $\{T_n\}$ of $\theta$ can be measured by the probability concentration of the estimator in an $\varepsilon_n$-neighborhood of $\theta$. Classical choices of $\varepsilon_n$ are $\varepsilon_n = cn^{-1/2}$ (the contiguous case) and $\varepsilon_n = \varepsilon$ fixed for all $n$ (the non-local case). In this article all sequences $\{\varepsilon_n\}$ with $\lim_{n\rightarrow\infty} \varepsilon_n = 0$ and $\lim_{n\rightarrow\infty} \varepsilon_n n^{1/2} = \infty$ are considered. In this way the statistically important choices of small $\varepsilon$'s are investigated in a uniform sense; classical results on local or non-local efficiency gain importance and usefulness by being extended to larger regions of neighborhoods; and one can investigate where optimality passes into non-optimality, for instance when an estimator is locally efficient but non-locally inefficient. The theory of moderate deviation and Cramér-type large deviation probabilities plays an important role in this context. Examples of the performance of, in particular, maximum likelihood estimators are presented for $k$-parameter exponential families, a curved exponential family and the double-exponential family.
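As a purely illustrative sketch, not taken from the paper itself, consider the simplest case of $n$ i.i.d. $N(\theta, 1)$ observations, for which the maximum likelihood estimator is the sample mean $\bar{X}_n$ and $n^{1/2}(\bar{X}_n - \theta)$ is exactly standard normal. The concentration probability is then
$$P_\theta\left(|\bar{X}_n - \theta| \geq \varepsilon_n\right) = 2\left(1 - \Phi(\varepsilon_n n^{1/2})\right),$$
and if $\varepsilon_n \rightarrow 0$ while $\varepsilon_n n^{1/2} \rightarrow \infty$, the standard Gaussian tail approximation $1 - \Phi(x) \sim \varphi(x)/x$ gives
$$\log P_\theta\left(|\bar{X}_n - \theta| \geq \varepsilon_n\right) \sim -\tfrac{1}{2}\, n\, \varepsilon_n^2, \qquad n \rightarrow \infty.$$
For $\varepsilon_n = cn^{-1/2}$ this exponent remains bounded (the contiguous case), for fixed $\varepsilon_n = \varepsilon$ it is of order $n$ (the non-local, Cramér-type large deviation case), and the moderate deviation neighborhoods considered in the article interpolate between the two.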
Citation
Wilbert C. M. Kallenberg. "On Moderate Deviation Theory in Estimation." Ann. Statist. 11(2): 498–504, June 1983. https://doi.org/10.1214/aos/1176346156