Abstract
In this paper we investigate the rates at which error probabilities converge to zero in the problem of estimating a finite-valued parameter. It is shown that if a consistent estimator attains Bahadur's bound for the probability of incorrect decision at some parametric point, then the error probability does not tend to zero exponentially fast at some other value of the parameter. We evaluate the minimal possible rate of convergence of this probability at a fixed parametric point for all asymptotically minimax procedures, and establish a necessary and sufficient condition for any of these procedures to have constant risk. A simple example is constructed to demonstrate the asymptotic inadmissibility of the usual maximum likelihood estimator.
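For orientation only (this display is not taken from the paper, and the paper's exact hypotheses and statement may differ), one commonly cited form of Bahadur's bound reads as follows: for i.i.d. observations with distribution P_theta, theta ranging over a finite set, and a consistent estimator sequence delta_n, the error exponent at theta cannot exceed the smallest Kullback-Leibler divergence separating theta from the other parameter values.

% Sketch of Bahadur's bound in a commonly cited form; assumptions (i.i.d.
% sampling, finite parameter set, consistency of \delta_n) are mine, not
% quoted from the paper.
\[
  \liminf_{n \to \infty} \, -\frac{1}{n}\,\log P_{\theta}\bigl(\delta_n \neq \theta\bigr)
  \;\le\; \min_{\theta' \neq \theta} K(\theta', \theta),
  \qquad
  K(\theta', \theta) \;=\; \mathbb{E}_{\theta'}\!\left[\log \frac{dP_{\theta'}}{dP_{\theta}}\right].
\]

Read this way, the abstract's first result says that attaining this exponent at one parametric point forces a sub-exponential rate at some other point.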
Citation
Andrew L. Rukhin. "Convergence Rates of Estimators of a Finite Parameter: How Small Can Error Probabilities Be?" Ann. Statist. 11(1): 202–207, March 1983. https://doi.org/10.1214/aos/1176346070