A generalized divergence for statistical inference
The power divergence (PD) and the density power divergence (DPD) families have proven to be useful tools in the area of robust inference. In this paper, we consider a superfamily of divergences that contains both of these families as special cases. The role of this superfamily is studied in several statistical applications, and desirable properties are identified and discussed. In many cases, it is observed that the most preferred minimum divergence estimator within the above collection lies outside the class of minimum PD or minimum DPD estimators, indicating that this superfamily has real utility, rather than being a routine generalization. The limitation of the usual first-order influence function as an effective descriptor of the robustness of the estimator is also demonstrated in this connection.
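As a concrete illustration of the minimum divergence estimation discussed above, the following is a minimal sketch of minimum density power divergence (DPD) estimation for a univariate normal model. The DPD corresponds to one boundary family of the paper's superfamily; the function names, the tuning value alpha = 0.5, and the contaminated sample are illustrative assumptions, not taken from the paper.

```python
# Sketch: minimum density power divergence (DPD) estimation for a
# univariate normal model. All names and settings here are illustrative.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def dpd_objective(params, data, alpha):
    """Empirical DPD objective (up to a term not involving theta):
    integral of f_theta^(1+alpha)  -  (1 + 1/alpha) * mean(f_theta(X_i)^alpha)."""
    mu, log_sigma = params
    sigma = np.exp(log_sigma)  # optimize log(sigma) to keep sigma positive
    # For N(mu, sigma^2) the integral of f^(1+alpha) has a closed form:
    # (2*pi*sigma^2)^(-alpha/2) / sqrt(1 + alpha).
    integral_term = (2 * np.pi * sigma**2) ** (-alpha / 2) / np.sqrt(1 + alpha)
    mean_term = np.mean(norm.pdf(data, mu, sigma) ** alpha)
    return integral_term - (1 + 1 / alpha) * mean_term

def minimum_dpd_estimate(data, alpha=0.5):
    # Robust starting values: median and (log of) the sample std deviation.
    start = np.array([np.median(data), np.log(np.std(data))])
    res = minimize(dpd_objective, start, args=(data, alpha), method="Nelder-Mead")
    mu, log_sigma = res.x
    return mu, np.exp(log_sigma)

rng = np.random.default_rng(0)
# 95% N(0, 1) observations plus 5% gross outliers at 10.
data = np.concatenate([rng.normal(0.0, 1.0, 190), np.full(10, 10.0)])
mu_hat, sigma_hat = minimum_dpd_estimate(data, alpha=0.5)
print(mu_hat, sigma_hat)  # a robust fit should stay near (0, 1) despite the outliers
```

Larger alpha trades efficiency for robustness; alpha near 0 recovers (approximately) maximum likelihood, which the outliers would pull badly off target here.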
Bernoulli, Volume 23, Number 4A (2017), 2746-2783.
Received: December 2014
Revised: February 2016
First available in Project Euclid: 9 May 2017
Ghosh, Abhik; Harris, Ian R.; Maji, Avijit; Basu, Ayanendranath; Pardo, Leandro. A generalized divergence for statistical inference. Bernoulli 23 (2017), no. 4A, 2746--2783. doi:10.3150/16-BEJ826. https://projecteuclid.org/euclid.bj/1494316831
- Supplement to “A generalized divergence for statistical inference”. The supplement contains all the assumptions required for the asymptotic derivations and the proofs of all the technical results presented in the paper. It also contains some remarks on the computation of the MSDE and a simulation study illustrating the performance of the MSHDE under the bivariate normal model.