Brazilian Journal of Probability and Statistics

Estimating the Rényi entropy of several exponential populations

Suchandan Kayal, Somesh Kumar, and P. Vellaisamy

Full-text: Open access


Suppose independent random samples are drawn from $k$ shifted exponential populations with a common location parameter but unequal scale parameters. The problem of estimating the Rényi entropy is considered. The uniformly minimum variance unbiased estimator (UMVUE) is derived. Sufficient conditions for improving upon affine and scale equivariant estimators are obtained; as a consequence, estimators improving upon the UMVUE and the maximum likelihood estimator (MLE) are obtained. Further, for the case $k=1$, an estimator that dominates the best affine equivariant estimator is derived. The case of a constrained location parameter is also investigated in detail.
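As a quick illustration (not part of the article), the Rényi entropy of order $\alpha$ for a shifted exponential distribution with scale $\sigma$ has the closed form $H_\alpha = \log\sigma - \log\alpha/(1-\alpha)$, independent of the location. The sketch below, under these assumptions, computes this closed form and a naive MLE plug-in estimate (location estimated by the sample minimum, scale by the mean excess); the UMVUE and the improved estimators studied in the paper are more involved.

```python
import math
import random

def renyi_entropy_exponential(sigma, alpha):
    """Closed-form Rényi entropy of order alpha (alpha > 0, alpha != 1)
    for a shifted exponential with scale sigma; the location shift
    does not affect the entropy."""
    return math.log(sigma) - math.log(alpha) / (1.0 - alpha)

def mle_plugin_estimate(sample, alpha):
    """Naive plug-in estimate: substitute the MLEs of the shifted
    exponential parameters into the closed-form entropy."""
    mu_hat = min(sample)                                   # MLE of location
    sigma_hat = sum(x - mu_hat for x in sample) / len(sample)  # MLE of scale
    return renyi_entropy_exponential(sigma_hat, alpha)

# Monte Carlo check with illustrative (hypothetical) parameter values.
random.seed(0)
mu, sigma, alpha, n = 2.0, 3.0, 0.5, 500
sample = [mu + random.expovariate(1.0 / sigma) for _ in range(n)]
true_h = renyi_entropy_exponential(sigma, alpha)
est_h = mle_plugin_estimate(sample, alpha)
print(f"true H_alpha = {true_h:.4f}, plug-in estimate = {est_h:.4f}")
```

With a moderate sample size the plug-in estimate lands close to the true value, but it inherits the bias of the MLE of the scale, which is one motivation for the unbiased and improved estimators developed in the paper.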

Article information

Braz. J. Probab. Stat., Volume 29, Number 1 (2015), 94-111.

First available in Project Euclid: 30 October 2014


Keywords: Entropy estimation; equivariance; improved estimators; MLE; shifted exponential; UMVUE


Kayal, Suchandan; Kumar, Somesh; Vellaisamy, P. Estimating the Rényi entropy of several exponential populations. Braz. J. Probab. Stat. 29 (2015), no. 1, 94--111. doi:10.1214/13-BJPS230.


