The Annals of Statistics

Information theory and superefficiency

Andrew Barron and Nicolas Hengartner


Abstract

The asymptotic risk of efficient estimators with Kullback–Leibler loss in smoothly parametrized statistical models is $k/(2n)$, where $k$ is the parameter dimension and $n$ is the sample size. Under fairly general conditions, we give a simple information-theoretic proof that the set of parameter values where any arbitrary estimator is superefficient is negligible. The proof is based on a result of Rissanen that codes have asymptotic redundancy not smaller than $(k/2)\log n$, except in a set of parameters of measure 0.
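Rissanen's redundancy bound can be seen numerically in the simplest case $k = 1$. The sketch below (an illustration, not code from the paper) codes a Bernoulli sequence with a uniform-prior (Laplace) mixture code and compares its codelength with the ideal codelength at the maximum-likelihood parameter; the gap is the pointwise redundancy, which matches the $(1/2)\log_2 n$ rate.

```python
import math

def mixture_code_length_bits(k, n):
    # Codelength of the uniform-prior (Laplace) mixture code for a binary
    # sequence with k ones out of n: P(x^n) = 1 / ((n + 1) * C(n, k)),
    # so the codelength is log2((n + 1) * C(n, k)) bits.
    log2_binom = (math.lgamma(n + 1) - math.lgamma(k + 1)
                  - math.lgamma(n - k + 1)) / math.log(2)
    return math.log2(n + 1) + log2_binom

def ml_code_length_bits(k, n):
    # Ideal codelength -log2 P_theta(x^n) at the maximum-likelihood
    # parameter theta_hat = k / n.
    p = k / n
    if p in (0.0, 1.0):
        return 0.0
    return -(k * math.log2(p) + (n - k) * math.log2(1 - p))

n = 1000
k = 300  # a typical sequence under theta = 0.3

# Redundancy = mixture codelength minus the ideal (ML) codelength.
redundancy = mixture_code_length_bits(k, n) - ml_code_length_bits(k, n)

# For parameter dimension 1, the bound says redundancy is asymptotically
# at least (1/2) log2 n bits, off at most by a constant.
print(redundancy, 0.5 * math.log2(n))
```

Here the redundancy comes out within a fraction of a bit of $(1/2)\log_2 n \approx 4.98$, as Stirling's approximation predicts.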

Article information

Source
Ann. Statist., Volume 26, Number 5 (1998), 1800-1825.

Dates
First available in Project Euclid: 21 June 2002

Permanent link to this document
https://projecteuclid.org/euclid.aos/1024691358

Digital Object Identifier
doi:10.1214/aos/1024691358

Mathematical Reviews number (MathSciNet)
MR1673279

Zentralblatt MATH identifier
0932.62005

Subjects
Primary: 62F12 94A65
Secondary: 94A29 62G20

Keywords
Superefficiency; information theory; data compression; Kullback–Leibler loss

Citation

Barron, Andrew; Hengartner, Nicolas. Information theory and superefficiency. Ann. Statist. 26 (1998), no. 5, 1800--1825. doi:10.1214/aos/1024691358. https://projecteuclid.org/euclid.aos/1024691358



References

  • ALI, S. and SILVEY, S. 1966. A general class of coefficients of divergence of one distribution from another. J. Roy. Statist. Soc. Ser. B 28 131–142.
  • BAHADUR, R. R. 1964. On Fisher's bound for asymptotic variances. Ann. Math. Statist. 35 1545–1552.
  • BAHADUR, R. R. 1967. Rates of convergence of estimates and test statistics. Ann. Math. Statist. 38 303–324.
  • BAHADUR, R. R. 1971. Some Limit Theorems in Statistics. SIAM, Philadelphia.
  • BARRON, A. 1987. Are Bayes rules consistent in information? In Open Problems in Communication and Computation (T. Cover and B. Gopinath, eds.) 85–91. Springer, New York.
  • BARRON, A. and COVER, T. 1991. Minimum complexity density estimation. IEEE Trans. Inform. Theory 37 1034–1054.
  • BIRGÉ, L. 1986. On estimating a density using Hellinger distance and some other strange facts. Probab. Theory Related Fields 71 271–291.
  • BROWN, L. D. 1993. An information inequality for the Bayes risk under truncated squared error loss. In Multivariate Analysis: Future Directions (C. R. Rao, ed.). North-Holland, Amsterdam.
  • BROWN, L. D., LOW, M. G. and ZHAO, L. H. 1997. Superefficiency in nonparametric function estimation. Ann. Statist. 25 2607–2625.
  • CENCOV, N. N. 1982. Statistical Decision Rules and Optimal Inference. Amer. Math. Soc. Transl. Ser. 2 53.
  • CLARKE, B. and BARRON, A. 1990. Information-theoretic asymptotics of Bayes methods. IEEE Trans. Inform. Theory 36 453–471.
  • COVER, T. and THOMAS, J. 1991. Elements of Information Theory. Wiley, New York.
  • CSISZÁR, I. 1967. Information-type measures of difference of probability distributions and indirect observations. Studia Sci. Math. Hungar. 2 299–318.
  • HALMOS, P. R. 1988. Measure Theory, 4th ed. Springer, New York.
  • HARTIGAN, J. A. 1983. Bayes Theory. Springer, New York.
  • HARTIGAN, J. A. 1998. The maximum likelihood prior. Ann. Statist. 26 2083–2103.
  • JEFFREYS, H. 1946. An invariant form for the prior probability in estimation problems. Proc. Roy. Soc. London Ser. A 186 453–461.
  • KOMAKI, F. 1994. On asymptotic properties of predictive distributions. Technical Report, Dept. of Mathematical Engineering and Information Physics, Faculty of Engineering, Univ. Tokyo.
  • KULLBACK, S. and LEIBLER, R. 1951. On information and sufficiency. Ann. Math. Statist. 22 79–86.
  • IBRAGIMOV, I. A. and HASMINSKII, R. Z. 1982. Statistical Estimation: Asymptotic Theory. Applications of Mathematics. Springer, New York.
  • LE CAM, L. 1953. On some asymptotic properties of maximum likelihood estimates and related Bayes estimates. Univ. California Publ. Statist. 277–330.
  • LE CAM, L. 1986. Asymptotic Methods in Statistical Decision Theory. Springer, New York.
  • LORENTZ, G. G. 1966. Metric entropy and approximation. Bull. Amer. Math. Soc. 72 903–937.
  • MERHAV, N. and FEDER, M. 1995. A strong version of the redundancy-capacity theorem of universal coding. IEEE Trans. Inform. Theory 41 714–722.
  • RISSANEN, J. 1984. Universal coding, information, prediction, and estimation. IEEE Trans. Inform. Theory 30 629–636.
  • RISSANEN, J. 1986. Stochastic complexity and modeling. Ann. Statist. 14 1080–1100.
  • VOVK, V. 1991. Asymptotic efficiency of estimators: algorithmic approach. Theory Probab. Appl. 36 329–343.
  • YANG, Y. and BARRON, A. 1995. Information-theoretic determination of minimax rates of convergence. Unpublished manuscript.
New Haven, Connecticut 06520-8290
E-mail: barron@stat.yale.edu; nicolas.hengartner@yale.edu