Open Access
October 1998
Information theory and superefficiency
Andrew Barron, Nicolas Hengartner
Ann. Statist. 26(5): 1800-1825 (October 1998). DOI: 10.1214/aos/1024691358

Abstract

The asymptotic risk of efficient estimators with Kullback–Leibler loss in smoothly parametrized statistical models is $k/(2n)$, where $k$ is the parameter dimension and $n$ is the sample size. Under fairly general conditions, we give a simple information-theoretic proof that the set of parameter values at which any arbitrary estimator is superefficient is negligible. The proof is based on a result of Rissanen that codes have asymptotic redundancy not smaller than $(k/2)\log n$, except on a set of measure zero.
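For readers unfamiliar with the link between coding and estimation, the bridge is the chain rule for relative entropy. In the notation below (ours, for illustration; the paper's own notation may differ), suppose the data $X_1, \dots, X_n$ are i.i.d. with density $p_\theta$ and $q$ is any joint density used to code the sample, with conditionals $q(\cdot \mid X^{i-1})$ serving as predictive densities. Then the code's redundancy decomposes as

$$D\bigl(p_\theta^n \,\big\|\, q\bigr) \;=\; \sum_{i=1}^{n} \mathbb{E}_\theta\, D\bigl(p_\theta \,\big\|\, q(\cdot \mid X^{i-1})\bigr),$$

so the redundancy equals the cumulative Kullback–Leibler estimation risk of the predictive densities. Rissanen's bound makes the left side at least roughly $(k/2)\log n$ for all but a measure-zero set of $\theta$; an estimator whose per-sample risk fell asymptotically below $k/(2n)$ would make the right side grow slower than $(k/2)\log n$, which can therefore happen only on a negligible set of parameter values.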

Citation


Andrew Barron, Nicolas Hengartner. "Information theory and superefficiency." Ann. Statist. 26(5): 1800–1825, October 1998. https://doi.org/10.1214/aos/1024691358

Information

Published: October 1998
First available in Project Euclid: 21 June 2002

zbMATH: 0932.62005
MathSciNet: MR1673279
Digital Object Identifier: 10.1214/aos/1024691358

Subjects:
Primary: 62F12, 94A65
Secondary: 62G20, 94A29

Keywords: data compression, information theory, Kullback–Leibler loss, superefficiency

Rights: Copyright © 1998 Institute of Mathematical Statistics

Vol. 26 • No. 5 • October 1998