Electronic Journal of Statistics

Uniform-in-bandwidth consistency for kernel-type estimators of Shannon’s entropy

Salim Bouzebda and Issam Elhattab

Full-text: Open access

Abstract

We establish uniform-in-bandwidth consistency for kernel-type estimators of Shannon's differential entropy. Two such kernel-type estimators are considered. As a consequence, an asymptotic 100% confidence interval for the entropy is obtained.
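For intuition, the following is a minimal sketch of a kernel plug-in (resubstitution-type) entropy estimator: the density is first estimated by a Gaussian-kernel density estimate with bandwidth h, and the entropy is then estimated by the empirical mean of −log f̂ over the sample. This is an illustrative sketch only, not necessarily the exact pair of estimators analyzed in the paper; the function names `kde` and `entropy_resubstitution` are ours.

```python
import math
import random

def kde(x, sample, h):
    """Gaussian-kernel density estimate f_hat(x) with bandwidth h."""
    n = len(sample)
    c = 1.0 / (n * h * math.sqrt(2.0 * math.pi))
    return c * sum(math.exp(-0.5 * ((x - xi) / h) ** 2) for xi in sample)

def entropy_resubstitution(sample, h):
    """Resubstitution entropy estimate: H_n = -(1/n) * sum_i log f_hat(X_i)."""
    n = len(sample)
    return -sum(math.log(kde(xi, sample, h)) for xi in sample) / n

# Standard normal sample; the true differential entropy of N(0, 1)
# is (1/2) * log(2 * pi * e), roughly 1.4189.
random.seed(12345)
xs = [random.gauss(0.0, 1.0) for _ in range(500)]
h_est = entropy_resubstitution(xs, h=0.4)
```

Uniform-in-bandwidth results of the kind established in the paper control such estimates simultaneously over a whole range of bandwidths h ∈ [a_n, b_n], rather than along a single deterministic bandwidth sequence, which is what justifies data-driven bandwidth choices in practice.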

Article information

Source
Electron. J. Statist., Volume 5 (2011), 440-459.

Dates
First available in Project Euclid: 10 May 2011

Permanent link to this document
https://projecteuclid.org/euclid.ejs/1305034910

Digital Object Identifier
doi:10.1214/11-EJS614

Mathematical Reviews number (MathSciNet)
MR2802051

Zentralblatt MATH identifier
1274.62186

Subjects
Primary:
62F12: Asymptotic properties of estimators
62F03: Hypothesis testing
62G30: Order statistics; empirical distribution functions
60F17: Functional limit theorems; invariance principles
62E20: Asymptotic distribution theory

Keywords
Entropy; kernel estimation; uniform in bandwidth consistency

Citation

Bouzebda, Salim; Elhattab, Issam. Uniform-in-bandwidth consistency for kernel-type estimators of Shannon’s entropy. Electron. J. Statist. 5 (2011), 440--459. doi:10.1214/11-EJS614. https://projecteuclid.org/euclid.ejs/1305034910

References

  • [1] Ahmad, I. A. and Lin, P. E. (1976). A nonparametric estimation of the entropy for absolutely continuous distributions. IEEE Trans. Information Theory, IT-22(3), 372–375.
  • [2] Akaike, H. (1954). An approximation to the density function. Ann. Inst. Statist. Math., Tokyo, 6, 127–132.
  • [3] Ash, R. (1965). Information theory. Interscience Tracts in Pure and Applied Mathematics, No. 19. Interscience Publishers John Wiley & Sons, New York-London-Sydney.
  • [4] Beirlant, J., Dudewicz, E. J., Györfi, L., and van der Meulen, E. C. (1997). Nonparametric entropy estimation: an overview. Int. J. Math. Stat. Sci., 6(1), 17–39.
  • [5] Berger, T. (1971). Rate distortion theory: a mathematical basis for data compression. Prentice-Hall Series in Information and System Sciences. Prentice-Hall Inc., Englewood Cliffs, N.J.
  • [6] Bosq, D. and Lecoutre, J.-P. (1987). Théorie de l'estimation fonctionnelle. Économie et Statistiques Avancées. Economica, Paris.
  • [7] Bouzebda, S. and Elhattab, I. (2009). A strong consistency of a nonparametric estimate of entropy under random censorship. C. R. Math. Acad. Sci. Paris, 347(13-14), 821–826.
  • [8] Bouzebda, S. and Elhattab, I. (2010). Uniform in bandwidth consistency of the kernel-type estimator of the Shannon's entropy. C. R. Math. Acad. Sci. Paris, 348(5-6), 317–321.
  • [9] Clarkson, J. A. and Adams, C. R. (1933). On definitions of bounded variation for functions of two variables. Trans. Amer. Math. Soc., 35(4), 824–854.
  • [10] Cover, T. M. and Thomas, J. A. (2006). Elements of information theory. Second edition. Wiley-Interscience [John Wiley & Sons], Hoboken, NJ.
  • [11] Csiszár, I. (1962). Informationstheoretische Konvergenzbegriffe im Raum der Wahrscheinlichkeitsverteilungen. Magyar Tud. Akad. Mat. Kutató Int. Közl., 7, 137–158.
  • [12] Deheuvels, P. (2000). Uniform limit laws for kernel density estimators on possibly unbounded intervals. In Recent advances in reliability theory (Bordeaux, 2000), Stat. Ind. Technol., pages 477–492. Birkhäuser Boston, Boston, MA.
  • [13] Deheuvels, P. and Mason, D. M. (2004). General asymptotic confidence bands based on kernel-type function estimators. Stat. Inference Stoch. Process., 7(3), 225–277.
  • [14] Devroye, L. (1987). A course in density estimation. Birkhäuser, Boston-Basel-Stuttgart.
  • [15] Devroye, L. and Györfi, L. (1985). Nonparametric density estimation: the L1 view. Wiley Series in Probability and Mathematical Statistics. John Wiley & Sons Inc., New York.
  • [16] Devroye, L. and Lugosi, G. (2001). Combinatorial methods in density estimation. Springer Series in Statistics. Springer-Verlag, New York.
  • [17] Devroye, L. and Wise, G. L. (1980). Detection of abnormal behavior via nonparametric estimation of the support. SIAM J. Appl. Math., 38(3), 480–488.
  • [18] Dmitriev, J. G. and Tarasenko, F. P. (1973). The estimation of functionals of a probability density and its derivatives. Teor. Verojatnost. i Primenen., 18, 662–668.
  • [19] Dudewicz, E. J. and van der Meulen, E. C. (1981). Entropy-based tests of uniformity. J. Amer. Statist. Assoc., 76(376), 967–974.
  • [20] Ebrahimi, N., Habibullah, M., and Soofi, E. (1992). Testing exponentiality based on Kullback-Leibler information. J. Roy. Statist. Soc. Ser. B, 54(3), 739–748.
  • [21] Eggermont, P. P. B. and LaRiccia, V. N. (1999). Best asymptotic normality of the kernel density entropy estimator for smooth densities. IEEE Trans. Inform. Theory, 45(4), 1321–1326.
  • [22] Einmahl, U. and Mason, D. M. (2000). An empirical process approach to the uniform consistency of kernel-type function estimators. J. Theoret. Probab., 13(1), 1–37.
  • [23] Einmahl, U. and Mason, D. M. (2005). Uniform in bandwidth consistency of kernel-type function estimators. Ann. Statist., 33(3), 1380–1403.
  • [24] Esteban, M. D., Castellanos, M. E., Morales, D., and Vajda, I. (2001). Monte Carlo comparison of four normality tests using different entropy estimates. Comm. Statist. Simulation Comput., 30(4), 761–785.
  • [25] Gallager, R. (1968). Information theory and reliable communication. John Wiley & Sons Inc., New York-London-Sydney-Toronto.
  • [26] Giné, E. and Mason, D. M. (2008). Uniform in bandwidth estimation of integral functionals of the density function. Scand. J. Statist., 35(4), 739–761.
  • [27] Gokhale, D. (1983). On entropy-based goodness-of-fit tests. Comput. Stat. Data Anal., 1, 157–165.
  • [28] Gokhale, D. V. and Kullback, S. (1978). The information in contingency tables. Volume 23 of Statistics: Textbooks and Monographs. Marcel Dekker Inc., New York.
  • [29] Györfi, L. and van der Meulen, E. C. (1990). An entropy estimate based on a kernel density estimation. In Limit theorems in probability and statistics (Pécs, 1989), volume 57 of Colloq. Math. Soc. János Bolyai, pages 229–240. North-Holland, Amsterdam.
  • [30] Györfi, L. and van der Meulen, E. C. (1991). On the nonparametric estimation of the entropy functional. In Nonparametric functional estimation and related topics (Spetses, 1990), volume 335 of NATO Adv. Sci. Inst. Ser. C Math. Phys. Sci., pages 81–95. Kluwer Acad. Publ., Dordrecht.
  • [31] Hobson, E. W. (1958). The theory of functions of a real variable and the theory of Fourier's series. Vol. I. Dover Publications Inc., New York, N.Y.
  • [32] Jaynes, E. T. (1957). Information theory and statistical mechanics. Phys. Rev. (2), 106, 620–630.
  • [33] Kullback, S. (1959). Information theory and statistics. John Wiley and Sons, Inc., New York.
  • [34] Lazo, A. C. and Rathie, P. N. (1978). On the entropy of continuous probability distributions. IEEE Trans. Inf. Theory, 24, 120–122.
  • [35] Louani, D. (2005). Uniform L1-distance large deviations in nonparametric density estimation. Test, 14(1), 75–98.
  • [36] Parzen, E. (1962). On estimation of a probability density function and mode. Ann. Math. Statist., 33, 1065–1076.
  • [37] Prakasa Rao, B. L. S. (1983). Nonparametric functional estimation. Probability and Mathematical Statistics. Academic Press Inc. [Harcourt Brace Jovanovich Publishers], New York.
  • [38] Prescott, P. (1976). On a test for normality based on sample entropy. J. R. Stat. Soc., Ser. B, 38, 254–256.
  • [39] Rényi, A. (1959). On the dimension and entropy of probability distributions. Acta Math. Acad. Sci. Hungar., 10, 193–215.
  • [40] Rosenblatt, M. (1956). Remarks on some nonparametric estimates of a density function. Ann. Math. Statist., 27, 832–837.
  • [41] Scott, D. W. (1992). Multivariate density estimation: theory, practice, and visualization. Wiley Series in Probability and Mathematical Statistics: Applied Probability and Statistics. John Wiley & Sons Inc., New York.
  • [42] Shannon, C. E. (1948). A mathematical theory of communication. Bell System Tech. J., 27, 379–423, 623–656.
  • [43] Song, K.-S. (2000). Limit theorems for nonparametric sample entropy estimators. Statist. Probab. Lett., 49(1), 9–18.
  • [44] Talagrand, M. (1994). Sharper bounds for Gaussian and empirical processes. Ann. Probab., 22(1), 28–76.
  • [45] Vasicek, O. (1976). A test for normality based on sample entropy. J. Roy. Statist. Soc. Ser. B, 38(1), 54–59.
  • [46] Viallon, V. (2006). Processus empiriques, estimation non paramétrique et données censurées. Ph.D. thesis, Université Paris 6.
  • [47] Vituškin, A. G. (1955). O mnogomernyh variaciyah [On multidimensional variations]. Gosudarstv. Izdat. Tehn.-Teor. Lit., Moscow.