Electronic Journal of Probability

Compound Poisson Approximation via Information Functionals

A. D. Barbour, Oliver Johnson, Ioannis Kontoyiannis, and Mokshay Madiman

Full-text: Open access

Abstract

An information-theoretic development is given for the problem of compound Poisson approximation, which parallels earlier treatments for Gaussian and Poisson approximation. Nonasymptotic bounds are derived for the distance between the distribution of a sum of independent integer-valued random variables and an appropriately chosen compound Poisson law. In the case where all summands have the same conditional distribution given that they are non-zero, a bound on the relative entropy distance between their sum and the compound Poisson distribution is derived, based on the data-processing property of relative entropy and earlier Poisson approximation results. When the summands have arbitrary distributions, corresponding bounds are derived in terms of the total variation distance. The main technical ingredient is the introduction of two "information functionals", and the analysis of their properties. These information functionals play a role analogous to that of the classical Fisher information in normal approximation. Detailed comparisons are made between the resulting inequalities and related bounds.
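As a small numerical illustration of the setting described above (not taken from the paper; the parameters n, p, and the claim-size distribution q below are arbitrary choices), the following Python sketch computes the exact total variation distance between the law of a sum of i.i.d. integer-valued summands and the compound Poisson law with matching rate and claim distribution:

```python
import numpy as np
from math import exp, factorial

# Summands: P(X=0) = 1-p; given X != 0, X has distribution q on {1, 2}.
# (Illustrative parameters, not from the paper.)
p = 0.05
q = np.array([0.7, 0.3])                   # conditional distribution of X given X != 0
n = 50

x_pmf = np.concatenate(([1 - p], p * q))   # pmf of a single summand on {0, 1, 2}

# Exact pmf of S_n = X_1 + ... + X_n by repeated convolution.
s_pmf = np.array([1.0])
for _ in range(n):
    s_pmf = np.convolve(s_pmf, x_pmf)

# Compound Poisson CP(lam, q): a Poisson(lam) number of i.i.d. draws from q.
# Its pmf is the Poisson mixture of the convolution powers q^{*k}.
lam = n * p
cp_pmf = np.zeros(len(s_pmf))
q_shifted = np.concatenate(([0.0], q))     # q as a pmf on {0, 1, 2}, no mass at 0
qk = np.array([1.0])                       # q^{*0} = point mass at 0
for k in range(60):                        # truncate the Poisson mixture
    w = exp(-lam) * lam**k / factorial(k)
    m = min(len(qk), len(cp_pmf))
    cp_pmf[:m] += w * qk[:m]
    qk = np.convolve(qk, q_shifted)

# Total variation distance between the two laws (on the truncated support).
tv = 0.5 * np.abs(s_pmf - cp_pmf).sum()
print(f"TV distance: {tv:.5f}")
```

For these parameters the computed distance comes out small, consistent with classical Le Cam-type bounds of order n p² = 0.125 for this kind of approximation.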

Article information

Source
Electron. J. Probab., Volume 15 (2010), paper no. 42, 1344-1369.

Dates
Accepted: 31 August 2010
First available in Project Euclid: 1 June 2016

Permanent link to this document
https://projecteuclid.org/euclid.ejp/1464819827

Digital Object Identifier
doi:10.1214/EJP.v15-799

Mathematical Reviews number (MathSciNet)
MR2721049

Zentralblatt MATH identifier
1225.60037

Subjects
Primary: 60E15: Inequalities; stochastic orderings
Secondary: 60E07: Infinitely divisible distributions; stable distributions. 60F05: Central limit and other weak theorems. 94A17: Measures of information, entropy.

Keywords
Compound Poisson approximation; Fisher information; information theory; relative entropy; Stein's method

Rights
This work is licensed under a Creative Commons Attribution 3.0 License.

Citation

Barbour, A. D.; Johnson, Oliver; Kontoyiannis, Ioannis; Madiman, Mokshay. Compound Poisson Approximation via Information Functionals. Electron. J. Probab. 15 (2010), paper no. 42, 1344--1369. doi:10.1214/EJP.v15-799. https://projecteuclid.org/euclid.ejp/1464819827



References

  • Aldous, David. Probability approximations via the Poisson clumping heuristic. Applied Mathematical Sciences, 77. Springer-Verlag, New York, 1989. xvi+269 pp. ISBN: 0-387-96899-7
  • Artstein, Shiri; Ball, Keith M.; Barthe, Franck; Naor, Assaf. On the rate of convergence in the entropic central limit theorem. Probab. Theory Related Fields 129 (2004), no. 3, 381–390.
  • Artstein, Shiri; Ball, Keith M.; Barthe, Franck; Naor, Assaf. Solution of Shannon's problem on the monotonicity of entropy. J. Amer. Math. Soc. 17 (2004), no. 4, 975–982 (electronic).
  • Barbour, A. D.; Chen, Louis H. Y. Stein's method and applications. Proceedings of the workshop held in Singapore, July 28–August 31, 2003. Edited by A. D. Barbour and Louis H. Y. Chen. Lecture Notes Series. Institute for Mathematical Sciences. National University of Singapore, 5. Published jointly by Singapore University Press, Singapore; and World Scientific Publishing Co. Pte. Ltd., Hackensack, NJ, 2005. xx+297 pp. ISBN: 981-256-281-8
  • Barbour, A. D.; Chen, Louis H. Y.; Loh, Wei-Liem. Compound Poisson approximation for nonnegative random variables via Stein's method. Ann. Probab. 20 (1992), no. 4, 1843–1866.
  • Barbour, A. D.; Chryssaphinou, O. Compound Poisson approximation: a user's guide. Ann. Appl. Probab. 11 (2001), no. 3, 964–1002.
  • Barbour, A. D.; Holst, Lars; Janson, Svante. Poisson approximation. Oxford Studies in Probability, 2. Oxford Science Publications. The Clarendon Press, Oxford University Press, New York, 1992. x+277 pp. ISBN: 0-19-852235-5.
  • Barron, Andrew R. Entropy and the central limit theorem. Ann. Probab. 14 (1986), no. 1, 336–342.
  • Bobkov, S. G.; Ledoux, M. On modified logarithmic Sobolev inequalities for Bernoulli and Poisson measures. J. Funct. Anal. 156 (1998), no. 2, 347–365.
  • Borisov, I. S.; Vorozheĭkin, I. S. Accuracy of approximation in the Poisson theorem in terms of $\chi^2$ distance. (Russian) Sibirsk. Mat. Zh. 49 (2008), no. 1, 8–22; translation in Sib. Math. J. 49 (2008), no. 1, 5–17
  • Čekanavičius, V.; Roos, B. An expansion in the exponent for compound binomial approximations. Liet. Mat. Rink. 46 (2006), no. 1, 67–110; translation in Lithuanian Math. J. 46 (2006), no. 1, 54–91
  • Cover, Thomas M.; Thomas, Joy A. Elements of information theory. Wiley Series in Telecommunications. A Wiley-Interscience Publication. John Wiley & Sons, Inc., New York, 1991. xxiv+542 pp. ISBN: 0-471-06259-6
  • Csiszár, Imre; Körner, János. Information theory. Coding theorems for discrete memoryless systems. Probability and Mathematical Statistics. Academic Press, Inc. [Harcourt Brace Jovanovich, Publishers], New York-London, 1981. xi+452 pp. ISBN: 0-12-198450-8
  • Diaconis, Persi; Holmes, Susan. Stein's method: expository lectures and applications. Papers from the Workshop on Stein's Method held at Stanford University, Stanford, CA, 1998. Edited by Persi Diaconis and Susan Holmes. Institute of Mathematical Statistics Lecture Notes–Monograph Series, 46. Institute of Mathematical Statistics, Beachwood, OH, 2004. vi+139 pp. ISBN: 0-940600-62-5
  • Erhardsson, Torkel. Stein's method for Poisson and compound Poisson approximation. An introduction to Stein's method, 61–113, Lect. Notes Ser. Inst. Math. Sci. Natl. Univ. Singap., 4, Singapore Univ. Press, Singapore, 2005.
  • Harremoës, Peter. Binomial and Poisson distributions as maximum entropy distributions. IEEE Trans. Inform. Theory 47 (2001), no. 5, 2039–2041.
  • Johnson, Oliver. Information theory and the central limit theorem. Imperial College Press, London, 2004. xiv+209 pp. ISBN: 1-86094-473-6
  • Johnson, Oliver. Log-concavity and the maximum entropy property of the Poisson distribution. Stochastic Process. Appl. 117 (2007), no. 6, 791–802.
  • Johnson, Oliver; Barron, Andrew. Fisher information inequalities and the central limit theorem. Probab. Theory Related Fields 129 (2004), no. 3, 391–409.
  • Johnson, Oliver; Kontoyiannis, Ioannis; Madiman, Mokshay. Log-concavity, ultra-log-concavity and a maximum entropy property of discrete compound Poisson measures. Preprint, October 2009. Earlier version online at arXiv:0805.4112v1, May 2008.
  • Johnstone, Iain M.; MacGibbon, Brenda. Une mesure d'information caractérisant la loi de Poisson. (French) [An information measure characterizing the Poisson distribution] Séminaire de Probabilités, XXI, 563–573, Lecture Notes in Math., 1247, Springer, Berlin, 1987.
  • Kagan, Abram. A discrete version of the Stam inequality and a characterization of the Poisson distribution. J. Statist. Plann. Inference 92 (2001), no. 1-2, 7–12.
  • Kontoyiannis, Ioannis; Harremoës, Peter; Johnson, Oliver. Entropy and the law of small numbers. IEEE Trans. Inform. Theory 51 (2005), no. 2, 466–472.
  • Kontoyiannis, I.; Madiman, M. Measure concentration for compound Poisson distributions. Electron. Comm. Probab. 11 (2006), 45–57 (electronic).
  • Le Cam, Lucien. An approximation theorem for the Poisson binomial distribution. Pacific J. Math. 10 (1960), 1181–1197.
  • Le Cam, Lucien. On the distribution of sums of independent random variables. Proc. Internat. Res. Sem., Statist. Lab., Univ. California, Berkeley, Calif., 1965, pp. 179–202. Springer-Verlag, New York.
  • Madiman, Mokshay. Topics in information theory, probability, and statistics. Thesis (Ph.D.)–Brown University. ProQuest LLC, Ann Arbor, MI, 2006. 166 pp. ISBN: 978-0542-82065-6
  • Madiman, Mokshay; Barron, Andrew. Generalized entropy power inequalities and monotonicity properties of information. IEEE Trans. Inform. Theory 53 (2007), no. 7, 2317–2329.
  • Matsunawa, T. Some strong $\varepsilon$-equivalence of random variables. Ann. Inst. Statist. Math. 34 (1982), no. 2, 209–224.
  • Michel, R. An improved error bound for the compound Poisson approximation of a nearly homogeneous portfolio. ASTIN Bull. 17 (1987), 165–169.
  • Romanowska, Malgorzata. A note on the upper bound for the distance in total variation between the binomial and the Poisson distribution. Statistica Neerlandica 31 (1977), no. 3, 127–130.
  • Roos, Bero. Sharp constants in the Poisson approximation. Statist. Probab. Lett. 52 (2001), no. 2, 155–168.
  • Roos, Bero. Kerstan's method for compound Poisson approximation. Ann. Probab. 31 (2003), no. 4, 1754–1771.
  • Topsøe, Flemming. Maximum entropy versus minimum risk and applications to some classical discrete distributions. IEEE Trans. Inform. Theory 48 (2002), no. 8, 2368–2376.
  • Tulino, Antonia M.; Verdú, Sergio. Monotonic decrease of the non-Gaussianness of the sum of independent random variables: a simple proof. IEEE Trans. Inform. Theory 52 (2006), no. 9, 4295–4297.
  • Vervaat, W. Upper bounds for the distance in total variation between the binomial or negative binomial and the Poisson distribution. Statistica Neerlandica 23 (1969), 79–86.
  • Wu, Liming. A new modified logarithmic Sobolev inequality for Poisson point processes and several applications. Probab. Theory Related Fields 118 (2000), no. 3, 427–438.
  • Yu, Yaming. On the entropy of compound distributions on nonnegative integers. IEEE Trans. Inform. Theory 55 (2009), no. 8, 3645–3650.