Open Access
Compound Poisson Approximation via Information Functionals
A. D. Barbour, Oliver Johnson, Ioannis Kontoyiannis, Mokshay Madiman
Electron. J. Probab. 15: 1344-1369 (2010). DOI: 10.1214/EJP.v15-799

Abstract

An information-theoretic development is given for the problem of compound Poisson approximation, which parallels earlier treatments for Gaussian and Poisson approximation. Nonasymptotic bounds are derived for the distance between the distribution of a sum of independent integer-valued random variables and an appropriately chosen compound Poisson law. In the case where all summands have the same conditional distribution given that they are non-zero, a bound on the relative entropy distance between their sum and the compound Poisson distribution is derived, based on the data-processing property of relative entropy and earlier Poisson approximation results. When the summands have arbitrary distributions, corresponding bounds are derived in terms of the total variation distance. The main technical ingredient is the introduction of two "information functionals," and the analysis of their properties. These information functionals play a role analogous to that of the classical Fisher information in normal approximation. Detailed comparisons are made between the resulting inequalities and related bounds.

Citation


A. D. Barbour. Oliver Johnson. Ioannis Kontoyiannis. Mokshay Madiman. "Compound Poisson Approximation via Information Functionals." Electron. J. Probab. 15 1344 - 1369, 2010. https://doi.org/10.1214/EJP.v15-799

Information

Accepted: 31 August 2010; Published: 2010
First available in Project Euclid: 1 June 2016

zbMATH: 1225.60037
MathSciNet: MR2721049
Digital Object Identifier: 10.1214/EJP.v15-799

Subjects:
Primary: 60E15
Secondary: 60E07 , 60F05 , 94A17

Keywords: compound Poisson approximation, Fisher information, information theory, relative entropy, Stein's method
