Journal of Applied Probability
- J. Appl. Probab.
- Volume 53, Number 2 (2016), 360-368.
Extension of de Bruijn's identity to dependent non-Gaussian noise channels
De Bruijn's identity relates two important concepts in information theory: Fisher information and differential entropy. Unlike the common practice in the literature, in this paper we consider general additive non-Gaussian noise channels in which, more realistically, the input signal and the additive noise are not independently distributed. It is shown that, for a general dependent signal and noise, the first derivative of the differential entropy is directly related to the conditional mean estimate of the input. Then, using Gaussian and Farlie–Gumbel–Morgenstern copulas, special versions of the result are derived for the case of additive normally distributed noise. The previous result for independent Gaussian noise channels is included as a special case. Illustrative examples are also provided.
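For orientation, the classical (independent-noise) form of de Bruijn's identity that the paper extends states that, for Z ~ N(0,1) independent of X, d/dt h(X + √t Z) = ½ J(X + √t Z), where h is differential entropy and J is Fisher information. The sketch below checks this in the fully Gaussian case X ~ N(0, s²), where both sides are available in closed form; this is an illustrative assumption for the independent-noise special case, not the dependent-noise setting treated in the paper.

```python
import math

def entropy_gaussian(var):
    """Differential entropy of N(0, var): 0.5 * ln(2*pi*e*var)."""
    return 0.5 * math.log(2 * math.pi * math.e * var)

def de_bruijn_sides(s2, t, dt=1e-6):
    """Return (numerical d/dt h(Y_t), 0.5 * J(Y_t)) for Y_t = X + sqrt(t) Z,
    with X ~ N(0, s2) and Z ~ N(0, 1) independent (illustrative Gaussian case)."""
    var = s2 + t  # Var(Y_t) under independence of X and Z
    # Left side: central-difference derivative of the differential entropy in t
    lhs = (entropy_gaussian(var + dt) - entropy_gaussian(var - dt)) / (2 * dt)
    # Right side: Fisher information of N(0, var) is 1/var, halved per the identity
    rhs = 0.5 / var
    return lhs, rhs

lhs, rhs = de_bruijn_sides(s2=1.0, t=0.5)
```

In this closed-form case both sides equal 0.5/(s² + t), so the two returned values agree up to the finite-difference error.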
First available in Project Euclid: 17 June 2016
Khoolenjani, Nayereh Bagheri; Alamatsaz, Mohammad Hossein. Extension of de Bruijn's identity to dependent non-Gaussian noise channels. J. Appl. Probab. 53 (2016), no. 2, 360--368. https://projecteuclid.org/euclid.jap/1466172859