## Journal of Applied Probability


### Extension of de Bruijn's identity to dependent non-Gaussian noise channels

Nayereh Bagheri Khoolenjani and Mohammad Hossein Alamatsaz

#### Abstract

De Bruijn's identity relates two important concepts in information theory: Fisher information and differential entropy. Departing from the common assumption in the literature, this paper considers general additive non-Gaussian noise channels in which, more realistically, the input signal and the additive noise are not independently distributed. It is shown that, for a general dependent signal and noise, the first derivative of the differential entropy is directly related to the conditional mean estimate of the input. Then, using Gaussian and Farlie–Gumbel–Morgenstern copulas, special versions of the result are derived for the case of additive normally distributed noise. The previous result on independent Gaussian noise channels is recovered as a special case. Illustrative examples are also provided.
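
For reference, the classical form of de Bruijn's identity for independent Gaussian noise, which the abstract states is recovered as a special case, can be written as follows (a standard textbook statement; the notation here is illustrative and not taken from the paper):

```latex
% Classical de Bruijn identity: X is the input, Z \sim N(0,1) is
% standard Gaussian noise independent of X, and Y_t = X + \sqrt{t}\,Z
% is the channel output at noise level t > 0.
\frac{\mathrm{d}}{\mathrm{d}t}\, h\!\left(X + \sqrt{t}\,Z\right)
  = \frac{1}{2}\, J\!\left(X + \sqrt{t}\,Z\right),
% where h(\cdot) denotes differential entropy and
% J(Y) = \mathbb{E}\!\left[\bigl(\partial_y \log f_Y(Y)\bigr)^{2}\right]
% is the Fisher information of Y with respect to a location parameter.
```

The paper's contribution is to relax the independence of $X$ and $Z$, with the dependence structure modeled via copulas such as the Gaussian and Farlie–Gumbel–Morgenstern families.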

#### Article information

**Source**

J. Appl. Probab., Volume 53, Number 2 (2016), 360-368.

**Dates**

First available in Project Euclid: 17 June 2016

**Permanent link to this document**

https://projecteuclid.org/euclid.jap/1466172859

**Mathematical Reviews number (MathSciNet)**

MR3514283

**Zentralblatt MATH identifier**

06614114

**Subjects**

- Primary: 54C70: Entropy
- Secondary: 62H20: Measures of association (correlation, canonical correlation, etc.)

**Keywords**

Differential entropy; Fisher information; Gaussian copula; Farlie–Gumbel–Morgenstern copula

#### Citation

Khoolenjani, Nayereh Bagheri; Alamatsaz, Mohammad Hossein. Extension of de Bruijn's identity to dependent non-Gaussian noise channels. J. Appl. Probab. 53 (2016), no. 2, 360--368. https://projecteuclid.org/euclid.jap/1466172859