Journal of Applied Probability

Extension of de Bruijn's identity to dependent non-Gaussian noise channels

Nayereh Bagheri Khoolenjani and Mohammad Hossein Alamatsaz



De Bruijn's identity relates two important concepts in information theory: Fisher information and differential entropy. Unlike the common practice in the literature, in this paper we consider general additive non-Gaussian noise channels in which, more realistically, the input signal and the additive noise are not independently distributed. It is shown that, for a general dependent signal and noise, the first derivative of the differential entropy is directly related to the conditional mean estimate of the input. Then, using Gaussian and Farlie–Gumbel–Morgenstern copulas, specialized versions of the result are given for the case of additive normally distributed noise. The previous result for independent Gaussian noise channels is included as a special case. Illustrative examples are also provided.
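For reference, the classical independent-noise form of de Bruijn's identity states that for Y_t = X + sqrt(t) Z, with Z standard Gaussian and independent of X, d/dt h(Y_t) = (1/2) J(Y_t), where h is differential entropy and J is Fisher information. A minimal sketch verifying this numerically in the all-Gaussian case, where both quantities have closed forms (all function names below are illustrative, not from the paper):

```python
import math

# Independent-noise de Bruijn identity: d/dt h(Y_t) = (1/2) J(Y_t)
# for Y_t = X + sqrt(t) * Z, Z ~ N(0, 1) independent of X.
# Sketch for Gaussian input X ~ N(0, sigma2), where Y_t ~ N(0, sigma2 + t):
#   h(Y_t) = 0.5 * ln(2*pi*e*(sigma2 + t)),   J(Y_t) = 1 / (sigma2 + t).

def entropy(sigma2, t):
    """Differential entropy of Y_t = X + sqrt(t) Z with X ~ N(0, sigma2)."""
    return 0.5 * math.log(2 * math.pi * math.e * (sigma2 + t))

def fisher(sigma2, t):
    """Fisher information of Y_t (Gaussian with variance sigma2 + t)."""
    return 1.0 / (sigma2 + t)

sigma2, t, eps = 1.0, 0.5, 1e-6
# Central-difference approximation of d/dt h(Y_t).
dh_dt = (entropy(sigma2, t + eps) - entropy(sigma2, t - eps)) / (2 * eps)
print(dh_dt, 0.5 * fisher(sigma2, t))  # the two values agree
```

The paper's contribution is to relax the independence of X and Z assumed in this classical statement; the sketch above only illustrates the baseline identity that is being extended.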

Article information

J. Appl. Probab., Volume 53, Number 2 (2016), 360-368.

First available in Project Euclid: 17 June 2016


Primary: 54C70: Entropy; 62H20: Measures of association (correlation, canonical correlation, etc.)

Keywords: Differential entropy; Fisher information; Gaussian copula; Farlie–Gumbel–Morgenstern copula
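The Farlie–Gumbel–Morgenstern copula named in the keywords has the simple closed-form density c(u, v) = 1 + θ(1 − 2u)(1 − 2v) on the unit square, with dependence parameter θ ∈ [−1, 1]. A minimal sketch checking by Monte Carlo that this density integrates to 1 (names below are illustrative, not from the paper):

```python
import random

# FGM copula density: c(u, v) = 1 + theta * (1 - 2u) * (1 - 2v),
# valid for theta in [-1, 1]; theta = 0 gives the independence copula.

def fgm_density(u, v, theta):
    return 1.0 + theta * (1.0 - 2.0 * u) * (1.0 - 2.0 * v)

random.seed(0)
theta, n = 0.7, 200_000
# Monte Carlo integral of the density over the unit square.
total = sum(fgm_density(random.random(), random.random(), theta)
            for _ in range(n))
print(total / n)  # close to 1.0, as a copula density must be
```

The FGM family models only mild dependence (Spearman's rho is limited to [−1/3, 1/3]), which is why the paper pairs it with the Gaussian copula when specializing its dependent-noise result.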


Khoolenjani, Nayereh Bagheri; Alamatsaz, Mohammad Hossein. Extension of de Bruijn's identity to dependent non-Gaussian noise channels. J. Appl. Probab. 53 (2016), no. 2, 360–368.


