Open Access
June, 1962 Mutual Information and Maximal Correlation as Measures of Dependence
C. B. Bell
Ann. Math. Statist. 33(2): 587-595 (June, 1962). DOI: 10.1214/aoms/1177704583

Abstract

Rényi [19] gives a set of seven postulates that a measure of dependence for a pair of random variables should satisfy. Of the dependence measures considered by Rényi, only Gebelein's [5] maximal correlation, $S_P$, satisfies all seven postulates. Kramer [10], in considering the uncertainty principle in Fourier analysis [11], generalizes the Gebelein maximal correlation to the case of arbitrary pairs of $\sigma$-algebras, and asks whether this generalization is equivalent to Shannon's mutual information, $C_P$, [4, 9, 21] for pairs of $\sigma$-algebras (equivalent in the sense of preserving order). The object of this note is to compare $S_P$ and the two normalizations, $C'_P$ and $C''_P$, of $C_P$ as dependence measures for strictly positive probability spaces (which are necessarily generated by random variables). It is found that for such spaces, with the proper finiteness restrictions: (a) (Thm 5.1) $0 \leqq S_P, C'_P, C''_P \leqq 1$; (b) (Thm 5.2) $S_P = 0$ iff $C'_P = 0$ iff $C''_P = 0$ iff the random variables are independent; (c) (Thm 5.4) $S_P = 1$ if the two generated algebras have a nontrivial intersection (the conditions are equivalent for finite algebras); $C'_P = 1$ iff one of the random variables is a function of the other; and $C''_P = 1$ iff the random variables are functions of each other; and, consequently, (d) (Thm 5.5) there exist probability spaces for which the dependence measures are not equivalent. The paper is divided into six sections. Section 1 contains the introduction and summary. Section 2 introduces the terminology, notation, and preliminaries. Section 3 treats $S_P$ and the Rényi postulates. In Section 4, the basic Shannon-Feinstein-Khinchin mutual information is extended to strictly positive measure spaces, not necessarily finite. The comparison of the dependence measures and the postulate modifications are given in Section 5. Finally, in Section 6 some extensions and open problems are mentioned.
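As a concrete illustration of the two quantities compared in the abstract, the sketch below computes Shannon mutual information and the Gebelein maximal correlation for a pair of finite-alphabet random variables given by a joint probability table. It relies on the standard fact (not specific to this paper) that for finite alphabets the maximal correlation equals the second-largest singular value of $Q_{ij} = p_{ij}/\sqrt{p_i q_j}$, where $p_i, q_j$ are the marginals; the function names and example distributions are illustrative, and the paper's particular normalizations $C'_P$, $C''_P$ are not reproduced here.

```python
import numpy as np

def mutual_information(p):
    """Shannon mutual information (in nats) of a finite joint pmf p[i, j]."""
    px = p.sum(axis=1)              # marginal of X
    py = p.sum(axis=0)              # marginal of Y
    mask = p > 0                    # skip zero cells (0 log 0 := 0)
    return float((p[mask] * np.log(p[mask] / np.outer(px, py)[mask])).sum())

def maximal_correlation(p):
    """Gebelein maximal correlation of a finite joint pmf p[i, j]:
    the second-largest singular value of Q = p / sqrt(px ⊗ py)."""
    px = p.sum(axis=1)
    py = p.sum(axis=0)
    Q = p / np.sqrt(np.outer(px, py))
    s = np.linalg.svd(Q, compute_uv=False)   # sorted descending
    return float(s[1])              # s[0] = 1 corresponds to constants

# Independent pair: both measures vanish (cf. Thm 5.2's equivalence at 0).
p_ind = np.outer([0.3, 0.7], [0.5, 0.5])

# Functional dependence Y = X: maximal correlation attains 1.
p_dep = np.diag([0.4, 0.6])
```

For `p_ind` both measures are 0, while for `p_dep` the maximal correlation is 1 and the mutual information is positive, matching the behavior at the endpoints described in (b) and (c).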

Citation


C. B. Bell. "Mutual Information and Maximal Correlation as Measures of Dependence." Ann. Math. Statist. 33 (2) 587 - 595, June, 1962. https://doi.org/10.1214/aoms/1177704583

Information

Published: June, 1962
First available in Project Euclid: 27 April 2007

zbMATH: 0212.51001
MathSciNet: MR148182
Digital Object Identifier: 10.1214/aoms/1177704583

Rights: Copyright © 1962 Institute of Mathematical Statistics
