Abstract
Using a simple application of Fubini's theorem, we examine the connection between statistical independence, linear independence of random vectors, and algebraic independence of univariate r.v.'s, where a finite set of r.v.'s is called algebraically independent if it satisfies a non-trivial polynomial relationship only with probability zero. As a consequence, we simplify the derivation of a result of Eaton and Perlman (1973) on the linear independence of random vectors, and settle a question of Okamoto (1973) concerning the rank of sample covariance-type matrices $S = XAX'$, where $X$ is $p \times n$ and $A$ is $n \times n$, in the case $n \geq p \geq r = \operatorname{rank}(A)$. We also derive a measure-theoretic version of the classical fact that the elementary symmetric polynomials in $m$ indeterminates are algebraically independent. This has applications to sample moments, $k$-statistics, and $U$-statistics with polynomial kernels.
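For concreteness, here is a hedged sketch of the two central statements, under illustrative assumptions that are not taken verbatim from the paper. A finite set of r.v.'s $X_1, \dots, X_m$ is algebraically independent in the above sense when
\[
\Pr\bigl[\,P(X_1,\dots,X_m)=0\,\bigr]=0
\quad\text{for every polynomial } P \in \mathbb{R}[t_1,\dots,t_m],\ P \not\equiv 0 .
\]
For the rank statement, assume for illustration that $A$ is symmetric nonnegative definite with $\operatorname{rank}(A) = r$ and that the entries of the $p \times n$ matrix $X$ have a joint density on $\mathbb{R}^{pn}$. Writing $A = BB'$ with $B$ of size $n \times r$ and full column rank gives $XAX' = (XB)(XB)'$; since $XB$ is $p \times r$ with $p \geq r$, and the set of $X$ for which $XB$ is rank-deficient is a proper algebraic (hence Lebesgue-null) subset of $\mathbb{R}^{pn}$, it follows that
\[
\operatorname{rank}(XAX') = r \quad \text{almost surely,}
\]
which is the conclusion in the case $n \geq p \geq r$ treated here; the paper itself works under its own, more general hypotheses.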
Citation
James D. Malley. "Statistical and Algebraic Independence." Ann. Statist. 11(1): 341-345, March 1983. https://doi.org/10.1214/aos/1176346086