Abstract
A vector variable $\mathbf{X}$ is said to have a linear structure if it can be written as $\mathbf{X} = \mathbf{AY}$, where $\mathbf{A}$ is a matrix and $\mathbf{Y}$ is a vector of independent random variables called structural variables. In earlier papers, the conditions under which a vector random variable admits different structural representations have been studied. It is shown there, among other results, that complete non-uniqueness, in a certain sense, of the linear structure characterizes a multivariate normal variable. In the present paper we prove a general decomposition theorem stating that any vector variable $\mathbf{X}$ with a linear structure can be expressed as the sum $\mathbf{X}_1 + \mathbf{X}_2$ of two independent vector variables, of which $\mathbf{X}_1$ is non-normal and has a unique linear structure, and $\mathbf{X}_2$ is a multivariate normal variable with a non-unique linear structure.
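One standard way to see the non-uniqueness in the normal case (a sketch, assuming the structural variables are independent standard normal): if $\mathbf{Y}$ has independent $N(0,1)$ components and $\mathbf{Q}$ is any orthogonal matrix, then $\mathbf{Q}^{\top}\mathbf{Y}$ again has independent $N(0,1)$ components, so
$$\mathbf{X} = \mathbf{A}\mathbf{Y} = (\mathbf{A}\mathbf{Q})(\mathbf{Q}^{\top}\mathbf{Y})$$
yields a distinct structural representation $\mathbf{X} = \mathbf{A}'\mathbf{Y}'$, with $\mathbf{A}' = \mathbf{A}\mathbf{Q}$ and $\mathbf{Y}' = \mathbf{Q}^{\top}\mathbf{Y}$, for every choice of $\mathbf{Q}$.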
Citation
C. Radhakrishna Rao. "A Decomposition Theorem for Vector Variables with a Linear Structure." Ann. Math. Statist. 40(5): 1845–1849, October 1969. https://doi.org/10.1214/aoms/1177697400