Bernoulli


Covariance chains

Nanny Wermuth, D.R. Cox, and Giovanni M. Marchetti


Abstract

Covariance matrices that can be arranged in tridiagonal form are called covariance chains. They are used to clarify some issues of parameter equivalence and of independence equivalence for linear models in which a set of latent variables influences a set of observed variables. For this purpose, orthogonal decompositions for covariance chains are first derived in explicit form. Covariance chains are also contrasted with concentration chains, for which estimation is explicit and simple. To this end, maximum-likelihood equations are derived for exponential families when some parameters satisfy zero-value constraints. From these equations explicit estimates are obtained which are asymptotically efficient, and they are applied to covariance chains. Simulation results confirm the satisfactory behaviour of the explicit covariance-chain estimates even in moderate-sized samples.
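
The contrast drawn in the abstract can be made concrete with a small numerical sketch (not taken from the paper; the matrix entries below are arbitrary, chosen only to give a positive definite matrix). In a covariance chain the tridiagonal pattern of zeros sits in the covariance matrix, so non-neighbouring variables are marginally independent; in a concentration chain the same pattern sits in the inverse covariance (concentration) matrix, so non-neighbouring variables are conditionally independent given all remaining variables.

    import numpy as np

    # An illustrative 4-variable tridiagonal matrix; the values are arbitrary,
    # chosen only so that the matrix is positive definite.
    tridiag = np.array([
        [1.0, 0.4, 0.0, 0.0],
        [0.4, 1.0, 0.3, 0.0],
        [0.0, 0.3, 1.0, 0.5],
        [0.0, 0.0, 0.5, 1.0],
    ])

    # Read as a covariance matrix, this is a covariance chain: the zeros mean
    # that non-neighbouring variables are marginally independent (under joint
    # normality).  Read instead as a concentration matrix, it is a
    # concentration chain: the same zeros then mean that non-neighbouring
    # variables are conditionally independent given all remaining variables.
    inverse = np.linalg.inv(tridiag)

    # The inverse of a tridiagonal matrix is in general a full matrix, so the
    # two kinds of zero constraint are not interchangeable: a covariance
    # chain does not have a tridiagonal concentration matrix, and vice versa.
    print(np.round(inverse, 3))

Because the inverse of a tridiagonal matrix is in general full, the two chain models impose genuinely different constraints, which is why explicit, asymptotically efficient estimates for covariance chains require the separate derivation described above.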

Article information

Source
Bernoulli, Volume 12, Number 5 (2006), 841-862.

Dates
First available in Project Euclid: 23 October 2006

Permanent link to this document
https://projecteuclid.org/euclid.bj/1161614949

Digital Object Identifier
doi:10.3150/bj/1161614949

Mathematical Reviews number (MathSciNet)
MR2265345

Zentralblatt MATH identifier
1134.62031

Keywords
canonical parameters; exponential families; graphical chain models; independence equivalence; latent variables; linear least-squares regressions; moment parameters; orthogonal decompositions; parameter equivalence; reduced models; structural equation models

Citation

Wermuth, Nanny; Cox, D.R.; Marchetti, Giovanni M. Covariance chains. Bernoulli 12 (2006), no. 5, 841--862. doi:10.3150/bj/1161614949. https://projecteuclid.org/euclid.bj/1161614949


