Open Access
Normalizing the Noncentral $t$ and $F$ Distributions
Nico F. Laubscher
Ann. Math. Statist. 31(4): 1105-1112 (December, 1960). DOI: 10.1214/aoms/1177705682

Abstract

Let $X$ be a random variable governed by one of a family of distributions which is conveniently parameterized by $\mu$, the expectation of $X$, so that, in particular, the variance of $X$, $\sigma^2$, is a function of $\mu$, which we denote by $\sigma^2(\mu)$. A transformation $\psi(X)$ is sometimes sought so that the variance of $\psi(X)$, as $\mu$ sweeps over its domain, is independent of $\mu$ (or much more nearly constant than $\sigma^2(\mu)$). A standard method of obtaining such a transformation for stabilization of the variance is to consider $X$ as one of a sequence of random variables, the sequence converging asymptotically in distribution, usually to a normal distribution. One form of the basic theorem is stated and proved by C. R. Rao [8], pp. 207-8, as follows.

THEOREM (Rao). If $X$ is asymptotically normally distributed about $\mu$, with asymptotic variance $\sigma^2(\mu)$, then any function $\psi = \psi(X)$, with continuous first derivative in some neighborhood of $\mu$, is asymptotically normally distributed with mean $\psi(\mu)$ and variance $\sigma^2(\mu)(d\psi/d\mu)^2$, where $(d\psi/d\mu)$ denotes the derivative of $\psi(X)$ with respect to $X$, evaluated at the point $\mu$.

From this we immediately have the following well-known

COROLLARY. The random variable
\begin{equation*}\tag{1}\psi(X) = c \int^X_K \frac{d\mu}{\sigma(\mu)},\end{equation*}
where $0 < c < \infty$ and $K$ is an arbitrary constant, has a variance which is stabilized asymptotically at $c^2$. It is assumed, of course, that the integrand in (1) is integrable. If $\psi(X)$ is not a real-valued function on the domain of $X$, then the mapping is meaningless.

Transformations such as (1), perhaps slightly modified, not only often work well for stabilizing non-asymptotic variances, but often serve to normalize non-normal distributions as well. In general, however, nothing is known about the relative closeness to normality of the distribution of a random variable before and after a variance-stabilizing transformation is applied, nor can anything general be said about the relative rapidity of approach to asymptotic normality. The study of concrete examples, however, suggests some connection between variance stabilization and normalization of non-normal distributions. A theoretical connection that may be relevant in certain cases has been put forward by N. L. Johnson [3], pp. 150-1. Johnson shows that, when the random variable of interest has a certain structure, the differential equation for the normalizing transformation is similar to the differential equation for the variance-stabilizing transformation. The specified structure is $X_n = Y_1 + Y_2 G(X_1) + \cdots + Y_n G(X_{n-1})$, where the $Y$'s are independent and small, and $G(\cdot)$ is some function.

In what follows, we obtain the variance-stabilizing transformation for the noncentral $t$ distributions and consider its normalizing properties. We repeat the procedure for the topside noncentral $F$ distributions, although the variance-stabilizing transformation in this case is not well-defined. We then derive two other (well-defined) transformations for the approximate normalization of the topside noncentral $F$. Numerical comparisons of these approximations with the exact values are given.
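To make the corollary concrete, here is a standard worked instance (not taken from the paper): if the variance function is of the Poisson type, $\sigma^2(\mu) = \mu$, then (1) gives
\begin{equation*}\psi(X) = c \int^X_K \frac{d\mu}{\sqrt{\mu}} = 2c\left(\sqrt{X} - \sqrt{K}\right),\end{equation*}
and, by Rao's theorem, the asymptotic variance of $\psi(X)$ is $\mu\,(c/\sqrt{\mu})^2 = c^2$, independent of $\mu$; with $c = 1$ and $K = 0$ this is the familiar square-root transformation $2\sqrt{X}$.

The kind of numerical comparison described for the noncentral $t$ can be sketched as follows. The short Python fragment below compares the exact noncentral $t$ distribution function (from SciPy) against a classical normal approximation of the same general type; the particular formula used here is an illustrative stand-in and is not necessarily the transformation derived in the paper.

```python
import numpy as np
from scipy import stats

def approx_nct_cdf(t, f, delta):
    """Approximate P(T <= t) for a noncentral t variable with f degrees of
    freedom and noncentrality delta, using a classical normalizing formula
    (an illustrative choice, not necessarily the paper's transformation)."""
    z = (t * (1.0 - 1.0 / (4.0 * f)) - delta) / np.sqrt(1.0 + t * t / (2.0 * f))
    return stats.norm.cdf(z)

# Compare the normal approximation with SciPy's exact noncentral t CDF.
for f, delta, t in [(5, 0.5, 1.0), (10, 1.0, 2.0), (20, 2.0, 3.0)]:
    exact = stats.nct.cdf(t, f, delta)   # arguments: x, df, noncentrality
    approx = approx_nct_cdf(t, f, delta)
    print(f"f={f:2d}  delta={delta:3.1f}  t={t:3.1f}  "
          f"exact={exact:.4f}  approx={approx:.4f}")
```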

Citation


Nico F. Laubscher. "Normalizing the Noncentral $t$ and $F$ Distributions." Ann. Math. Statist. 31(4): 1105-1112, December 1960. https://doi.org/10.1214/aoms/1177705682

Information

Published: December, 1960
First available in Project Euclid: 27 April 2007

zbMATH: 0104.13004
MathSciNet: MR117815
Digital Object Identifier: 10.1214/aoms/1177705682

Rights: Copyright © 1960 Institute of Mathematical Statistics
