The Annals of Statistics

Empirical Likelihood for Linear Models

Art Owen


Empirical likelihood is a nonparametric method of inference. It has sampling properties similar to the bootstrap, but where the bootstrap uses resampling, it profiles a multinomial likelihood supported on the sample. Its properties in i.i.d. settings have been investigated in works by Owen, by Hall, and by DiCiccio, Hall and Romano. This article extends the method to regression problems. Fixed and random regressors are considered, as are robust and heteroscedastic regressions. To make the extension, three variations on the original idea are considered. It is shown that when some functionals of the distribution of the data are known, one can obtain sharper inferences on other functionals by imposing the known values as constraints on the optimization. The result is first-order equivalent to conditioning on a sample value of the known functional. The use of a Euclidean alternative to the likelihood function is investigated, and a triangular array version of the empirical likelihood theorem is given. The one-way ANOVA and heteroscedastic regression models are considered in detail. An example is given in which inferences are drawn on the parameters of both the regression function and the conditional variance model.
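The profiling idea the abstract describes can be illustrated in the simplest setting, the mean of an i.i.d. sample: one maximizes a multinomial likelihood over weights supported on the observed points, subject to the constraint that the weighted mean equals the hypothesized value. The sketch below is not from the paper itself; it is a minimal illustration under that standard formulation, with a hypothetical function name, using a Lagrange-multiplier reduction and Newton's method.

```python
import numpy as np

def el_log_ratio(x, mu, iters=50):
    """Profile empirical log-likelihood ratio for the mean of x at mu.

    Maximizes prod(n * w_i) subject to sum(w_i) = 1 and
    sum(w_i * (x_i - mu)) = 0. The maximizer has the form
    w_i = 1 / (n * (1 + lam * (x_i - mu))), where lam solves the
    monotone equation sum((x_i - mu) / (1 + lam * (x_i - mu))) = 0.
    Returns -2 * log R(mu), asymptotically chi-squared with 1 df.
    """
    z = np.asarray(x, dtype=float) - mu
    n = len(z)
    # If mu lies outside the convex hull of the data, no valid
    # weights exist: R(mu) = 0, so -2 log R(mu) = +infinity.
    if z.min() >= 0 or z.max() <= 0:
        return np.inf
    lam = 0.0
    for _ in range(iters):
        d = 1.0 + lam * z
        g = np.sum(z / d)            # score in lam
        if abs(g) < 1e-10:
            break
        gp = -np.sum((z / d) ** 2)   # derivative of the score (negative)
        step = g / gp
        new = lam - step
        # Damp the Newton step so all weights stay positive.
        while np.min(1.0 + new * z) <= 1e-12:
            step *= 0.5
            new = lam - step
        lam = new
    w = 1.0 / (n * (1.0 + lam * z))
    return -2.0 * np.sum(np.log(n * w))
```

At the sample mean the optimal weights are uniform (w_i = 1/n), so the statistic is zero there and grows as the hypothesized mean moves away, which is what makes likelihood-ratio-style confidence regions possible without a parametric model.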

Article information

Ann. Statist., Volume 19, Number 4 (1991), 1725-1747.

First available in Project Euclid: 12 April 2007


Subjects: Primary: 62E20: Asymptotic distribution theory

Keywords: Bootstrap; jackknife; heteroscedasticity; nonparametric likelihood; variance modeling


Owen, Art. Empirical Likelihood for Linear Models. Ann. Statist. 19 (1991), no. 4, 1725–1747. doi:10.1214/aos/1176348368.
