Open Access
Empirical Likelihood for Linear Models
Art Owen
Ann. Statist. 19(4): 1725-1747 (December, 1991). DOI: 10.1214/aos/1176348368


Empirical likelihood is a nonparametric method of inference. It has sampling properties similar to the bootstrap, but where the bootstrap uses resampling, it profiles a multinomial likelihood supported on the sample. Its properties in i.i.d. settings have been investigated in works by Owen, by Hall and by DiCiccio, Hall and Romano. This article extends the method to regression problems. Fixed and random regressors are considered, as are robust and heteroscedastic regressions. To make the extension, three variations on the original idea are considered. It is shown that when some functionals of the distribution of the data are known, one can get sharper inferences on other functionals by imposing the known values as constraints on the optimization. The result is first order equivalent to conditioning on a sample value of the known functional. The use of a Euclidean alternative to the likelihood function is investigated. A triangular array version of the empirical likelihood theorem is given. The one-way ANOVA and heteroscedastic regression models are considered in detail. An example is given in which inferences are drawn on the parameters of both the regression function and the conditional variance model.
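To make the profiling idea concrete, here is a minimal sketch (not from the paper) of empirical likelihood for a univariate mean: weights w_i on the sample maximize the product of n·w_i subject to summing to 1 and matching the hypothesized mean, and the Lagrange multiplier is found by bisection on a monotone function. The function name and tolerances are illustrative choices, not part of the article.

```python
import math

def el_log_ratio(x, mu, tol=1e-10):
    """Profile empirical log-likelihood ratio for the mean.

    Maximizes prod(n * w_i) subject to sum(w_i) = 1 and
    sum(w_i * x_i) = mu; returns -2 * log R(mu), which is
    asymptotically chi-squared(1) under mild conditions.
    """
    d = [xi - mu for xi in x]
    # mu must lie strictly inside the convex hull of the data
    if max(d) <= 0 or min(d) >= 0:
        return float('inf')
    # lambda must keep every weight positive: 1 + lam * d_i > 0
    lo = -1.0 / max(d) + tol
    hi = -1.0 / min(d) - tol

    # g(lam) = sum d_i / (1 + lam * d_i) is strictly decreasing,
    # so its unique root can be bracketed and bisected
    def g(lam):
        return sum(di / (1.0 + lam * di) for di in d)

    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if g(mid) > 0:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    # log R(mu) = -sum log(1 + lam * d_i), so -2 log R is:
    return 2.0 * sum(math.log(1.0 + lam * di) for di in d)
```

At the sample mean the multiplier is zero and the statistic vanishes; it grows as the hypothesized mean moves toward the edge of the data, and is infinite outside the convex hull. The regression extensions in the article replace the single mean constraint with estimating-equation constraints.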




Published: December, 1991
First available in Project Euclid: 12 April 2007

zbMATH: 0799.62048
MathSciNet: MR1135146
Digital Object Identifier: 10.1214/aos/1176348368

Primary: 62E20

Keywords: bootstrap, heteroscedasticity, jackknife, nonparametric likelihood, variance modeling

Rights: Copyright © 1991 Institute of Mathematical Statistics
