Open Access
Rates of Convergence of Minimum Distance Estimators and Kolmogorov's Entropy
Yannis G. Yatracos
Ann. Statist. 13(2): 768-774 (June, 1985). DOI: 10.1214/aos/1176349553

Abstract

Let $(\mathscr{X}, \mathscr{A})$ be a space with a $\sigma$-field, $M = \{P_\theta; \theta \in \Theta\}$ be a family of probability measures on $\mathscr{A}$ with $\Theta$ arbitrary, and $X_1, \cdots, X_n$ i.i.d. observations on $P_\theta$. Define $\mu_n(A) = (1/n) \sum^n_{i=1} I_A(X_i)$, the empirical measure indexed by $A \in \mathscr{A}$. Assume $\Theta$ is totally bounded when metrized by the $L_1$ distance between measures. Robust minimum distance estimators $\hat{\theta}_n$ are constructed for $\theta$, and the resulting rate of convergence is shown to depend naturally on an entropy function for $\Theta$.
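The construction can be illustrated numerically: over a finite $\delta$-net of candidate densities, one selects the candidate whose measure is closest to the empirical measure $\mu_n$ uniformly over a suitable class of sets. The Python sketch below is an illustration under simplifying assumptions, not the paper's exact scheme: it uses a one-dimensional normal location net, Scheffé-type sets $A_{ij} = \{x : f_i(x) > f_j(x)\}$ as the index class, and grid-based Riemann sums for the integrals; all function and variable names are hypothetical.

```python
import numpy as np
from scipy.stats import norm

def md_select(net, sample, grid):
    """Minimum distance selection over a finite delta-net of densities.

    net    : list of vectorized densities f_1, ..., f_N (the delta-net)
    sample : 1-d array of i.i.d. observations X_1, ..., X_n
    grid   : fine equispaced grid used to approximate integrals

    Returns the index k minimizing
        sup_{i != j} | P_k(A_ij) - mu_n(A_ij) |,
    where A_ij = {x : f_i(x) > f_j(x)} and mu_n is the empirical measure.
    """
    dx = grid[1] - grid[0]
    on_grid = np.array([f(grid) for f in net])    # each f_k on the grid
    on_samp = np.array([f(sample) for f in net])  # each f_k at the data points

    best_k, best = 0, np.inf
    N = len(net)
    for k in range(N):
        worst = 0.0
        for i in range(N):
            for j in range(N):
                if i == j:
                    continue
                A = on_grid[i] > on_grid[j]              # A_ij on the grid
                p_k = on_grid[k][A].sum() * dx           # P_k(A_ij), Riemann sum
                mu_n = np.mean(on_samp[i] > on_samp[j])  # mu_n(A_ij)
                worst = max(worst, abs(p_k - mu_n))
        if worst < best:
            best_k, best = k, worst
    return best_k

# Example (hypothetical): normal location net; true theta = 0.3
rng = np.random.default_rng(0)
means = np.linspace(-1, 1, 9)                      # delta-net over Theta
net = [lambda x, m=m: norm.pdf(x, loc=m) for m in means]
sample = rng.normal(0.3, 1.0, size=500)
grid = np.linspace(-6.0, 6.0, 4001)
print(means[md_select(net, sample, grid)])         # typically near 0.3
```

Because the selector compares candidate measures to $\mu_n$ only through probabilities of sets, gross outliers move each $\mu_n(A_{ij})$ by at most $1/n$ apiece, which is the sense in which such estimators are robust.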

Citation

Yannis G. Yatracos. "Rates of Convergence of Minimum Distance Estimators and Kolmogorov's Entropy." Ann. Statist. 13(2): 768-774, June, 1985. https://doi.org/10.1214/aos/1176349553

Information

Published: June, 1985
First available in Project Euclid: 12 April 2007

zbMATH: 0576.62057
MathSciNet: MR790571
Digital Object Identifier: 10.1214/aos/1176349553

Subjects:
Primary: 62G05
Secondary: 62G30

Keywords: Density estimation, Kolmogorov's entropy, minimum distance estimation, rates of convergence

Rights: Copyright © 1985 Institute of Mathematical Statistics
