Abstract
We analyze the following model: one person, called the "helper," observes an outcome $x^n = (x_1, \cdots, x_n) \in \mathscr{X}^n$ of the sequence $X^n = (X_1, \cdots, X_n)$ of i.i.d. RV's, and the statistician gets a sample $y^n = (y_1, \cdots, y_n)$ of the sequence $Y^n(\theta, x^n)$ of RV's with a density $\prod^n_{t=1} f(y_t \mid \theta, x_t)$. The helper can give some (side) information about $x^n$ to the statistician via an encoding function $s_n: \mathscr{X}^n \rightarrow \mathbb{N}$ with $\operatorname{rate}(s_n) \stackrel{\text{def}}{=} (1/n)\log \#\operatorname{range}(s_n) \leq R$. Based on the knowledge of $s_n(x^n)$ and $y^n$, the statistician tries to estimate $\theta$ by an estimator $\hat{\theta}_n$. For the maximal mean square error $e_n(R) \stackrel{\text{def}}{=} \inf_{\hat{\theta}_n} \inf_{s_n:\, \operatorname{rate}(s_n) \leq R} \sup_{\theta \in \Theta} E_\theta|\hat{\theta}_n - \theta|^2$ we establish a Cramér-Rao type bound and, in the case of a finite $\mathscr{X}$, prove asymptotic achievability of this bound under certain conditions. The proof involves a nonobvious combination of results (some of which are novel) for both coding and estimation.
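To make the model concrete, the following is a minimal Monte Carlo sketch in Python under assumed toy distributions (Bernoulli $X_t$ on $\{0, 1\}$, Gaussian $Y_t \mid \theta, x_t$); the prefix-forwarding code $s_n$ and the averaging estimator are illustrative assumptions, not the coding scheme analyzed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_mse(theta, n=1000, R=0.5, trials=2000):
    """Monte Carlo estimate of E_theta |theta_hat - theta|^2 for one theta.

    Hypothetical instance (not from the paper): X_t ~ Bernoulli(1/2) on the
    finite alphabet {0, 1}, and Y_t | theta, x_t ~ N(theta * x_t, 1).
    The helper's code s_n forwards the first k = floor(n * R) symbols of
    x^n verbatim, so #range(s_n) = 2^k and rate(s_n) = R (in bits; the
    abstract leaves the log base unspecified).  The estimator averages
    y_t over revealed positions with x_t = 1 -- a naive scheme meant only
    to illustrate the quantities in the abstract, not the paper's
    asymptotically optimal construction.
    """
    k = int(n * R)  # number of symbols the helper reveals
    errs = []
    for _ in range(trials):
        x = rng.integers(0, 2, size=n)           # remote data x^n
        y = theta * x + rng.standard_normal(n)   # statistician's sample y^n
        active = x[:k] == 1                      # revealed positions with x_t = 1
        if not active.any():
            errs.append(theta ** 2)              # no usable data: estimate 0
            continue
        theta_hat = y[:k][active].mean()
        errs.append((theta_hat - theta) ** 2)
    return float(np.mean(errs))

# About n*R/2 revealed positions carry signal, so the error of this naive
# scheme decays roughly like 2/(n*R).
for R in (0.1, 0.5, 1.0):
    print(f"R = {R}: MSE ~ {simulate_mse(theta=1.0, R=R):.5f}")
```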
Citation
R. Ahlswede and M. V. Burnashev. "On Minimax Estimation in the Presence of Side Information About Remote Data." Ann. Statist. 18 (1) 141-171, March 1990. https://doi.org/10.1214/aos/1176347496