## The Annals of Statistics

### Sparse recovery in convex hulls via entropy penalization

#### Abstract

Let (X, Y) be a random couple in S×T with unknown distribution P and (X1, Y1), …, (Xn, Yn) be i.i.d. copies of (X, Y). Denote by Pn the empirical distribution of (X1, Y1), …, (Xn, Yn). Let h1, …, hN: S↦[−1, 1] be a dictionary that consists of N functions. For λ∈ℝN, denote fλ:=∑j=1Nλjhj. Let ℓ: T×ℝ↦ℝ be a given loss function and suppose it is convex with respect to the second variable. Denote (ℓ • f)(x, y):=ℓ(y; f(x)). Finally, let Λ⊂ℝN be the simplex of all probability distributions on {1, …, N}. Consider the following penalized empirical risk minimization problem $$\begin{eqnarray*}\hat{\lambda}^{\varepsilon}:={\mathop{\textrm{argmin}}_{\lambda\in \Lambda}}\Biggl[P_{n}(\ell \bullet f_{\lambda})+\varepsilon \sum_{j=1}^{N}\lambda_{j}\log \lambda_{j}\Biggr]\end{eqnarray*}$$ along with its distribution dependent version $$\begin{eqnarray*}\lambda^{\varepsilon}:={\mathop{\textrm{argmin}}_{\lambda\in \Lambda}}\Biggl[P(\ell \bullet f_{\lambda})+\varepsilon \sum_{j=1}^{N}\lambda_{j}\log \lambda_{j}\Biggr],\end{eqnarray*}$$ where ε≥0 is a regularization parameter. It is proved that the “approximate sparsity” of λε implies the “approximate sparsity” of λ̂ε, and the impact of “sparsity” on bounding the excess risk of the empirical solution is explored. Similar results are also discussed in the case of entropy penalized density estimation.
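The penalized problem above can be illustrated numerically. The following is a minimal sketch, not the paper's method: it takes the squared loss ℓ(y; u) = (y − u)², a synthetic dictionary matrix `H` with entries hj(Xi) ∈ [−1, 1], and minimizes the entropy-penalized empirical risk over the simplex Λ by exponentiated-gradient (mirror) descent, a standard choice for entropy-regularized simplex problems since the multiplicative update keeps iterates inside Λ. All names and parameter values (`H`, `eps`, `eta`, `steps`) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, N = 200, 10

# Synthetic dictionary values h_j(X_i) in [-1, 1]: H[i, j] = h_j(X_i).
H = np.tanh(rng.standard_normal((n, N)))
lam_true = np.zeros(N)
lam_true[:2] = [0.7, 0.3]                # sparse target point of the simplex
Y = H @ lam_true + 0.1 * rng.standard_normal(n)


def objective(lam, eps):
    """P_n(ell . f_lambda) + eps * sum_j lambda_j log lambda_j (squared loss)."""
    risk = np.mean((Y - H @ lam) ** 2)
    entropy = np.sum(lam * np.log(np.maximum(lam, 1e-300)))  # guard log(0)
    return risk + eps * entropy


def solve(eps, steps=2000, eta=0.1):
    lam = np.full(N, 1.0 / N)            # start at the uniform distribution
    for _ in range(steps):
        # Gradient of the penalized empirical risk at lam.
        grad = -2.0 / n * H.T @ (Y - H @ lam) + eps * (np.log(lam) + 1.0)
        lam = lam * np.exp(-eta * grad)  # multiplicative (mirror) update
        lam /= lam.sum()                 # renormalize onto the simplex
    return lam


lam_hat = solve(eps=0.01)
```

Because the entropy penalty is strongly convex on the simplex, the penalized objective has a unique minimizer; the multiplicative update merely provides one simple way to approximate it without projection steps.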

#### Article information

Source
Ann. Statist., Volume 37, Number 3 (2009), 1332-1359.

Dates
First available in Project Euclid: 10 April 2009

https://projecteuclid.org/euclid.aos/1239369024

Digital Object Identifier
doi:10.1214/08-AOS621

Mathematical Reviews number (MathSciNet)
MR2509076

Zentralblatt MATH identifier
1269.62039

#### Citation

Koltchinskii, Vladimir. Sparse recovery in convex hulls via entropy penalization. Ann. Statist. 37 (2009), no. 3, 1332--1359. doi:10.1214/08-AOS621. https://projecteuclid.org/euclid.aos/1239369024