Open Access
Sparse recovery in convex hulls via entropy penalization
Vladimir Koltchinskii
Ann. Statist. 37(3): 1332-1359 (June 2009). DOI: 10.1214/08-AOS621


Let (X, Y) be a random couple in S×T with unknown distribution P, and let (X1, Y1), …, (Xn, Yn) be i.i.d. copies of (X, Y). Denote by Pn the empirical distribution of (X1, Y1), …, (Xn, Yn). Let h1, …, hN: S↦[−1, 1] be a dictionary consisting of N functions. For λ∈ℝN, denote fλ:=∑j=1Nλjhj. Let ℓ: T×ℝ↦ℝ be a given loss function, assumed convex with respect to the second variable, and set (ℓ • f)(x, y):=ℓ(y; f(x)). Finally, let Λ⊂ℝN be the simplex of all probability distributions on {1, …, N}. Consider the following penalized empirical risk minimization problem $$\begin{eqnarray*}\hat{\lambda}^{\varepsilon}:={\mathop{\textrm{argmin}}_{\lambda\in \Lambda}}\Biggl[P_{n}(\ell \bullet f_{\lambda})+\varepsilon \sum_{j=1}^{N}\lambda_{j}\log \lambda_{j}\Biggr]\end{eqnarray*} $$ along with its distribution dependent version $$\begin{eqnarray*}\lambda^{\varepsilon}:={\mathop{\textrm{argmin}}_{\lambda\in \Lambda}}\Biggl[P(\ell \bullet f_{\lambda})+\varepsilon \sum_{j=1}^{N}\lambda_{j}\log \lambda_{j}\Biggr],\end{eqnarray*}$$ where ε≥0 is a regularization parameter. It is proved that the “approximate sparsity” of λ^ε implies the “approximate sparsity” of λ̂^ε, and the impact of “sparsity” on bounding the excess risk of the empirical solution is explored. Similar results are also discussed in the case of entropy penalized density estimation.
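The abstract does not specify how λ̂^ε is computed. As an illustrative sketch only, the entropy-penalized objective over the simplex can be minimized by exponentiated-gradient (mirror) descent, which keeps the iterates in Λ automatically. Everything below — the squared loss, the synthetic dictionary, and the step-size/iteration parameters — is an assumption for demonstration, not the paper's method:

```python
import numpy as np

rng = np.random.default_rng(0)
n, N = 200, 10
# Synthetic dictionary: H[i, j] = h_j(X_i), values in [-1, 1] as in the paper's setup.
H = rng.uniform(-1.0, 1.0, size=(n, N))
# A sparse target mixture in the simplex (illustrative choice).
true = np.zeros(N)
true[0], true[1] = 0.7, 0.3
y = H @ true + 0.05 * rng.standard_normal(n)

eps, eta, T = 0.01, 0.5, 2000          # regularization, step size, iterations (ad hoc)
lam = np.full(N, 1.0 / N)              # start at the uniform distribution
for _ in range(T):
    resid = H @ lam - y
    # Gradient of the empirical risk (squared loss) plus eps * sum(lam * log(lam)).
    grad = (2.0 / n) * (H.T @ resid) + eps * (np.log(lam) + 1.0)
    # Exponentiated-gradient update: a multiplicative step renormalized to the simplex.
    lam = lam * np.exp(-eta * grad)
    lam /= lam.sum()
```

The entropy penalty keeps every coordinate strictly positive, so the solution is only "approximately sparse": most of the mass concentrates on the active dictionary elements while the remaining weights stay small but nonzero, matching the phenomenon the paper analyzes.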




Published: June 2009
First available in Project Euclid: 10 April 2009

zbMATH: 1269.62039
MathSciNet: MR2509076
Digital Object Identifier: 10.1214/08-AOS621

Primary: 62G07, 62G08, 62H30

Keywords: convex hulls, entropy, penalized empirical risk minimization, sparsity

Rights: Copyright © 2009 Institute of Mathematical Statistics
