## Abstract

Let $n\ge 1$, $K>0$, and let $X=(X_1,X_2,\dots,X_n)$ be a random vector in $\mathbb{R}^n$ with independent $K$–subgaussian components. We show that for every 1–Lipschitz convex function $f$ on $\mathbb{R}^n$ (Lipschitz with respect to the Euclidean metric) and every $t>0$,

$$\max\Bigl(\mathbb{P}\bigl\{f(X)-\mathrm{Med}\, f(X)\ge t\bigr\},\;\mathbb{P}\bigl\{f(X)-\mathrm{Med}\, f(X)\le -t\bigr\}\Bigr)\le \exp\!\Bigl(-\frac{c\, t^2}{K^2\log\bigl(2+\frac{K^2 n}{t^2}\bigr)}\Bigr),$$

where $c>0$ is a universal constant. The estimates are optimal in the sense that for every $n\ge \tilde{C}$ and $t>0$ there exist a random vector $X$ in $\mathbb{R}^n$ with a product distribution and $K$–subgaussian components, and a 1–Lipschitz convex function $f$, such that

$$\mathbb{P}\bigl\{|f(X)-\mathrm{Med}\, f(X)|\ge t\bigr\}\ge \tilde{c}\,\exp\!\Bigl(-\frac{\tilde{C}\, t^2}{K^2\log\bigl(2+\frac{K^2 n}{t^2}\bigr)}\Bigr).$$

The obtained deviation estimates for subgaussian variables are in sharp contrast with the case of variables with bounded $\|X_i\|_{\psi_p}$–quasi-norms for $p\in (0,2)$.
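As a quick numerical sanity check (not part of the paper), one can compare empirical deviation frequencies for the 1–Lipschitz convex function $f(x)=\|x\|_2$ under standard Gaussian coordinates against the right-hand side of the bound above; the unknown universal constant $c$ is set to 1 purely for illustration, and the helper name `bound` is ours:

```python
import numpy as np

rng = np.random.default_rng(0)

n = 200          # dimension
K = 1.0          # standard Gaussians are K-subgaussian (up to the constant convention)
trials = 20000

# f(x) = ||x||_2 is convex and 1-Lipschitz in the Euclidean metric.
samples = np.linalg.norm(rng.standard_normal((trials, n)), axis=1)
med = np.median(samples)

def bound(t, c=1.0):
    # Right-hand side of the deviation estimate, with the (unknown)
    # universal constant c set to 1 for illustration only.
    return np.exp(-c * t**2 / (K**2 * np.log(2 + K**2 * n / t**2)))

for t in (1.0, 2.0, 3.0):
    emp = np.mean(np.abs(samples - med) >= t)
    print(f"t={t}: empirical {emp:.4f}, bound {bound(t):.4f}")
```

For the Euclidean norm the empirical tail is far below the bound with $c=1$, as expected: the norm of a standard Gaussian vector concentrates dimension-free, while the logarithmic factor in the denominator only weakens the exponent.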

## Funding Statement

K.T. is partially supported by the Sloan Research Fellowship.

## Acknowledgments

We would like to express our gratitude to the anonymous reviewers whose valuable feedback and suggestions greatly improved the quality of this paper.

## Citation

Han Huang, Konstantin Tikhomirov. "On dimension-dependent concentration for convex Lipschitz functions in product spaces." Electron. J. Probab. 28: 1–23, 2023. https://doi.org/10.1214/23-EJP944
