Studentization Versus Variance Stabilization: A Simple Way Out of an Old Dilemma
Dimitris N. Politis
Statist. Sci. 39(3): 409-427 (August 2024). DOI: 10.1214/23-STS917

## Abstract

Assume $\hat{\theta}_n$ is a statistic used to estimate a parameter $\theta$ on the basis of data $X_1,\dots,X_n$. Further assume that $\hat{\theta}_n$ is consistent and asymptotically normal, with asymptotic variance given by $\sigma^2(\theta)$. Even if the functional form of $\sigma^2(\cdot)$ is known, its dependence on the unknown parameter $\theta$ creates a dilemma as regards the construction of a confidence interval for $\theta$. Should the interval be based on the normal quantiles with estimated variance, that is, studentization, or shall we transform the statistic $\hat{\theta}_n$ to $Y_n = g(\hat{\theta}_n)$ such that the asymptotic variance of $Y_n$ does not depend on $\theta$, that is, variance stabilization? We show how this dilemma can be bypassed by a straightforward construction that applies rather generally, and just hinges on solving simple algebraic equations. We illustrate the new approach on a host of numerical examples, including two examples in nonparametric function estimation. In the latter, a different sort of dilemma arises: employing undersmoothing versus an explicit bias correction. This paper is dedicated to the memory of Dr. Dimitrios Gatzouras (1962–2020).
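For context, the variance-stabilization side of the dilemma can be sketched by a standard textbook argument (this is background via the delta method, not the paper's new construction):

```latex
% Delta method: if \sqrt{n}\,(\hat{\theta}_n - \theta) \to N(0, \sigma^2(\theta)),
% then for a smooth transformation g,
\[
\sqrt{n}\,\bigl(g(\hat{\theta}_n) - g(\theta)\bigr)
  \;\xrightarrow{d}\; N\!\bigl(0,\,[g'(\theta)]^2\,\sigma^2(\theta)\bigr).
\]
% A variance-stabilizing g makes the limit variance constant, i.e. solves
% g'(\theta)\,\sigma(\theta) = c, so that
\[
g(\theta) \;=\; c \int \frac{d\theta}{\sigma(\theta)}.
\]
% Classic example: for a Bernoulli proportion, \sigma^2(\theta) = \theta(1-\theta),
% and the integral yields the arcsine transform g(\theta) = \arcsin\sqrt{\theta}.
```

The arcsine example illustrates why variance stabilization is attractive when $\sigma^2(\cdot)$ has a known closed form; the paper's point is that one need not commit to either this route or studentization.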

## Funding Statement

This research was partially supported by NSF Grant DMS 19-14556.

## Acknowledgments

This paper is dedicated to the memory of Dr. Dimitrios Gatzouras, a brilliant mathematician and wonderful friend. His untimely passing in the Fall of 2020 has left a rift that is hard to fill personally but also professionally, as he had always been so generous with his time in trying to help others with their work, including an early version of the paper at hand. Sincere thanks are also due to Yunyi Zhang and Jiang Wang for carrying out the numerical work in Sections 5 and 6, respectively. Special acknowledgement is due to Yunyi Zhang who compiled new R functions to compute the required flat-top estimates of 2nd derivatives; these new functions are now included in the R package iosmooth(). Many thanks are also due to the Editor, Associate Editor and two reviewers for several constructive comments.

## Citation

Dimitris N. Politis. "Studentization Versus Variance Stabilization: A Simple Way Out of an Old Dilemma." Statist. Sci. 39(3): 409–427, August 2024. https://doi.org/10.1214/23-STS917

## Information

Published: August 2024
First available in Project Euclid: 28 June 2024

Digital Object Identifier: 10.1214/23-STS917

Keywords: bias correction, confidence intervals, Edgeworth expansion, finite-sample coverage, probability density estimation, undersmoothing

Rights: Copyright © 2024 Institute of Mathematical Statistics

Journal article, 19 pages
