Open Access
When fourth moments are enough
Chris Jennings-Shaffer, Dane R. Skinner, Edward C. Waymire
Rocky Mountain J. Math. 48(6): 1917-1924 (2018). DOI: 10.1216/RMJ-2018-48-6-1917

Abstract

This note concerns a somewhat innocent question motivated by an observation concerning the use of Chebyshev bounds on sample estimates of $p$ in the binomial distribution with parameters $n$, $p$, namely: what moment order produces the best Chebyshev estimate of $p$? If $S_n(p)$ has a binomial distribution with parameters $n$, $p$, then it is readily observed that $\operatorname{argmax}_{0\le p\le 1}\,\mathbb{E}\,(S_n(p)-np)^2 = \operatorname{argmax}_{0\le p\le 1}\,np(1-p) = \frac{1}{2}$, and $\mathbb{E}\,(S_n(\frac{1}{2})-\frac{n}{2})^2 = \frac{n}{4}$. Bhattacharya observed that, while the second-moment Chebyshev sample size for a 95 percent confidence estimate within $\pm 5$ percentage points is $n = 2000$, the fourth moment yields the substantially reduced polling requirement of $n = 775$. Why stop at the fourth moment? Is the argmax achieved at $p = \frac{1}{2}$ for higher-order moments, and, if so, does it help in computing the central moments $\mathbb{E}\,(S_n(\frac{1}{2})-\frac{n}{2})^{2m}$? As captured by the title of this note, answers to these questions lead to a simple rule of thumb for the best choice of moments in terms of an effective sample size for Chebyshev concentration inequalities.
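The arithmetic behind the two sample sizes quoted above can be checked directly. The following Python sketch is not from the paper (the function names central_moment and chebyshev_bound are ours): it computes the exact central moments of a Binomial$(n,\frac{1}{2})$ variable and evaluates the standard $2m$-th moment Chebyshev (Markov) bound $P(|S_n/n - \frac{1}{2}| \ge \varepsilon) \le \mathbb{E}\,(S_n - \frac{n}{2})^{2m}/(n\varepsilon)^{2m}$ at $p = \frac{1}{2}$, confirming $n = 2000$ for the second moment and $n = 775$ for the fourth.

# Sketch: exact 2m-th moment Chebyshev bounds for estimating p in Binomial(n, 1/2).
from fractions import Fraction
from math import comb

def central_moment(n, order):
    # Exact E[(S_n - n/2)^order] for S_n ~ Binomial(n, 1/2).  Work with the
    # integer 2*S_n - n to stay in integer arithmetic, then divide by 2^order.
    total = sum(comb(n, k) * (2 * k - n) ** order for k in range(n + 1))
    return Fraction(total, 2 ** (n + order))

def chebyshev_bound(n, m, eps):
    # Markov's inequality applied to |S_n - n/2|^(2m):
    # P(|S_n/n - 1/2| >= eps) <= E[(S_n - n/2)^(2m)] / (n*eps)^(2m).
    return central_moment(n, 2 * m) / (n * eps) ** (2 * m)

eps = Fraction(1, 20)    # within +/- 5 percentage points
level = Fraction(1, 20)  # 95 percent confidence

# Second moment (m = 1): n = 2000 suffices, n = 1999 does not.
assert chebyshev_bound(2000, 1, eps) <= level < chebyshev_bound(1999, 1, eps)

# Fourth moment (m = 2): the polling requirement drops to n = 775.
assert chebyshev_bound(775, 2, eps) <= level < chebyshev_bound(774, 2, eps)

# At fixed n = 775 the bound improves with m for a while, then worsens.
for m in range(1, 7):
    print(m, float(chebyshev_bound(775, m, eps)))

The final loop illustrates the trade-off behind the abstract's rule of thumb: at fixed $n$, raising the moment order sharpens the bound only up to a point, after which the growing central moments overtake the gain from the higher power of $n\varepsilon$.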

Citation

Chris Jennings-Shaffer, Dane R. Skinner, Edward C. Waymire. "When fourth moments are enough." Rocky Mountain J. Math. 48 (6): 1917-1924, 2018. https://doi.org/10.1216/RMJ-2018-48-6-1917

Information

Published: 2018
First available in Project Euclid: 24 November 2018

zbMATH: 06987232
MathSciNet: MR3879309
Digital Object Identifier: 10.1216/RMJ-2018-48-6-1917

Subjects:
Primary: 60A10, 62D05

Keywords: binomial distribution, concentration inequalities, estimation, machine learning

Rights: Copyright © 2018 Rocky Mountain Mathematics Consortium
