Bernoulli


Rate of divergence of the nonparametric likelihood ratio test for Gaussian mixtures

Wenhua Jiang and Cun-Hui Zhang


Abstract

We study a nonparametric likelihood ratio test (NPLRT) for Gaussian mixtures, based on the nonparametric maximum likelihood estimator in the demixing context. The test concerns whether a random sample is drawn from the standard normal distribution. Under the alternative hypothesis, we allow mixing distributions of unbounded support. We prove that the divergence rate of the NPLRT under the null is bounded by $\log n$, provided that the support range of the mixing distribution increases no faster than $(\log n/\log 9)^{1/2}$. We prove that the rate $\sqrt{\log n}$ is a lower bound for the divergence rate if the support range increases no slower than the order of $\sqrt{\log n}$. Implications of the upper bound for the rate of divergence are discussed.
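
For concreteness, a minimal sketch of the statistic described above, in our own notation (the symbols $f_G$, $M_n$ and $\hat{G}_n$ are assumptions for illustration, not necessarily those of the paper): writing $\phi$ for the standard normal density and $f_G(x)=\int\phi(x-u)\,dG(u)$ for the Gaussian mixture density with mixing distribution $G$, the nonparametric log-likelihood ratio may be written as
\[
T_n \;=\; \sup_{G:\ \operatorname{supp}(G)\subseteq[-M_n,\,M_n]}\ \sum_{i=1}^{n}\log\frac{f_G(X_i)}{\phi(X_i)},
\]
where the supremum is attained at the nonparametric maximum likelihood estimator $\hat{G}_n$ and the support range $M_n$ is allowed to grow with $n$ at the rates stated in the abstract.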

Article information

Source
Bernoulli, Volume 25, Number 4B (2019), 3400-3420.

Dates
Received: July 2017
Revised: November 2018
First available in Project Euclid: 25 September 2019

Permanent link to this document
https://projecteuclid.org/euclid.bj/1569398770

Digital Object Identifier
doi:10.3150/18-BEJ1094

Mathematical Reviews number (MathSciNet)
MR4010959

Zentralblatt MATH identifier
07110142

Keywords
Gaussian mixtures; Hermite polynomials; likelihood ratio test; rate of divergence; two-component mixtures

Citation

Jiang, Wenhua; Zhang, Cun-Hui. Rate of divergence of the nonparametric likelihood ratio test for Gaussian mixtures. Bernoulli 25 (2019), no. 4B, 3400--3420. doi:10.3150/18-BEJ1094. https://projecteuclid.org/euclid.bj/1569398770


