Scalable Monte Carlo inference and rescaled local asymptotic normality
Ning Ning, Edward L. Ionides, Ya’acov Ritov
Bernoulli 27(4): 2532-2555 (November 2021). DOI: 10.3150/20-BEJ1321

Abstract

In this paper, we generalize the property of local asymptotic normality (LAN) to an enlarged neighborhood, under the name of rescaled local asymptotic normality (RLAN). We obtain sufficient conditions for a regular parametric model to satisfy RLAN. We show that RLAN supports the construction of a statistically efficient estimator which maximizes a cubic approximation to the log-likelihood on this enlarged neighborhood. In the context of Monte Carlo inference, we find that this maximum cubic likelihood estimator can maintain its statistical efficiency in the presence of asymptotically increasing Monte Carlo error in likelihood evaluation.
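For orientation, the classical LAN property that RLAN extends can be written as the standard quadratic expansion of the local log-likelihood ratio below (standard notation: central sequence $\Delta_{n,\theta}$ and Fisher information $I(\theta)$); this is a minimal sketch of the familiar definition only, and the precise RLAN condition, the enlarged neighborhood, and the cubic approximation used by the estimator are given in the paper itself.

% Classical LAN: quadratic expansion of the log-likelihood ratio over
% O(n^{-1/2}) neighborhoods of theta; RLAN (per the abstract) enlarges
% this neighborhood with a suitable rescaling.
\[
  \log \frac{dP^{n}_{\theta + h/\sqrt{n}}}{dP^{n}_{\theta}}
  \;=\; h^{\top} \Delta_{n,\theta} \;-\; \tfrac{1}{2}\, h^{\top} I(\theta)\, h \;+\; o_{P_{\theta}}(1),
  \qquad
  \Delta_{n,\theta} \;\rightsquigarrow\; N\bigl(0,\, I(\theta)\bigr).
\]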

Acknowledgements

The authors would like to thank the anonymous reviewers, the Associate Editor, and the Editor-in-Chief for their constructive comments that greatly improved the quality of this paper. This research project was supported by NSF grant DMS-1761603.

Citation

Ning Ning, Edward L. Ionides, Ya’acov Ritov. "Scalable Monte Carlo inference and rescaled local asymptotic normality." Bernoulli 27(4): 2532-2555, November 2021. https://doi.org/10.3150/20-BEJ1321

Information

Received: 1 July 2020; Revised: 1 November 2020; Published: November 2021
First available in Project Euclid: 24 August 2021

MathSciNet: MR4303894
zbMATH: 1504.62041
Digital Object Identifier: 10.3150/20-BEJ1321

Keywords: big data, local asymptotic normality, Monte Carlo, scalability

Rights: Copyright © 2021 ISI/BS

