Open Access
2020 Consistency and asymptotic normality of Latent Block Model estimators
Vincent Brault, Christine Keribin, Mahendra Mariadassou
Electron. J. Statist. 14(1): 1234-1268 (2020). DOI: 10.1214/20-EJS1695

Abstract

The Latent Block Model (LBM) is a model-based method for simultaneously clustering the $d$ columns and $n$ rows of a data matrix. Parameter estimation in the LBM is a difficult and multifaceted problem. Although various estimation strategies have been proposed and are now well understood empirically, theoretical guarantees about their asymptotic behavior are rather sparse and most results are limited to the binary setting. We prove here theoretical guarantees in the valued setting. We show that, under some mild conditions on the parameter space, and in an asymptotic regime where $\log(d)/n$ and $\log(n)/d$ tend to $0$ as $n$ and $d$ tend to infinity, (1) the maximum-likelihood estimate of the complete model (with known labels) is consistent and (2) the log-likelihood ratios are equivalent under the complete and observed (with unknown labels) models. This equivalence allows us to transfer asymptotic consistency, and under mild conditions asymptotic normality, to the maximum-likelihood estimate under the observed model. Moreover, the variational estimator is also consistent and, under the same conditions, asymptotically normal.
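To make the model concrete, the following is a minimal sketch of how data are generated under a binary LBM: row and column labels are drawn independently from mixing proportions, and each entry is then Bernoulli with a block-wise mean. All dimensions and parameter values here are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Hypothetical parameters for illustration only (not from the paper):
# g row clusters, m column clusters, mixing proportions rho and tau,
# and a block-wise Bernoulli mean matrix pi of shape (g, m).
rng = np.random.default_rng(0)
n, d = 200, 150                      # rows and columns of the data matrix
g, m = 3, 2                          # number of row / column clusters
rho = np.array([0.5, 0.3, 0.2])      # row mixing proportions
tau = np.array([0.6, 0.4])           # column mixing proportions
pi = rng.uniform(0.1, 0.9, size=(g, m))  # per-block Bernoulli means

# Latent labels: each row (resp. column) drawn independently from rho (resp. tau).
z = rng.choice(g, size=n, p=rho)     # row labels z_1, ..., z_n
w = rng.choice(m, size=d, p=tau)     # column labels w_1, ..., w_d

# Conditionally on (z, w), entries are independent Bernoulli(pi[z_i, w_j]).
X = rng.binomial(1, pi[z[:, None], w[None, :]])

print(X.shape)  # (200, 150)
```

In the complete model analyzed in point (1), the labels `z` and `w` are observed alongside `X`; in the observed model of point (2), only `X` is available and the labels must be integrated out or approximated, e.g. variationally.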

Citation


Vincent Brault, Christine Keribin, Mahendra Mariadassou. "Consistency and asymptotic normality of Latent Block Model estimators." Electron. J. Statist. 14(1): 1234-1268, 2020. https://doi.org/10.1214/20-EJS1695

Information

Received: 1 July 2019; Published: 2020
First available in Project Euclid: 24 March 2020

zbMATH: 07200228
MathSciNet: MR4079457
Digital Object Identifier: 10.1214/20-EJS1695

Keywords: asymptotic normality, concentration inequality, Latent Block Model, maximum likelihood estimate
