Bernoulli


Sharp oracle inequalities and slope heuristic for specification probabilities estimation in discrete random fields

Matthieu Lerasle and Daniel Y. Takahashi

Full-text: Open access

Abstract

We study the problem of estimating the one-point specification probabilities in not necessarily finite discrete random fields from partially observed independent samples. Our procedures are based on model selection by minimization of a penalized empirical criterion. The selected estimators satisfy sharp oracle inequalities in $L_{2}$-risk.
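
For orientation, the selection step can be written schematically as the minimization of a penalized empirical criterion over a collection of models, and a sharp oracle inequality then bounds the risk of the selected estimator with a leading constant close to one. The display below is only a generic sketch of this framework; the notation ($\widehat{f}_m$, $\mathrm{pen}(m)$, the remainder $R_n$) is illustrative and does not reproduce the paper's exact criterion or penalty.

    % Schematic penalized model selection over a collection of models (S_m)_{m in M},
    % each with an empirical-risk minimizer \widehat{f}_m (notation illustrative only):
    \[
      \widehat{m} \in \operatorname*{arg\,min}_{m \in \mathcal{M}}
        \Bigl\{ P_n\gamma\bigl(\widehat{f}_m\bigr) + \mathrm{pen}(m) \Bigr\},
    \]
    % A "sharp" oracle inequality controls the L2-risk of \widehat{f}_{\widehat{m}}
    % with leading constant 1 + o(1):
    \[
      \mathbb{E}\bigl\| f - \widehat{f}_{\widehat{m}} \bigr\|_2^{2}
      \;\le\; (1 + \varepsilon_n)\, \inf_{m \in \mathcal{M}}
        \mathbb{E}\bigl\| f - \widehat{f}_m \bigr\|_2^{2} \;+\; R_n .
    \]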

We also obtain theoretical results on the slope heuristic for this problem, justifying the slope algorithm to calibrate the leading constant in the penalty. The practical performances of our methods are investigated in two simulation studies. We illustrate the usefulness of our approach by applying the methods to a multi-unit neuronal data from a rat hippocampus.
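
The slope algorithm mentioned above is, in its standard “dimension jump” form (cf. Arlot and Massart [2]), simple to sketch: run the selection for a grid of candidate penalty constants, locate the constant at which the dimension of the selected model drops abruptly, and take twice that value. The Python sketch below illustrates this generic recipe under simplifying assumptions (a penalty proportional to a model “dimension” D_m and a user-supplied grid of constants); it is not the authors' implementation.

    import numpy as np

    def slope_heuristic_kappa(crit, dims, kappas):
        """Dimension-jump calibration of the penalty constant (illustrative sketch).

        crit   : empirical criterion value of each model in the collection
        dims   : 'dimension' D_m of each model, assuming pen(m) = kappa * D_m
        kappas : increasing grid of candidate constants kappa
        """
        crit = np.asarray(crit, dtype=float)
        dims = np.asarray(dims, dtype=float)
        # Dimension of the model selected for each candidate constant kappa.
        selected_dims = np.array([dims[np.argmin(crit + k * dims)] for k in kappas])
        # The minimal penalty constant sits at the largest drop ("dimension jump")
        # of the selected dimension along the kappa grid.
        jump = int(np.argmax(selected_dims[:-1] - selected_dims[1:]))
        kappa_min = kappas[jump + 1]
        # Slope heuristic: the final penalty constant is twice the minimal one.
        return 2.0 * kappa_min

    # Hypothetical usage, with crit[m] an empirical L2-type criterion and dims[m]
    # the number of parameters of model m in a toy collection:
    #   kappa_hat = slope_heuristic_kappa(crit, dims, np.linspace(0.0, 5.0, 200))
    #   m_hat = int(np.argmin(crit + kappa_hat * dims))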

Article information

Source
Bernoulli, Volume 22, Number 1 (2016), 325-344.

Dates
Received: December 2011
Revised: June 2014
First available in Project Euclid: 30 September 2015

Permanent link to this document
https://projecteuclid.org/euclid.bj/1443620852

Digital Object Identifier
doi:10.3150/14-BEJ660

Mathematical Reviews number (MathSciNet)
MR3449785

Zentralblatt MATH identifier
1342.60077

Keywords
model selection; penalization; slope heuristic; discrete random fields

Citation

Lerasle, Matthieu; Takahashi, Daniel Y. Sharp oracle inequalities and slope heuristic for specification probabilities estimation in discrete random fields. Bernoulli 22 (2016), no. 1, 325--344. doi:10.3150/14-BEJ660. https://projecteuclid.org/euclid.bj/1443620852


References

  • [1] Arlot, S. and Bach, F. (2010). Data-driven calibration of linear estimators with minimal penalties. In Advances in Neural Information Processing Systems (NIPS) (Y. Bengio, D. Schuurmans, J.D. Lafferty, C.K.I. Williams and A. Culotta, eds.) 22 46–54. Available at http://papers.nips.cc/book/advances-in-neural-information-processing-systems-22-2009.
  • [2] Arlot, S. and Massart, P. (2009). Data-driven calibration of penalties for least-squares regression. J. Mach. Learn. Res. 10 245–279.
  • [3] Barron, A., Birgé, L. and Massart, P. (1999). Risk bounds for model selection via penalization. Probab. Theory Related Fields 113 301–413.
  • [4] Barron, A.R. and Sheu, C.-H. (1991). Approximation of density functions by sequences of exponential families. Ann. Statist. 19 1347–1369.
  • [5] Bento, J. and Montanari, A. (2009). Which graphical models are difficult to learn? Available at http://arxiv.org/pdf/0910.5761.
  • [6] Birgé, L. and Massart, P. (1997). From model selection to adaptive estimation. In Festschrift for Lucien Le Cam 55–87. New York: Springer.
  • [7] Birgé, L. and Massart, P. (2001). Gaussian model selection. J. Eur. Math. Soc. (JEMS) 3 203–268.
  • [8] Birgé, L. and Massart, P. (2007). Minimal penalties for Gaussian model selection. Probab. Theory Related Fields 138 33–73.
  • [9] Bousquet, O. (2002). A Bennett concentration inequality and its application to suprema of empirical processes. C. R. Math. Acad. Sci. Paris 334 495–500.
  • [10] Bresler, G., Mossel, E. and Sly, A. (2008). Reconstruction of Markov random fields from samples: Some observations and algorithms. In Approximation, Randomization and Combinatorial Optimization. Lecture Notes in Computer Science 5171 343–356. Berlin: Springer.
  • [11] Brown, E.N., Kass, R.E. and Mitra, P.P. (2004). Multiple neural spike train data analysis: State-of-the-art and future challenges. Nature Neuroscience 7 456–461.
  • [12] Csiszár, I. and Talata, Z. (2006). Consistent estimation of the basic neighborhood of Markov random fields. Ann. Statist. 34 123–145.
  • [13] Csiszár, I. and Talata, Z. (2006). Context tree estimation for not necessarily finite memory processes, via BIC and MDL. IEEE Trans. Inform. Theory 52 1007–1016.
  • [14] Galves, A., Orlandi, E. and Takahashi, D.Y. (2010). Identifying interacting pairs of sites in infinite range Ising models. Preprint. Available at http://arxiv.org/abs/1006.0272.
  • [15] Georgii, H.-O. (1988). Gibbs Measures and Phase Transitions. de Gruyter Studies in Mathematics 9. Berlin: de Gruyter.
  • [16] Lerasle, M. (2011). Optimal model selection for density estimation of stationary data under various mixing conditions. Ann. Statist. 39 1852–1877.
  • [17] Lerasle, M. (2012). Optimal model selection in density estimation. Ann. Inst. Henri Poincaré Probab. Stat. 48 884–908.
  • [18] Lerasle, M. and Takahashi, D.Y. (2011). An oracle approach for interaction neighborhood estimation in random fields. Electron. J. Stat. 5 534–571.
  • [19] Lerasle, M. and Takahashi, D.Y. (2014). Supplement to “Sharp oracle inequalities and slope heuristic for specification probabilities estimation in discrete random fields.” DOI:10.3150/14-BEJ660SUPP.
  • [20] Massart, P. (2007). Concentration Inequalities and Model Selection. Lecture Notes in Math. 1896. Berlin: Springer. Lectures from the 33rd Summer School on Probability Theory held in Saint-Flour, July 6–23, 2003. With a foreword by Jean Picard.
  • [21] Pastalkova, E., Buzsáki, G., Mizuseki, K. and Sirota, A. (2009). Theta oscillations provide temporal windows for local circuit computation in the entorhinal-hippocampal loop. Neuron 64 267–280.
  • [22] Ravikumar, P., Wainwright, M.J. and Lafferty, J.D. (2010). High-dimensional Ising model selection using $\ell_{1}$-regularized logistic regression. Ann. Statist. 38 1287–1319.
  • [23] Saumard, A. (2013). The slope heuristics in heteroscedastic regression. Electron. J. Stat. 7 1184–1223.
  • [24] Schneidman, E., Berry, M.J., Segev, R. and Bialek, W. (2006). Weak pairwise correlations imply strongly correlated network states in a neural population. Nature 440 1007–1012.
  • [25] Takahashi, N., Sasaki, T., Matsumoto, W. and Ikegaya, Y. (2010). Circuit topology for synchronizing neurons in spontaneously active networks. Proc. Natl. Acad. Sci. USA 107 10244–10249.

Supplemental materials

  • Supplement to “Sharp oracle inequalities and slope heuristic for specification probabilities estimation in discrete random fields”. In this supplementary material, available online, we prove the probabilistic tools needed in the proofs of the main results. The second part provides additional simulation results. The last part is devoted to the extension of all our results to the Kullback loss.