The Annals of Applied Probability

Central limit theorems for an Indian buffet model with random weights

Patrizia Berti, Irene Crimaldi, Luca Pratelli, and Pietro Rigo

Full-text: Open access


The three-parameter Indian buffet process is generalized: the possibly different roles played by customers are taken into account through suitable (random) weights, and various limit theorems are proved for the resulting generalized Indian buffet process. Let $L_{n}$ be the number of dishes tried by the first $n$ customers, and let $\overline{K}_{n}=(1/n)\sum_{i=1}^{n}K_{i}$, where $K_{i}$ is the number of dishes tried by customer $i$. The asymptotic distributions of $L_{n}$ and $\overline{K}_{n}$, suitably centered and scaled, are obtained. The convergence turns out to be stable (and not only in distribution). As a particular case, the results apply to the standard (i.e., nongeneralized) Indian buffet process.
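The quantities $L_n$ and $\overline{K}_n$ are easy to illustrate by simulation. The sketch below is not from the paper and implements only the standard one-parameter Indian buffet process (customer $i$ tries each previously sampled dish $k$ with probability $m_k/i$, then samples Poisson($\alpha/i$) new dishes); the weighted generalization studied in the article is not implemented.

```python
import math
import random

def _poisson(lam, rng):
    """Knuth's Poisson sampler; adequate for the small rates alpha/i used here."""
    L = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

def simulate_ibp(n, alpha, seed=0):
    """Simulate the standard (one-parameter) Indian buffet process.

    Returns L_n, the total number of dishes tried by the first n customers,
    and the list [K_1, ..., K_n] of per-customer dish counts.
    """
    rng = random.Random(seed)
    dish_counts = []  # m_k: number of customers who have tried dish k so far
    K = []
    for i in range(1, n + 1):
        k_i = 0
        # Previously sampled dishes: tried with probability m_k / i.
        for k, m in enumerate(dish_counts):
            if rng.random() < m / i:
                dish_counts[k] += 1
                k_i += 1
        # New dishes: Poisson(alpha / i) many, each tried by customer i alone.
        new = _poisson(alpha / i, rng)
        dish_counts.extend([1] * new)
        K.append(k_i + new)
    return len(dish_counts), K

Ln, K = simulate_ibp(500, alpha=2.0, seed=1)
Kbar = sum(K) / len(K)  # empirical version of \bar{K}_n; close to alpha for large n
```

Since each customer's total count is marginally Poisson($\alpha$), $\overline{K}_n$ hovers near $\alpha$, while $L_n$ grows like $\alpha\log n$; the paper's theorems describe the fluctuations around these limits.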

Article information

Ann. Appl. Probab., Volume 25, Number 2 (2015), 523–547.

First available in Project Euclid: 19 February 2015

Primary: 60B10: Convergence of probability measures; 60F05: Central limit and other weak theorems; 60G09: Exchangeability; 60G57: Random measures; 62F15: Bayesian inference

Bayesian nonparametrics; central limit theorem; conditional identity in distribution; Indian buffet process; random measure; random reinforcement; stable convergence


Berti, Patrizia; Crimaldi, Irene; Pratelli, Luca; Rigo, Pietro. Central limit theorems for an Indian buffet model with random weights. Ann. Appl. Probab. 25 (2015), no. 2, 523–547. doi:10.1214/14-AAP1002.


