Abstract
This paper shows that large nonparametric classes of conditional multivariate densities can be approximated in the Kullback–Leibler distance by different specifications of finite mixtures of normal regressions, in which normal means and variances and mixing probabilities can depend on variables in the conditioning set (covariates). These models are a special case of models known as “mixtures of experts” in the statistics and computer science literature. Flexible specifications include models in which only mixing probabilities, modeled by multinomial logit, depend on the covariates and, in the univariate case, models in which only the means of the mixed normals depend flexibly on the covariates. Modeling the variance of the mixed normals by flexible functions of the covariates can weaken restrictions on the class of approximable densities. The obtained results can be generalized to mixtures of general location–scale densities. Rates of convergence and easy-to-interpret bounds are also obtained for different model specifications. These approximation results can be useful for proving consistency of Bayesian and maximum likelihood density estimators based on these models. The results also have interesting implications for applied researchers.
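To make the model class concrete, the sketch below evaluates a conditional density of the form described in the abstract: a finite mixture of normal regressions whose mixing probabilities follow a multinomial logit in the covariate. The parameterization (linear component means, constant variances) and all parameter values are illustrative assumptions for this sketch, not taken from the paper.

```python
import numpy as np

def smooth_mixture_density(y, x, betas, sigmas, gammas):
    """Conditional density p(y | x) of a smooth mixture of normal regressions.

    Mixing probabilities are multinomial logit in x; component means are
    linear in x. This parameterization is one illustrative special case.
    """
    xv = np.array([1.0, x])                 # intercept plus scalar covariate
    logits = gammas @ xv                    # (K,) logit scores for mixing weights
    w = np.exp(logits - logits.max())
    w /= w.sum()                            # multinomial-logit mixing probabilities
    means = betas @ xv                      # (K,) covariate-dependent normal means
    dens = np.exp(-0.5 * ((y - means) / sigmas) ** 2) / (np.sqrt(2 * np.pi) * sigmas)
    return float(w @ dens)

# Two-component example with made-up parameters
betas = np.array([[0.0, 1.0], [2.0, -1.0]])   # component mean coefficients
sigmas = np.array([0.5, 0.8])                 # component standard deviations
gammas = np.array([[0.0, 0.0], [1.0, 2.0]])   # logit coefficients for weights

# Sanity check: at a fixed covariate value the density integrates to ~1
grid = np.linspace(-10.0, 10.0, 4001)
vals = np.array([smooth_mixture_density(y, 0.3, betas, sigmas, gammas) for y in grid])
total = vals.sum() * (grid[1] - grid[0])      # Riemann-sum approximation
print(round(total, 3))
```

Richer specifications in the paper also let the component variances depend flexibly on the covariates, which the abstract notes can weaken restrictions on the approximable class.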
Citation
Andriy Norets. "Approximation of conditional densities by smooth mixtures of regressions." Ann. Statist. 38 (3) 1733–1766, June 2010. https://doi.org/10.1214/09-AOS765