Abstract
We provide nonasymptotic rates of convergence of the Wasserstein Generative Adversarial Networks (WGAN) estimator. We build neural network classes representing the generators and discriminators which yield a GAN that achieves the minimax optimal rate for estimating a certain probability measure μ with support in ℝ^p. The probability measure μ is taken to be the push-forward of the Lebesgue measure on the d-dimensional torus 𝕋^d by a map of smoothness β + 1. Measuring the error with the γ-Hölder Integral Probability Metric (IPM), we obtain, up to logarithmic factors, the minimax optimal rate n^{−(β+γ)/(2β+d)} ∨ n^{−1/2}, where n is the sample size, β determines the smoothness of the target measure μ, γ is the smoothness of the IPM (γ = 1 is the Wasserstein case), and d ≤ p is the intrinsic dimension of μ. In the process, we derive a sharp interpolation inequality between Hölder IPMs. This novel result in the theory of function spaces generalizes classical interpolation inequalities to the case where the measures involved have densities on different manifolds.
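As a sketch of the quantities involved (these are the standard definitions, not formulas quoted verbatim from the paper): the γ-Hölder IPM measures the discrepancy between two probability measures over the unit ball of the Hölder space 𝓗^γ, and the abstract's rate bounds this discrepancy for the WGAN estimate μ̂_n of μ.

```latex
% gamma-Hölder IPM: supremum of mean discrepancies over the
% unit ball of the Hölder space H^gamma (standard definition)
d_{\mathcal{H}^{\gamma}}(\mu,\nu)
  \;=\; \sup_{\|f\|_{\mathcal{H}^{\gamma}} \le 1}
        \left( \int f \, d\mu \;-\; \int f \, d\nu \right).
% For gamma = 1 this coincides with the Wasserstein-1
% (Kantorovich-Rubinstein) distance.

% Minimax optimal rate achieved by the WGAN estimator,
% up to logarithmic factors in the sample size n:
\mathbb{E}\, d_{\mathcal{H}^{\gamma}}(\hat{\mu}_n, \mu)
  \;\lesssim\; n^{-\frac{\beta+\gamma}{2\beta+d}} \;\vee\; n^{-\frac{1}{2}}.
```

Here β is the smoothness of the target measure, γ the smoothness of the test functions, and d the intrinsic dimension; larger β, γ, or smaller d all yield a faster rate, capped at the parametric rate n^{−1/2}.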
Citation
Arthur Stéphanovitch. Eddie Aamari. Clément Levrard. "Wasserstein generative adversarial networks are minimax optimal distribution estimators." Ann. Statist. 52 (5) 2167-2193, October 2024. https://doi.org/10.1214/24-AOS2430