October 2024 Wasserstein generative adversarial networks are minimax optimal distribution estimators
Arthur Stéphanovitch, Eddie Aamari, Clément Levrard
Ann. Statist. 52(5): 2167-2193 (October 2024). DOI: 10.1214/24-AOS2430

Abstract

We provide nonasymptotic rates of convergence of the Wasserstein generative adversarial network (WGAN) estimator. We build classes of neural networks representing the generators and discriminators which yield a GAN that achieves the minimax optimal rate for estimating a certain probability measure μ with support in ℝ^p. The probability measure μ is taken to be the pushforward of the Lebesgue measure on the d-dimensional torus T^d by a map g: T^d → ℝ^p of smoothness β+1. Measuring the error with the γ-Hölder Integral Probability Metric (IPM), we obtain, up to logarithmic factors, the minimax optimal rate O(n^{−(β+γ)/(2β+d)} ∨ n^{−1/2}), where n is the sample size, β determines the smoothness of the target measure μ, γ is the smoothness of the IPM (γ = 1 being the Wasserstein case) and d ≤ p is the intrinsic dimension of μ. In the process, we derive a sharp interpolation inequality between Hölder IPMs. This novel result in the theory of function spaces generalizes classical interpolation inequalities to the case where the measures involved have densities on different manifolds.
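For reference, the γ-Hölder IPM appearing in the abstract admits the following standard dual formulation (a textbook definition, not quoted from the paper; the norm notation is illustrative):

```latex
% γ-Hölder IPM between probability measures μ and ν on ℝ^p:
% supremum over the unit ball of the Hölder space H^γ.
d_{\mathcal{H}^{\gamma}}(\mu,\nu)
  \;=\; \sup_{\|f\|_{\mathcal{H}^{\gamma}} \le 1}
        \left| \int f \,\mathrm{d}\mu - \int f \,\mathrm{d}\nu \right|.
% For γ = 1, the unit ball consists of 1-Lipschitz functions, so this is
% the Kantorovich--Rubinstein dual of the Wasserstein-1 distance, hence
% the name WGAN. The rate in the abstract then reads
%   O\!\left( n^{-\frac{\beta+\gamma}{2\beta+d}} \vee n^{-1/2} \right),
% up to logarithmic factors.
```

The n^{−1/2} term is the parametric floor that dominates once the smoothness exponents β and γ are large relative to the intrinsic dimension d.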

Citation


Arthur Stéphanovitch. Eddie Aamari. Clément Levrard. "Wasserstein generative adversarial networks are minimax optimal distribution estimators." Ann. Statist. 52 (5) 2167 - 2193, October 2024. https://doi.org/10.1214/24-AOS2430

Information

Received: 1 November 2023; Revised: 1 April 2024; Published: October 2024
First available in Project Euclid: 20 November 2024

MathSciNet: MR4829484
zbMATH: 07961552
Digital Object Identifier: 10.1214/24-AOS2430

Subjects:
Primary: 62G05
Secondary: 62E17

Keywords: Distribution estimation, Generative model, interpolation inequality, Manifold, Minimax rate

Rights: Copyright © 2024 Institute of Mathematical Statistics
