Lower Complexity Adaptation for Empirical Entropic Optimal Transport
Publication: 6441277
arXiv: 2306.13580
MaRDI QID: Q6441277
Michel Groppe, Shayan Hundrieser
Publication date: 23 June 2023
Abstract: Entropic optimal transport (EOT) presents an effective and computationally viable alternative to unregularized optimal transport (OT), offering diverse applications for large-scale data analysis. In this work, we derive novel statistical bounds for empirical plug-in estimators of the EOT cost and show that their statistical performance in the entropy regularization parameter $\varepsilon$ and the sample size $n$ only depends on the simpler of the two probability measures. For instance, under sufficiently smooth costs this yields the parametric rate $n^{-1/2}$ with factor $\varepsilon^{-d/2}$, where $d$ is the minimum dimension of the two population measures. This confirms that empirical EOT also adheres to the lower complexity adaptation principle, a hallmark feature only recently identified for unregularized OT. As a consequence of our theory, we show that the empirical entropic Gromov-Wasserstein distance and its unregularized version for measures on Euclidean spaces also obey this principle. Additionally, we comment on computational aspects and complement our findings with Monte Carlo simulations. Our techniques employ empirical process theory and rely on a dual formulation of EOT over a single function class. Crucial to our analysis is the observation that the entropic cost-transformation of a function class does not increase its uniform metric entropy by much.
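Spelled out as a display (the notation $\hat{S}_\varepsilon$ for the plug-in estimator and $S_\varepsilon$ for the population EOT cost is assumed here for illustration, not quoted from the paper), the rate statement from the abstract takes the schematic form
\[
\mathbb{E}\bigl[\,|\hat{S}_\varepsilon - S_\varepsilon|\,\bigr] \;\lesssim\; \varepsilon^{-d/2}\, n^{-1/2}, \qquad d = \min(d_X, d_Y),
\]
so only the smaller of the two underlying dimensions $d_X$ and $d_Y$ enters the bound.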
Has companion code repository: https://gitlab.gwdg.de/michel.groppe/eot-lca-simulations
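To make the plug-in estimator concrete, here is a minimal, self-contained Python sketch that computes the empirical EOT cost between two samples via log-domain Sinkhorn iterations on a squared Euclidean cost. The function name eot_cost, the fixed iteration count, and the uniform empirical weights are assumptions made for illustration; this is not the implementation from the companion repository above.

import numpy as np
from scipy.special import logsumexp

def eot_cost(x, y, eps=0.5, n_iter=500):
    """Plug-in estimator of the entropic OT cost between the empirical
    measures of samples x (n, d) and y (m, d), squared Euclidean cost.
    Illustrative sketch; names and defaults are assumptions."""
    n, m = len(x), len(y)
    C = ((x[:, None, :] - y[None, :, :]) ** 2).sum(axis=-1)  # cost matrix
    log_a = np.full(n, -np.log(n))  # log of uniform weights 1/n
    log_b = np.full(m, -np.log(m))
    f, g = np.zeros(n), np.zeros(m)  # dual potentials
    for _ in range(n_iter):
        # alternating Sinkhorn updates in the log domain for stability
        f = -eps * logsumexp((g[None, :] - C) / eps + log_b[None, :], axis=1)
        g = -eps * logsumexp((f[:, None] - C) / eps + log_a[:, None], axis=0)
    # at convergence the dual objective reduces to <f, a> + <g, b>
    return f.mean() + g.mean()

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Lower complexity adaptation in miniature: one sample lies on a
    # 1-dimensional segment embedded in R^3, the other fills a cube in
    # R^3; per the abstract, the simpler (1-dimensional) measure
    # governs the statistical rate of the plug-in estimator.
    x = np.pad(rng.uniform(size=(300, 1)), ((0, 0), (0, 2)))
    y = rng.uniform(size=(300, 3))
    print(eot_cost(x, y, eps=0.5))

A Monte Carlo study along the lines of the paper's simulations would repeat such evaluations over growing sample sizes and varying $\varepsilon$; the companion repository above contains the authors' actual experiments.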
Mathematics Subject Classification: Asymptotic properties of nonparametric inference (62G20); Statistical aspects of big data and data science (62R07)