Scalable Computations of Wasserstein Barycenter via Input Convex Neural Networks
Publication: Q6344712
arXiv: 2007.04462
MaRDI QID: Q6344712
Yongxin Chen, Amirhossein Taghvaei, Jiaojiao Fan
Publication date: 8 July 2020
Abstract: The Wasserstein barycenter is a principled way to represent the weighted mean of a given set of probability distributions, using the geometry induced by optimal transport. In this work, we present a novel scalable algorithm for approximating Wasserstein barycenters, aimed at high-dimensional applications in machine learning. Our proposed algorithm is based on the Kantorovich dual formulation of the Wasserstein-2 distance together with a recent neural network architecture, the input convex neural network, which is known to parametrize convex functions. The distinguishing features of our method are: i) it only requires samples from the marginal distributions; ii) unlike existing approaches, it represents the barycenter with a generative model and can thus generate unlimited samples from the barycenter without querying the marginal distributions; iii) in the single-marginal case it behaves similarly to a Generative Adversarial Network. We demonstrate the efficacy of our algorithm by comparing it with state-of-the-art methods in multiple experiments.
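The architecture the abstract relies on, the input convex neural network (ICNN), guarantees that the network output is a convex function of its input by keeping the hidden-to-hidden weights non-negative and using a convex, non-decreasing activation. The following is a minimal NumPy sketch of that idea, not the authors' implementation; all class and variable names here are illustrative assumptions.

```python
import numpy as np

def softplus(z):
    # Convex, non-decreasing activation; composing it with affine maps
    # that have non-negative z-path weights preserves convexity in x.
    return np.logaddexp(0.0, z)

class ICNN:
    """Minimal input convex neural network (illustrative sketch only).

    z_{k+1} = softplus(Wz_k z_k + Wx_k x + b_k), with Wz_k >= 0 elementwise,
    so the scalar output is convex as a function of the input x.
    """

    def __init__(self, dim, hidden, depth, seed=0):
        rng = np.random.default_rng(seed)
        # Direct input connections: unconstrained weights are fine here.
        self.Wx = [rng.standard_normal((hidden, dim)) for _ in range(depth)]
        # Hidden-to-hidden connections: constrained non-negative.
        self.Wz = [np.abs(rng.standard_normal((hidden, hidden)))
                   for _ in range(depth - 1)]
        self.b = [rng.standard_normal(hidden) for _ in range(depth)]
        # Non-negative final layer keeps the output convex.
        self.wout = np.abs(rng.standard_normal(hidden))

    def __call__(self, x):
        z = softplus(self.Wx[0] @ x + self.b[0])
        for Wz, Wx, b in zip(self.Wz, self.Wx[1:], self.b[1:]):
            z = softplus(Wz @ z + Wx @ x + b)
        return float(self.wout @ z)
```

A quick numerical check of convexity along a segment: for any inputs x1, x2, the value at the midpoint should not exceed the average of the endpoint values, f(0.5*(x1+x2)) <= 0.5*(f(x1)+f(x2)).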
Has companion code repository: https://github.com/sbyebss/scalable-wasserstein-barycenter
MSC classifications: Bayesian inference (62F15); Optimal transportation (49Q22); Statistical sampling theory and related topics (62Dxx)
This page was built for publication: Scalable Computations of Wasserstein Barycenter via Input Convex Neural Networks