Moreau-Yosida $f$-divergences


arXiv: 2102.13416
MaRDI QID: Q6361608

Author name not available

Publication date: 26 February 2021

Abstract: Variational representations of $f$-divergences are central to many machine learning algorithms, with Lipschitz-constrained variants recently gaining attention. Inspired by this, we define the Moreau-Yosida approximation of $f$-divergences with respect to the Wasserstein-1 metric. The corresponding variational formulas provide a generalization of a number of recent results, novel special cases of interest, and a relaxation of the hard Lipschitz constraint. Additionally, we prove that the so-called tight variational representation of $f$-divergences can be taken over the quotient space of Lipschitz functions, and give a characterization of functions achieving the supremum in the variational representation. On the practical side, we propose an algorithm to calculate the tight convex conjugate of $f$-divergences compatible with automatic differentiation frameworks. As an application of our results, we propose the Moreau-Yosida $f$-GAN, providing an implementation of the variational formulas for the Kullback-Leibler, reverse Kullback-Leibler, $\chi^2$, reverse $\chi^2$, squared Hellinger, Jensen-Shannon, Jeffreys, triangular discrimination, and total variation divergences as GANs trained on CIFAR-10, leading to competitive results and a simple solution to the problem of uniqueness of the optimal critic.
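
For orientation, the objects named in the abstract can be written out in standard notation. This is a sketch based on the usual conventions for $f$-divergences (generator $f$ convex with $f(1) = 0$, convex conjugate $f^*$), not text quoted from the paper:

```latex
% Variational lower bound underlying f-GAN-style objectives:
D_f(P \,\|\, Q) \;\ge\; \sup_{\varphi}\; \mathbb{E}_P[\varphi] - \mathbb{E}_Q[f^*(\varphi)],
\qquad f^*(t) = \sup_{s > 0} \{\, st - f(s) \,\}.

% Moreau-Yosida approximation of D_f(. || Q) with respect to the
% Wasserstein-1 metric W_1, with regularization strength \lambda > 0:
D_f^{\lambda}(P \,\|\, Q) \;=\; \inf_{R} \bigl\{\, D_f(R \,\|\, Q) + \lambda\, W_1(P, R) \,\bigr\}.

% Tight convex conjugate of D_f(. || Q) over probability measures; it is
% invariant under \varphi \mapsto \varphi + c, which is what allows the
% representation to be taken over Lipschitz functions modulo constants:
D_f(\cdot \,\|\, Q)^{*}(\varphi) \;=\; \inf_{c \in \mathbb{R}} \bigl\{\, c + \mathbb{E}_Q[f^*(\varphi - c)] \,\bigr\}.
```

As a sanity check, for the Kullback-Leibler divergence ($f(s) = s \log s$, $f^*(t) = e^{t-1}$) the infimum over $c$ has the closed form $\log \mathbb{E}_Q[e^{\varphi}]$, recovering the Donsker-Varadhan representation.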

Has companion code repository: https://github.com/renyi-ai/moreau-yosida-f-divergences
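
The repository above contains the authors' implementation. As a rough illustration of the "tight convex conjugate compatible with automatic differentiation" idea from the abstract, here is a minimal PyTorch sketch; the function name and the bisection solver are our assumptions for illustration, not taken from the repository:

```python
# A minimal sketch, assuming PyTorch; not the authors' implementation
# (see the linked repository for that). It evaluates the tight conjugate
#     inf_c { c + E_Q[f*(phi - c)] }
# by bisection over the scalar c; the objective is convex in c.

import torch

def tight_conjugate(phi_q: torch.Tensor, f_star, lo: float = -20.0,
                    hi: float = 20.0, iters: int = 60) -> torch.Tensor:
    """phi_q: critic outputs on samples from Q, shape (n,).
    f_star: elementwise convex conjugate f* of the generator f."""
    objective = lambda c: c + f_star(phi_q - c).mean()
    with torch.no_grad():  # the search for c carries no gradient
        eps = 1e-4
        for _ in range(iters):
            c = 0.5 * (lo + hi)
            # locate the minimum from the sign of a finite difference
            if objective(c + eps) > objective(c - eps):
                hi = c  # derivative positive: minimizer lies to the left
            else:
                lo = c
        c = 0.5 * (lo + hi)
    # Envelope theorem: at the optimal c the partial derivative in c
    # vanishes, so treating c as a constant still gives exact gradients
    # with respect to the critic values phi_q.
    return c + f_star(phi_q - c).mean()

# Sanity check with KL: f(s) = s log s, f*(t) = exp(t - 1). The tight
# conjugate then equals log E_Q[exp(phi)] (Donsker-Varadhan).
phi_q = torch.randn(1000, requires_grad=True)
approx = tight_conjugate(phi_q, lambda t: torch.exp(t - 1.0))
exact = torch.logsumexp(phi_q, dim=0) - torch.log(torch.tensor(1000.0))
print(float(approx), float(exact))  # the two values should agree closely
approx.backward()                   # gradients flow back to phi_q
```

Bisection is an arbitrary choice here; any one-dimensional convex solver would do, and detaching the solve for $c$ from the computation graph is what keeps the construction compatible with automatic differentiation frameworks.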
