SVGD as a kernelized Wasserstein gradient flow of the chi-squared divergence
arXiv: 2006.02509
MaRDI QID: Q6342055
Author name not available
Publication date: 3 June 2020
Abstract: Stein Variational Gradient Descent (SVGD), a popular sampling algorithm, is often described as the kernelized gradient flow of the Kullback-Leibler divergence in the geometry of optimal transport. We introduce a new perspective that instead views SVGD as the kernelized gradient flow of the chi-squared divergence, which, as we show, exhibits a strong form of uniform exponential ergodicity under conditions as weak as a Poincaré inequality. This perspective leads us to propose an alternative to SVGD, called Laplacian Adjusted Wasserstein Gradient Descent (LAWGD), which can be implemented from the spectral decomposition of the Laplacian operator associated with the target density. We show that LAWGD exhibits strong convergence guarantees and good practical performance.
Has companion code repository: https://github.com/MindSpore-scientific/code-11/tree/main/SVGD
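For orientation, below is a minimal NumPy sketch of the standard SVGD iteration that the abstract refers to: each particle moves along a kernel-weighted average of the target's score plus a repulsive kernel-gradient term. It assumes an RBF kernel with the median-heuristic bandwidth and, in the usage lines, a standard Gaussian target; the function names (`svgd_step`, `grad_log_p`) are illustrative, and this sketch is independent of the companion repository linked above.

```python
import numpy as np

def svgd_step(X, grad_log_p, step=0.1):
    """One SVGD update on n particles X of shape (n, d)."""
    n = X.shape[0]
    diff = X[:, None, :] - X[None, :, :]          # pairwise differences x_i - x_j
    sq = np.sum(diff ** 2, axis=-1)               # squared pairwise distances
    h = np.median(sq) / np.log(n + 1) + 1e-8      # median-heuristic bandwidth
    K = np.exp(-sq / h)                           # RBF kernel matrix
    attract = K @ grad_log_p(X)                   # kernel-smoothed score (drives particles to high density)
    repulse = (2.0 / h) * (X * K.sum(1, keepdims=True) - K @ X)  # kernel gradients (keep particles spread out)
    return X + step * (attract + repulse) / n

# Usage: sample a standard 2-D Gaussian, whose score is grad log p(x) = -x.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2)) * 3.0
for _ in range(500):
    X = svgd_step(X, lambda x: -x)
```

LAWGD, the alternative proposed in the paper, replaces this fixed RBF kernel by one built from the spectral decomposition of the Laplacian operator associated with the target density; the repulsive-plus-attractive structure of the update is analogous.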