Lipschitz-regularized gradient flows and generative particle algorithms for high-dimensional scarce data
DOI: 10.1137/23M1587841
MaRDI QID: Q6664473
Hyemin Gu, Yannis Pantazis, L. Rey-Bellet, Markos A. Katsoulakis, P. Birmpa
Publication date: 16 January 2025
Published in: SIAM Journal on Mathematics of Data Science
Keywords: gradient flow, information theory, data integration, optimal transport, particle algorithms, generative modeling
MSC classification:
- Computational learning theory (68Q32)
- Artificial neural networks and deep learning (68T07)
- Flows in porous media; filtration; seepage (76S05)
- Measures of information, entropy (94A17)
- Statistical aspects of information-theoretic topics (62B10)
- Stochastic particle methods (65C35)
- Fokker-Planck equations (35Q84)
- Optimal transportation (49Q22)
Cites Work
- Numerical study of a particle method for gradient flows
- Ordinary differential equations, transport theory and Sobolev spaces
- Exponential convergence of Langevin distributions and their discrete approximations
- Generalization of an inequality by Talagrand and links with the logarithmic Sobolev inequality
- Nonasymptotic convergence analysis for the unadjusted Langevin algorithm
- Barenblatt solutions and asymptotic behaviour for a nonlinear fractional heat equation of porous medium type
- \(L^q\)-functional inequalities and weighted porous media equations
- The geometry of dissipative evolution equations: the porous medium equation
- Principal component analysis: a review and recent developments
- The Variational Formulation of the Fokker–Planck Equation
- Lecture notes on the DiPerna–Lions theory in abstract measure spaces
- Scaling Limit of the Stein Variational Gradient Descent: The Mean Field Regime
- Fokker–Planck Particle Systems for Bayesian Inference: Computational Approaches
- Formulation and properties of a divergence used to compare probability measures without absolute continuity
- Optimizing Variational Representations of Divergences and Accelerating Their Statistical Estimation
- The elements of statistical learning. Data mining, inference, and prediction