Mean-Field Langevin Dynamics: Exponential Convergence and Annealing

From MaRDI portal
Publication:6389936

arXiv: 2202.01009 · MaRDI QID: Q6389936

Lénaïc Chizat

Publication date: 2 February 2022

Abstract: Noisy particle gradient descent (NPGD) is an algorithm for minimizing, over the space of measures, convex functions that include an entropy term. In the many-particle limit, this algorithm is described by a Mean-Field Langevin dynamics, a generalization of the Langevin dynamics with a non-linear drift, which is our main object of study. Previous work has shown its convergence to the unique minimizer via non-quantitative arguments. We prove that this dynamics converges at an exponential rate, under the assumption that a certain family of Log-Sobolev inequalities holds. This assumption holds, for instance, for the minimization of the risk of certain two-layer neural networks, where NPGD is equivalent to standard noisy gradient descent. We also study the annealed dynamics and show that, for a noise decaying at a logarithmic rate, the dynamics converges in value to the global minimizer of the unregularized objective function.
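The abstract's setting can be illustrated with a minimal sketch of noisy particle gradient descent on a toy two-layer network. Everything below (data, network width, step size, temperature) is an illustrative assumption, not taken from the paper: each hidden neuron is a "particle", the drift is the gradient of the first variation of the risk evaluated at that particle (the usual mean-field scaling), and Gaussian noise of size sqrt(2*eta*tau) makes the update a discretized Langevin step.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy instance (illustrative, not from the paper):
# two-layer network f(x) = (1/m) * sum_j a_j * tanh(w_j . x),
# fitted to synthetic data by noisy particle gradient descent.
m, d, n = 50, 3, 200                 # particles (neurons), input dim, samples
X = rng.normal(size=(n, d))
y = np.tanh(X @ rng.normal(size=d))  # synthetic regression target

W = rng.normal(size=(m, d))          # each row w_j is one "particle"
a = np.ones(m)                       # output weights, fixed for simplicity

def predict(W):
    return np.tanh(X @ W.T) @ a / m

def drift(W):
    # Gradient of the first variation of the risk at each particle,
    # i.e. m times the plain gradient of the risk w.r.t. w_j
    # (the standard mean-field scaling for particle methods).
    r = predict(W) - y                # residuals, shape (n,)
    s = 1.0 - np.tanh(X @ W.T) ** 2  # tanh' at each (sample, particle)
    return (s * r[:, None]).T @ X * (a[:, None] / n)

def risk(W):
    return 0.5 * np.mean((predict(W) - y) ** 2)

tau, eta = 1e-3, 0.2                 # temperature (noise level), step size
loss0 = risk(W)
for k in range(3000):
    # Langevin step per particle: gradient drift + Gaussian noise.
    # Annealing, as studied in the paper, would instead decay the
    # temperature slowly, e.g. tau_k proportional to 1 / log(k + 2).
    W = W - eta * drift(W) + np.sqrt(2 * eta * tau) * rng.normal(size=W.shape)
loss = risk(W)
```

At a fixed temperature `tau`, the iterates sample (approximately) the Gibbs measure of the entropy-regularized objective, which is the stationary point the exponential-rate result concerns; letting `tau` decay logarithmically recovers the annealed dynamics of the last claim.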

Has companion code repository: https://github.com/lchizat/2022-mean-field-langevin-rate
